(2018-04-16) Diresta The Webs Recommendation Engines Are Broken Can We Fix Them
Renee DiResta: The Web’s Recommendation Engines Are Broken. Can We Fix Them?
Recommendation engines have become The Great Polarizer.
As the consequences of curatorial decisions grow more dire, we need to ask: Can we make the internet’s recommendation engines more ethical? And if so, how?
In polarizing us, they are doing precisely what they’re designed to do.
Pinterest’s algorithms don’t register a difference between suggesting duckie balloons and serving up extremist propaganda.
We manage what we can measure. It’s much easier to measure time on site or monthly active user stats than to quantify the outcomes of serving users conspiratorial or fraudulent content.
Some of this work is already underway. The Redirect Method, an effort by Google Jigsaw, redirects certain types of users who are searching YouTube for terrorist videos—people who appear to be motivated by more than mere curiosity. Rather than offer up more violent content, the approach of that recommendation system is to do the opposite—it points users to content intended to de-radicalize them.
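A minimal sketch of that redirect idea, assuming hypothetical intent-detection and counter-content names (none of these come from Jigsaw's actual system):

```python
# Sketch of a redirect-style recommendation step (hypothetical names throughout;
# not Jigsaw's actual implementation).

COUNTER_NARRATIVE_PLAYLIST = ["former_extremist_testimony", "debunking_documentary"]

def looks_like_radicalization_intent(query: str, watch_history: list[str]) -> bool:
    """Placeholder intent check -- a real system would rely on trained
    classifiers and behavioral signals, not a keyword list."""
    risky_terms = {"join", "martyrdom", "training video"}  # illustrative only
    return any(term in query.lower() for term in risky_terms)

def recommend(query: str, watch_history: list[str], default_results: list[str]) -> list[str]:
    # Instead of surfacing more of the same, steer high-risk sessions
    # toward de-radicalizing content.
    if looks_like_radicalization_intent(query, watch_history):
        return COUNTER_NARRATIVE_PLAYLIST
    return default_results
```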
Giving people more control over what their algorithmic feed serves up is one potential solution. Twitter, for example, created a filter that enables users to avoid content from low-quality accounts. Not everyone uses it, but the option exists.
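A rough sketch of what such an opt-in filter could look like, assuming each post carries a hypothetical account-quality score (Twitter's real signal is not public):

```python
# Opt-in feed filter sketch: low-quality accounts are hidden only when the
# user has turned the filter on. The account_quality field is hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    account_quality: float  # 0.0 = likely spam/low quality, 1.0 = established account

def filtered_feed(posts: list[Post], filter_enabled: bool, threshold: float = 0.3) -> list[Post]:
    """Return the feed unchanged unless the user has enabled the filter."""
    if not filter_enabled:
        return posts
    return [p for p in posts if p.account_quality >= threshold]
```

The key design point is that the control sits with the user: nothing is removed platform-wide, and the default behavior is unchanged for people who never touch the setting.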
Simple keyword bans are often overbroad and lack the nuance to determine whether an account, Group, or Pin is discussing a volatile topic or promoting it. Reactive moderation often leads to outcries about censorship.
Perhaps the answer involves creating a visible list of “Do Not Amplify” topics in line with the platform’s values.
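One way to read the “Do Not Amplify” suggestion is as a re-ranking pass rather than a keyword ban: listed topics remain visible to people who seek them out but are demoted in algorithmic recommendations, which avoids some of the overbreadth problem above. A minimal sketch, with hypothetical topic tags and scores:

```python
# "Do Not Amplify" re-ranking sketch: listed topics are downranked in
# recommendations rather than deleted. Topics and scores are hypothetical.

DO_NOT_AMPLIFY = {"miracle cures", "crisis actor claims"}  # published, visible list
DEMOTION_FACTOR = 0.1  # keep the content, but suppress its algorithmic reach

def rerank(candidates: list[tuple[str, float, set[str]]]) -> list[str]:
    """candidates: (item_id, relevance_score, topic_tags). Returns ids, best first."""
    adjusted = []
    for item_id, score, topics in candidates:
        if topics & DO_NOT_AMPLIFY:
            score *= DEMOTION_FACTOR
        adjusted.append((score, item_id))
    return [item_id for _, item_id in sorted(adjusted, reverse=True)]
```

Because the list is public, users and researchers can audit what the platform has chosen not to amplify, which is harder to do with opaque, reactive takedowns.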