On algorithmic authority: depends on the algorithm

Lately, the Facebook “friend recommender” has been making “helpful” suggestions. I should “poke” Josh Silver, executive director of FreePress, an advocacy group in favor of net neutrality. I should “friend” Steve Case, founder of AOL. I should introduce friends to the largest real estate developer in Menlo Park, who clearly needs my help. I should write on my Mom’s wall, since we haven’t corresponded lately on Facebook. Facebook’s algorithm is doing a hilariously pathetic job of the sort of social assessment we make every day in maintaining our social connections.

Facebook’s faith in algorithms is also failing when it comes to its new approach to status updates. Users now have two choices: the “News Feed”, where Facebook chooses which items are interesting to you based on an opaque algorithm that users have no opportunity to influence, and the firehose “Live Feed”, with every single update from every Facebook friend. Facebook used to have filtering tools that gave users some choice, but it has abandoned that approach, at least for now.

Louis Gray writes that this approach caused him to miss the news that his sister, who’d been regularly posting updates, had had a new baby. Facebook’s feed algorithm guessed what Louis was interested in, and guessed badly wrong.
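To make the failure mode concrete, here is a toy sketch in Python of an engagement-based feed filter. Everything in it is invented for illustration (the names, the interaction counts, the threshold); it is a guess at the general shape of such a filter, not Facebook’s actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Update:
    author: str
    text: str

# Hypothetical interaction counts: how often I've commented on or
# "liked" each friend's posts recently. Names and numbers are invented.
interaction_counts = {
    "college_roommate": 42,
    "coworker": 17,
    "sister": 1,   # she posts often, but I read without clicking
}

updates = [
    Update("college_roommate", "Great concert last night"),
    Update("coworker", "Shipping the release today"),
    Update("sister", "We had the baby!"),
]

def news_feed(updates, threshold=5):
    """A crude engagement filter: show only authors I interact with a lot."""
    return [u for u in updates if interaction_counts.get(u.author, 0) >= threshold]

for u in news_feed(updates):
    print(u.author, "-", u.text)

# The sister's announcement never appears, even though it is the one
# update a human filter would never drop.
```

A filter like this optimizes for past clicks, not for what actually matters to the reader, which is exactly how important-but-quiet relationships fall out of the feed.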

In a provocative new post, Clay Shirky writes that “algorithmic authority” – the algorithms Google uses to prioritize search results and to select stories for Google News – is becoming a new, accepted form of authority, something that people will accept as reliable by default. These algorithms choose what to show, instead of a human editor.

There’s merit to Clay’s idea – Google News really does use math to produce a reasonable simulacrum of what the news media collectively thinks is important. Google News does a fine job of composing a “front page” from well-covered, well-trafficked stories. The domain is part of the reason: an earthquake, a war, or a stock market crash is something many news organizations treat as a “story”, so there is a lot of convergent information to chew on.
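As a rough illustration of why convergent coverage is easy to rank, here is a toy sketch that clusters headlines by word overlap and ranks clusters by how many distinct outlets ran the story. The headlines and the similarity test are invented, and beyond the general idea this bears no relation to what Google News actually does.

```python
# Hypothetical headlines, each tagged with its outlet.
headlines = [
    ("Reuters", "Strong earthquake strikes coastal city"),
    ("AP", "Earthquake strikes coastal region overnight"),
    ("BBC", "Coastal earthquake leaves thousands without power"),
    ("Local Weekly", "Council debates zoning variance downtown"),
]

def similar(a, b, min_shared=2):
    """Two headlines count as 'the same story' if they share enough words."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) >= min_shared

# Greedy clustering: put each headline in the first cluster it matches.
clusters = []
for outlet, title in headlines:
    for cluster in clusters:
        if any(similar(title, t) for _, t in cluster):
            cluster.append((outlet, title))
            break
    else:
        clusters.append([(outlet, title)])

# Rank clusters by how many distinct outlets covered the story:
# convergent coverage rises to the "front page", the lone local item sinks.
clusters.sort(key=lambda c: len({o for o, _ in c}), reverse=True)
for cluster in clusters:
    print(len(cluster), "sources:", cluster[0][1])
```

The earthquake gets three independent sources to corroborate and rank; the zoning story gets one, which previews the limitation in the next paragraph.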

Google News is replacing editorial judgement about what goes on the “front page” – but not about what to cover in the first place. The reason there are stories about plane crashes and missing white women is that conventional wisdom considers these things news. If local news about political battles or environmental hazards doesn’t get covered in media or blogs, Google News won’t find it either. The only thing that Google has to work with is content that some editorial staff or blogger has chosen to cover.

Facebook’s algorithms do less well than Google News or PageRank. Facebook’s failures involve much smaller data sets – hundreds of updates, hundreds of friends – and relevance not to a broad swath of readers but to an individual, who has rich context that Facebook lacks: who’s a friend to reconnect with, who’s a relative who prefers other channels, who warrants which level of formality – no, I am not going to Poke Josh Silver.

Algorithms in social systems face very different sorts of problems and serve very different desires. Whether “an algorithm” can and should be considered a reliable source will depend on the algorithm and the domain. Where will number-crunching work best, and where will software work best by augmenting the neural network in our minds? This is an important question in the design and evolution of social software.
