The interestingness algorithm

At a panel on social media for music at CitizenSpace last year, musicians and passionate fans talked together about how musicians try to engage fans, mostly with emails about upcoming shows and recordings. But what did the fans want from musicians? Several people said that what they appreciated most was music recommendations from the musicians themselves.

This rang true to me. I’ve been finding wonderful music just by following musicians on Twitter, and by surfing the last.fm streams of people with distinctive sensibilities. What’s especially cool is that these recommendations are different from standard marketing recommendations, which are tied to genre: a punk americana musician listens to a series of classical requiems; a steampunk big-band leader listens to instrumentally interesting, intense pop.

These recommendations from people work much better for me than the algorithms in Pandora or Apple’s Genius. Pandora finds music that has similar instruments, chords, volume, tempo, and other measurable characteristics. But people reveal music with whatever ineffable characteristics I’m actually seeking. Pandora gets the sound, and people get the soul.
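
To make the contrast concrete, here is a minimal sketch, in Python, of the kind of measurable-feature similarity an engine in Pandora’s style might compute. The feature names and numbers are invented for illustration; Pandora’s actual Music Genome model is proprietary and far richer than this.

```python
import math

# Hypothetical audio-feature vectors: (tempo, loudness, distortion, vocal_density).
# Tempo is scaled to 0-1 so no single feature dominates the comparison.
# All numbers are invented for illustration; this is not Pandora's actual data.
tracks = {
    "punk_americana_song": (0.84, 0.8, 0.9, 0.6),
    "classical_requiem":   (0.30, 0.4, 0.0, 0.7),
    "intense_pop_song":    (0.64, 0.9, 0.3, 0.8),
}

def cosine_similarity(a, b):
    """Angle-based similarity of two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def most_similar(seed, catalog):
    """Rank every other track by measurable-feature similarity to the seed."""
    seed_vec = catalog[seed]
    scored = [(name, cosine_similarity(seed_vec, vec))
              for name, vec in catalog.items() if name != seed]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

print(most_similar("punk_americana_song", tracks))
```

By construction, a measure like this can only ever find tracks that sound alike; the punk song and the requiem will always score low together, which is exactly the leap the human recommenders above keep making.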

It’s a bit of Silicon Valley heresy, perhaps, to be distrustful of algorithms that find things that are “interesting”. Algorithms can find relevant information in some circumstances, but human filters are great. Fundamentally, I suspect that the interestingness algorithm is Turing-complete: an algorithm that could really predict interestingness would have evolved intelligence and humanity.

10 thoughts on “The interestingness algorithm”

  1. Hey Adina. I agree that humans are the best recommendation engines, especially those you know and already trust. Take buying a product, for example: you can read 1000 reviews, but if a close friend tells you their experience with a product, you tend to weight that higher. Where algorithms can help is in processing the huge quantities of data that exist. For example, if you are reading a story, recommendations of similar stories can be useful, or recommendations not of similar content but of other works by that author. I’m much more interested in recommendations than rankings. For example, I do look at YouTube’s recommended videos, but couldn’t care less about YouTube’s rankings of videos. Knowing something got 1 star or 5 stars means a lot less to me than “if you liked that, you may also like this” (a sketch of that co-occurrence idea appears after these comments). I’ve discovered some great books on Amazon using the recommendations. However, as you point out, I’ve also discovered some great books directly from friends.

  2. Amazon’s recommendations do a great job with similarity – they find books that are by the same author and/or on a highly related topic. The Amazon recs are useful – I discover books from them often. What they don’t do – and what a friend or very knowledgeable person can do – is find books on topics that may not appear related but have something deep in common.

    For ranking, on Amazon, I find individual reviewers with expertise, and then look for the works that they rank highly. So I use the rankings, but through the individual people.

  3. Soylent Green is people. The interestingness algorithm is close-ish to what econometrics folks call “stickiness”.
    What fans want from musicians has a lot of intervening variables in segment, age, and so on; music has a “hipness” problem. The rock era and youth marketing created a cultural monster, in a few ways. Music analysis theory of long standing was more or less waiting to become software, and could use some further refinements to go next generation. It breaks down quite severely in the face of ethnomusicology.
    Let’s divorce interestingness from music a little. Go legit: early jazz, 19th-century or early music, or other aesthetic fields. What you *should* prolly oughta want is education. You want to walk through the LA County Museum with a painter and a sculptor. You want to drink with a poet, and pull books off the shelf and start reading aloud. You want to sit at the Algonquin Round Table with Dottie Parker, Benchley, et al.
    The Socratic dialogues were an early attempt to walk with a Smart Dude. The Britannica and the OED got a little closer to the Art Learning Machine. The old Rolling Stone Record Guide, a rockstar friend observed, was often wrong, or way off base, about any given artist, but almost always got it right in terms of ranking a particular work within a given musician’s oeuvre.
    Turing Complete is like approaching zero: ya ain’t gonna get there, but sometimes you can fake it pretty well.
    For an Interestingness Algorithm, that’s the gig: getting better at faking it.
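
As a footnote to the “if you liked that, you may also like this” point in the first comment: here is a minimal sketch, in Python, of the co-occurrence counting that this style of recommendation can be built on. The purchase histories and item names are invented for illustration; a real system like Amazon’s is far more elaborate than this.

```python
from collections import Counter
from itertools import combinations

# Invented histories; each set is one person's liked (or purchased) items.
histories = [
    {"book_a", "book_b", "book_c"},
    {"book_a", "book_b"},
    {"book_b", "book_c", "book_d"},
]

# Count how often each pair of items shows up in the same person's history.
co_counts = Counter()
for items in histories:
    for pair in combinations(sorted(items), 2):
        co_counts[pair] += 1

def also_liked(item):
    """Items most often liked alongside `item`, best match first."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return scores.most_common()

print(also_liked("book_a"))  # [('book_b', 2), ('book_c', 1)]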
