How Computers Know What We Want — Before We Do

Recommendation engines are the software that suggests what we should watch or read or listen to next. They help us deal with the millions of choices the Web offers. But can a computer really have good taste?

The risk you run with recommendation engines is that they'll keep you in a rut. They do that because ruts are comfy places — though often they're deeper than they look. "By definition, we keep you in the same musical neighborhood you start in," says Westergren of the Music Genome Project, "so you could say that's limiting. But even within a neighborhood, there is a ton of room for discovery. Forty-five percent of the people who use Pandora buy more music after they start, and only 1% buy less." And because it isn't based solely on data from its audience, Pandora isn't as vulnerable to peer pressure as most recommendation engines are. It doesn't follow the crowd.
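
To make the "musical neighborhood" idea concrete, here is a minimal sketch of what content-based matching looks like: songs are compared by attributes of the music itself rather than by what other listeners clicked on. The songs, attribute names, and scores below are invented for illustration; the actual Music Genome Project relies on hundreds of attributes scored by trained analysts.

```python
# Toy content-based recommender in the spirit of the Music Genome Project.
# Every value here is made up for the example; real catalogs score each song
# on hundreds of hand-rated musical attributes.
import math

# Each song is described by the music itself (hypothetical 0-1 attribute scores),
# not by what other listeners did.
SONGS = {
    "Seed Track": {"tempo": 0.7, "distortion": 0.8, "vocals": 0.6, "acoustic": 0.1},
    "Neighbor A": {"tempo": 0.6, "distortion": 0.7, "vocals": 0.7, "acoustic": 0.2},
    "Neighbor B": {"tempo": 0.8, "distortion": 0.9, "vocals": 0.5, "acoustic": 0.0},
    "Far Away":   {"tempo": 0.2, "distortion": 0.0, "vocals": 0.3, "acoustic": 0.9},
}

def distance(a, b):
    """Euclidean distance between attribute vectors: small = same 'neighborhood'."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def recommend(seed, songs, n=2):
    """Return the n songs whose attributes sit closest to the seed song."""
    others = [name for name in songs if name != seed]
    return sorted(others, key=lambda name: distance(songs[seed], songs[name]))[:n]

print(recommend("Seed Track", SONGS))   # ['Neighbor A', 'Neighbor B']
```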

Pandora is unusual, though. The general effect of recommendation engines on shopping behavior is a hot topic among econometricians, if that's not an oxymoron, but the consensus is this: they introduce us to new things, which is good, but those new things tend to be a lot like the old things, and they tend to be drawn from the shallow pool of things other people have already liked. As a result, they create a blockbuster culture in which the same few runaway hits get recommended over and over again. It's the backlash against the "long tail," the idea that shopping online is all about near-infinite selection and cultural diversity. The long tail, it turns out, has a bad habit of eating itself and leaving you back where you started.

But this isn't just about retail. The Web has transformed how we shop. Now it's transforming our social lives too, and recommendation engines are coming along for the ride. Just as Netflix reverse-engineers our response to art, dating sites like Match.com and eHarmony and OKCupid use algorithms to make predictions about that equally ineffable human phenomenon, love; or, failing that, lust. The idea is the same: they break down human behavior into data, then look for patterns in the data that they can use to pair up the humans.
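
As a rough illustration of that idea, here is a toy sketch: each person becomes a vector of answers, and "compatibility" is simply how often two vectors agree. The people, questions, and scoring rule are invented for the example; the sites' real models are far more elaborate and proprietary.

```python
# A sketch of "break behavior into data, then look for patterns": people become
# answer vectors, and a match is whoever's vector agrees with yours most often.
# All names, questions, and answers are invented for illustration.

PEOPLE = {
    "Alex":  {"likes_travel": 1, "night_owl": 0, "wants_kids": 1, "loves_rocky_iv": 0},
    "Blake": {"likes_travel": 1, "night_owl": 0, "wants_kids": 1, "loves_rocky_iv": 1},
    "Casey": {"likes_travel": 0, "night_owl": 1, "wants_kids": 0, "loves_rocky_iv": 1},
}

def compatibility(a, b):
    """Fraction of answers two people have in common (0.0 to 1.0)."""
    return sum(1 for q in a if a[q] == b[q]) / len(a)

def best_match(name, people):
    """Pair a person with whoever scores highest against them."""
    candidates = [p for p in people if p != name]
    return max(candidates, key=lambda p: compatibility(people[name], people[p]))

print(best_match("Alex", PEOPLE))   # 'Blake' — three of four answers agree
```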

Even if you're not into online dating, you're probably on Facebook, currently the second most visited site on the Web. Facebook gives users the option of switching between a straight feed, which shows all their friends' news in chronological order, and an algorithmically curated selection of the updates Facebook's recommendation engine thinks they'd most like to see. And in the right-hand column, Facebook uses a different set of algorithms to recommend new friends. If you loved Jason, why not try Jordan?!
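
One plausible signal such a system could use, sketched below with an invented friend graph, is simply how many friends you already share with someone. Facebook's actual method isn't public; this only illustrates the general technique.

```python
# An illustrative friend-recommendation heuristic: suggest the people who share
# the most friends with you. The social graph here is invented for the example.

FRIENDS = {
    "You":    {"Jason", "Maria", "Priya"},
    "Jason":  {"You", "Maria", "Jordan"},
    "Maria":  {"You", "Jason", "Jordan"},
    "Priya":  {"You", "Jordan"},
    "Jordan": {"Jason", "Maria", "Priya"},
}

def suggest_friends(person, graph, n=1):
    """Rank non-friends by how many friends they already share with `person`."""
    already = graph[person] | {person}
    scores = {other: len(graph[person] & theirs)
              for other, theirs in graph.items() if other not in already}
    return sorted(scores, key=scores.get, reverse=True)[:n]

print(suggest_friends("You", FRIENDS))   # ['Jordan'] — three mutual friends
```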

And as for the first most trafficked site on the Web, if you cock your head only slightly to one side, Google is, effectively, a massive recommendation engine, advising us on what we should read and watch and ultimately know. It used to return the same generic results to everyone, but in December it put a service called Personalized Search into wide release. Personalized Search studies the previous 180 days of your searching behavior and skews its results accordingly, based on its best guess as to what you're looking for and how you look for it.
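
Here is a hedged sketch of what that kind of personalization might look like: start from the generic ranking, then float results that overlap with your recent queries toward the top. The scoring rule and the example data are invented; Google's real signals are far more numerous and aren't public.

```python
# Illustrative personalized re-ranking: results that share words with your recent
# search history move up; ties keep the generic order. Not Google's actual method.
from collections import Counter

def personalize(results, recent_queries):
    """Re-rank a generic result list using word overlap with recent queries."""
    history = Counter(w for q in recent_queries for w in q.lower().split())
    def key(item):
        rank, title = item
        boost = sum(history[w] for w in title.lower().split())
        return (-boost, rank)          # more overlap first; ties keep generic order
    return [title for _, title in sorted(enumerate(results), key=key)]

generic = ["Jaguar the car", "Jaguar the animal", "Jaguar the operating system"]
history = ["jaguar habitat", "big cats", "animal migration"]
print(personalize(generic, history))   # the animal result climbs to the top
```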

The principle is almost endlessly generalizable. Anywhere the specter of unconstrained choice confronts us, we're meeting it by outsourcing elements of the selection process to software. Largely unconsciously, we radiate information about ourselves and our personal preferences all day long, and more and more recommendation engines of all shapes and sizes are hoovering up that data and feeding it back to us, reshaping our reality into a form that they fondly hope will be more to our liking — in an endless feedback loop. The effect is to create a customized world for each of us, one that is ever so slightly childproofed, the sharp edges sanded off, and ever so slightly stifling, like recirculated air.

How far will it go? Will we eventually surf a Web that displays only blogs that conform to our political leanings? A social network in which we see only people of our race and religion? Our horizons, cultural and social, would narrow to a cozy, contented, claustrophobic little dot of total personalization.

Let's hope not. People weren't built to play it safe all the time. We were meant to be bored and disappointed and offended once in a while. It's good for us. That's what forces us to evolve. Even if it means watching Rocky IV, with Dolph Lundgren. Who knows? You might even like it.
