
We filter by gatekeepers: Authorities, parents, priests, and teachers shield us from the bad and selectively pass on “the good stuff.”

We filter by intermediates: Sky high is the reject pile in the offices of book publishers, music labels, and movie studios. They say no much more often than yes, performing a filtering function for what gets wide distribution. Every headline in a newspaper is a filter that says yes to this information and ignores the rest.

We filter by curators: Retail stores don’t carry everything, museums don’t show everything, public libraries don’t buy every book. All these curators select their wares and act as filters.

We filter by brands: Faced with a shelf of similar goods, the first-time buyer retreats to a familiar brand because it is a low-effort way to reduce the risk of the purchase. Brands filter through the clutter.

We filter by government: Taboos are prohibited. Hate speech or criticism of leaders or of religion is removed. Nationalistic matters are promoted.

We filter by our cultural environment: Children are fed different messages, different content, different choices depending on the expectations of the schools, family, and society around them.

We filter by our friends: Peers have great sway over our choices. We are very likely to choose what our friends choose.

We filter by ourselves: We make choices based on our own preferences, by our own judgment. Traditionally this is the rarest filter.

None of these methods disappear in the rising superabundance. But to deal with the escalation of options in the coming decades, we’ll invent many more types of filtering.

What if you lived in a world where every great movie, book, and song ever produced was at your fingertips as if “for free,” and your elaborate system of filters had weeded out the crap, the trash, and anything that would remotely bore you? Forget about all the critically acclaimed creations that mean nothing to you personally. Focus instead on just the things that would truly excite you. Your only choices would be the absolute cream of the cream, the things your best friends would recommend, including a few “random” choices to keep you surprised. In other words, you would encounter only things perfectly matched to you at that moment. You still don’t have enough time in your life.

For instance, you could filter your selection of books by reading only the greatest ones. Just focus on the books chosen by experts who have read a lot of them and let them guide you to the 60 volumes considered the best of the very best in Western civilization—the canonical collection known as the Great Books of the Western World. It would take an average reader some 2,000 hours to read all 29 million words. And that’s just the Western world. Most of us are going to need further filtering.

The problem is that we start with so many candidates that, even after filtering out all but one in a million, you still have too many. There are more super great five-stars-for-you movies than you can ever watch in your lifetime. There are more useful tools ideally suited to you than you have time to master. There are more cool websites to linger on than you have attention to spare. There are, in fact, more great bands, and books, and gizmos aimed right at you, customized to your unique desires, than you can absorb, even if it was your full-time job.

Nonetheless, we’ll try to reduce this abundance to a scale that is satisfying. Let’s start with the ideal path. And I’ll make it personal. How would I like to choose what I give my attention to next?

First I’d like to be delivered more of what I know I like. This personal filter already exists. It’s called a recommendation engine. It is in wide use at Amazon, Netflix, Twitter, LinkedIn, Spotify, Beats, and Pandora, among other aggregators. Twitter uses a recommendation system to suggest who I should follow based on whom I already follow. Pandora uses a similar system to recommend what new music I’ll like based on what I already like. Over half of the connections made on LinkedIn arise from their follower recommender. Amazon’s recommendation engine is responsible for the well-known banner that “others who like this item also liked this next item.” Netflix uses the same to recommend movies for me. Clever algorithms churn through a massive history of everyone’s behavior in order to closely predict my own behavior. Their guess is partly based on my own past behavior, so Amazon’s banner should really say, “Based on your own history and the history of others similar to you, you should like this.” The suggestions are highly tuned to what I have bought and even thought about buying before (they track how long I dwell on a page deliberating, even if I don’t choose it). Computing the similarities among a billion past purchases enables their predictions to be remarkably prescient.
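The “others who like this item also liked this next item” banner can be illustrated with a toy version of item-to-item co-occurrence counting. This is only a sketch of the general idea, not Amazon’s actual system; the purchase histories and item names below are invented for the example.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical purchase histories (illustrative data only).
histories = {
    "ann":  {"dune", "foundation", "hyperion"},
    "bob":  {"dune", "foundation"},
    "carl": {"dune", "hyperion", "neuromancer"},
}

# Count how often each pair of items shows up in the same history.
co_counts = defaultdict(int)
for items in histories.values():
    for a, b in combinations(sorted(items), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def also_liked(item, k=2):
    """Items most often bought alongside `item`, ties broken alphabetically."""
    pairs = [(other, n) for (i, other), n in co_counts.items() if i == item]
    pairs.sort(key=lambda p: (-p[1], p[0]))
    return [other for other, _ in pairs[:k]]

print(also_liked("dune"))  # → ['foundation', 'hyperion']
```

A real engine replaces raw counts with similarity scores computed over billions of purchases and page dwells, but the shape of the prediction is the same: what co-occurs with your history is what you are offered next.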

These recommendation filters are one of my chief discovery mechanisms. I find them far more reliable, on average, than recommendations from experts or friends. In fact, so many people find these filtered recommendations useful that these kinds of “more like this” offers are responsible for a third of Amazon sales—a difference amounting to about $30 billion in 2014. They are so valuable to Netflix that it has 300 people working on its recommendation system, with a budget of $150 million. There are of course no humans involved in guiding these filters once they are operational. The cognification is based on subtle details of my (and others’) behavior that only a sleepless obsessive machine might notice.

The danger of being rewarded with only what you already like, however, is that you can spin into an egotistical spiral, becoming blind to anything slightly different, even if you’d love it. This is called a filter bubble. The technical term is “overfitting.” You get stuck at a lower than optimal peak because you behave as if you have arrived at the top, ignoring the adjacent environment. There’s a lot of evidence this occurs in the political realm as well: Readers of one political stripe who depend only on a simple filter of “more like this” rarely if ever read books outside their stripe. This overfitting tends to harden their minds. This kind of filter-induced self-reinforcement also occurs in science, the arts, and culture at large. The more effective the “more good stuff like this” filter is, the more important it becomes to alloy it with other types of filters. For instance, some researchers from Yahoo! engineered a way to automatically map one’s position in the field of choices visually, to make the bubble visible, which made it easier for someone to climb out of their filter bubble by making small tweaks in certain directions.
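One simple way to “alloy” a personalized filter with another type is to reserve a slice of every feed for items drawn from outside the ranking, so the bubble never fully seals. The function below is a minimal sketch of that blending idea; the parameter names and the 20 percent exploration share are illustrative choices, not anyone’s published recipe.

```python
import random

def blended_feed(ranked, catalog, n=5, explore=0.2, seed=None):
    """Fill most of the feed from the personalized ranking, but reserve
    a fraction for random picks from outside it, so slightly different
    items keep appearing."""
    rng = random.Random(seed)
    n_explore = max(1, int(n * explore))          # at least one wildcard
    picks = ranked[: n - n_explore]               # the "more like this" slice
    outside = [c for c in catalog if c not in ranked]
    picks += rng.sample(outside, min(n_explore, len(outside)))
    return picks
```

Tuning `explore` upward widens the search of the adjacent environment at the cost of short-term relevance, which is exactly the overfitting trade-off the filter-bubble problem describes.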

Second in the ideal approach, I’d like to know what my friends like that I don’t know about. In many ways, Twitter and Facebook serve up this filter. By following your friends, you get effortless updates on the things they find cool enough to share. Shouting out a recommendation via a text or photo from a phone is so easy that we are surprised when someone loves something new but doesn’t share it. But friends can also act like a filter bubble if they are too much like you. Close friends can make an echo chamber, amplifying the same choices. Studies show that going to the next circle, to friends of friends, is sometimes enough to enlarge the range of options away from the expected.
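That next circle is cheap to compute on a follow graph: take everyone your friends follow, then subtract your friends and yourself. A toy sketch, with an invented follow graph:

```python
# Hypothetical follow graph (illustrative data only).
follows = {
    "me":  {"ana", "ben"},
    "ana": {"me", "cho", "ben"},
    "ben": {"ana", "dev"},
    "cho": {"eva"},
    "dev": set(),
}

def friends_of_friends(user):
    """People two hops out: followed by my friends, but not by me."""
    direct = follows[user]
    two_hops = set().union(*(follows[f] for f in direct))
    return two_hops - direct - {user}

print(friends_of_friends("me"))  # → {'cho', 'dev'}
```

The second-circle set is drawn from tastes adjacent to yours rather than identical to them, which is why it widens the range of options without feeling random.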

A third component in the ideal filter would be a stream that suggested stuff that I don’t like but would like to like. It’s a bit similar to me trying a least favorite cheese or vegetable every now and then just to see if my tastes have changed. I am sure I don’t like opera, but a few years ago I again tried one—Carmen at the Met—teleprojected in real time in a cinema with prominent subtitles on the huge screen, and I was glad I went. A filter dedicated to probing one’s dislikes would have to be delicate, but could also build on the powers of large collaborative databases in the spirit of “people who disliked those, learned to like this one.” In somewhat the same vein I also, occasionally, want a bit of stuff I dislike but should learn to like. For me that might be anything related to nutritional supplements, details of political legislation, or hip-hop music. Great teachers have a knack for conveying unsavory packages to the unwilling in a way that does not scare them off; great filters can too. But would anyone sign up for such a filter?