The Information Diet



  The best food journalists distill this complex world of choices into healthy ones. Michael Pollan, Knight Professor at the University of California, Berkeley, is a leading example. The beginning of his In Defense of Food (Penguin) is a seven-word diet guide: “Eat food. Not too much. Mostly plants.” And there it is, right up front. We can take those three simple rules—those seven words—into the grocery store, and win.

  While our collective sweet tooth used to serve us well, in the land of abundance it’s killing us. As it turns out, the same thing has happened with information. The economics of news have changed and shifted, and we’ve moved from a land of scarcity into a land of abundance. And though we are wired to consume—it’s been a key to our survival—our sweet tooth for information is no longer serving us well. Surprisingly, it too is killing us.

  Chapter 2. Information, Power, and Survival

  Knowledge Is Power

  With language came the ability to coordinate with each other more effectively. Nomadic tribes began to develop symbols to keep themselves better organized. Calendars appear to have been developed about 10,000 years ago, improving our ability to plant seasonal crops. Armed with the seeds of agriculture, we didn’t need to be as transient anymore; gradually, supported by the surpluses of expanding agriculture, nomadic tribes turned into civilizations, and more sophisticated governments emerged.

  The Sumerians and then the Egyptians started using glyphs to express the value of currency around 6,000 years ago. Once the Egyptians later settled on a standard alphabet, they reaped another information technology boom and reached heights no previously known civilization ever had, taking on massive engineering tasks, building new modes of transportation, and acquiring vast power across an empire.

  Yet carving symbols into stone tablets was painstaking work, and errors were costly. Stone tablets didn’t travel that well either, and if they broke, weeks, months, or even years of work could be lost in an instant. As such, production of this kind of information was largely relegated to a special class: scribes.

  Being a good scribe meant holding significant power in Egypt.[15] Not only were scribes exempt from the manual labor of the lower classes in Egypt, but many also supervised developments or large-scale government projects. They were considered part of the royal court, didn’t have to fight in the military, and had guaranteed employment not only for themselves, but for their sons as well.[16] In the case of Imhotep, chief scribe of the third dynasty, it even meant post-mortem deification.[17]

  Later, another era in communication began with the creation of the first form of mass media: Gutenberg’s moveable type, fitted to a printing press, which enabled writing to be produced as type-set books, each copy identical to the last, without scribes.

  Again, society was transformed. Literacy spread along with printing. As books became plentiful and inexpensive, they could be acquired by any prosperous, educated person, not just by the ruling or religious classes. This set the stage for the Renaissance, the flowering of artistic, scientific, cultural, religious, and social growth that swept across Europe. Next came the revival and spread of democracy. By the time of the American Revolution, printing had made Thomas Paine’s pamphlets bestsellers that rallied the troops to victory. The modern metropolitan newspaper, radio, television—all were based on the same basic idea: that communication could be mass-produced from a central source.

  The latest transformational change came in earnest just three decades ago, when the personal computer and then the Internet converged to throw us firmly into the digital age. Today, five billion people have cell phones. A constantly flowing electron cloud encircles and unites a networked planet. Anyone with a broadband connection to the Internet has access to much, if not all, of the knowledge that came before, and the ability to communicate not just as a single individual but as a broadcaster. Smartphones are pocket-sized libraries, printing presses, cameras, radios, televisions—all that came before, in the palm of your hand.

  The Arguments Against Progress

  Technical progress always comes with its critics. The greater the speed and power of this progress, the greater the criticism. Intel researcher Genevieve Bell notes that every time we have a shift in technology, we also have a new moral panic. Panic? Here’s just one example: there were some who believed, during the development of the railway, that a woman’s uterus could go flying out of her body if she accelerated to 50 miles per hour.[18]

  Electricity came with a set of critics, too: the electric light could inform miscreants that women and children were home. The lightbulb was a recipe for total social chaos.

  These Luddite folk tales are funny, looking back. But other criticisms have gained traction over the centuries.

  Our connection to the teachings of Socrates, for instance, is through the written word of Plato, because Socrates was vehemently against the written word. Socrates thought that the book would do terrible things to our memories. We’d keep knowledge in books and not in our heads. And he was right: people don’t carry around stories like The Iliad in their heads anymore, though it was passed down in an oral tradition for hundreds of years before the written word. We traded memorization for the ability to learn less about more—for choice.

  Critics of the printing press believed that books would cause the spread of sin and eventually destroy the relationship between people and the church. As author and New York University professor Clay Shirky rightfully points out, the printing press did indeed fuel the Protestant Reformation, and yes, growth in erotic fiction.[19]

  Though some critiques of the written word have fared better than others, all have faded over time. There just aren’t that many people today who think the printing press was a bad idea—not if five billion of them are voting with their purchases to carry one around with them all day.

  Despite this, critiques of technology on moral grounds look very similar today. The strongest critiques (as Bell notes) tend to be about women and children. Echoing the old fears about electricity, the television show “To Catch a Predator” features sexual predators using the Internet to seduce children—the subtext is that this powerful new tool can be used to steal your babies.

  Still, there is a serious trend emerging in digital age critiques. Distinguished journalists, acclaimed scholars, and prominent activists are worried about what the information explosion is doing to our attention spans or even to our general intelligence. Bill Keller, former executive editor of The New York Times, equated allowing his daughter to join Facebook to passing her a pipe of crystal meth.[20]

  Nicholas Carr’s book The Shallows (W.W. Norton) is full of concerns that social media is making his brain demand “to be fed the way the Net fed it—and the more it was fed, the hungrier it became.”[21] In the Atlantic, Carr’s “Is Google Making Us Stupid?” expresses similar fears: “Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory.”[22]

  In his book The Filter Bubble (Penguin), my friend and left-of-center activist Eli Pariser warns us of the dangers of personalization. Your Google search results, your Facebook newsfeed, and even your daily news are becoming tailored specifically to you through the magic of advanced technology. The result: an increasingly homogenized view of the world is delivered to you in such a fashion that you’re only able to consume what you already agree with.

  These kinds of critiques of the Web are nothing new. They’re as old as the Web itself—older, actually. In 1995, in the very early days of the World Wide Web, Clifford Stoll wrote in Silicon Snake Oil (Anchor), “Computers force us into creating with our minds and prevent us from making things with our hands. They dull the skills we use in everyday life.”

  Keller, Stoll, and Carr all point to something interesting: new technologies do create anthropological changes in society. Yet some of these critics seem to miss the mark; the Internet is not some kind of meta-bogeyman that’s sneaking into Mr. Carr’s room while he sleeps and rewiring his brain, nor did Mr. Stoll’s 1995 computer sneak up behind him and handcuff him to a keyboard.

  Moreover, the subtext of these theories—and ones like them—is that there may be some sort of corporate conspiracy to try to, as Jobs put it, “dumb us down.” Somehow I doubt that Larry Page and Sergey Brin, the founders of Google, woke up one morning with a plan to rewire our brains. Twitter CEO Jack Dorsey is probably not a super-villain looking to destroy the world’s attention span with the medium’s 140-character limit. Mark Zuckerberg is likely not trying to destroy the world through excessive friendship-building.

  Blaming a medium or its creators for changing our minds and habits is like blaming food for making us fat. While it’s certainly true that all new developments create the need for new warnings—until there was fire, there wasn’t a rule to not put your hand in it—conspiracy theories wrongly take free will and choice out of the equation. The boardroom of Kentucky Fried Chicken does not have public health as its top priority, true, but if everyone suddenly stopped buying the chicken, they’d be out of business in a month. Fried chicken left in its bucket will not raise your cholesterol. It does not hop from its bucket and deep-dive into your arteries. Fried chicken (thankfully) isn’t autonomous, of course, and isn’t capable of such hostility.

  As long as good, honest information is out there about what’s what, and people have the means to consume it, the most dangerous conspiracy is the unspoken pact between producer and consumer.

  Out of the four critiques—those of Keller, Carr, Pariser, and Stoll—Pariser’s is the one that makes the most sense to me. Personalization today is mostly a technical matter: it’s something the technologists at our major Internet companies are building in order to keep us clicking, and it has real consequences. That said, personalization isn’t an evil algorithm telling us what our corporate overlords want us to hear; rather, it’s a reflection of our own behavior.

  If right-of-center links are not showing up in your Facebook feed, it’s likely because you haven’t clicked on them when you’ve had the opportunity. Should corporations building personalization algorithms include mutations to break a reader’s filter bubble? Should people be able to “opt out” of tracking systems? Absolutely. But readers should also accept responsibility for their actions and make an effort to consume a responsible, nonhomogeneous diet, too. The problem isn’t the filter bubble; the problem is that people don’t know that their actions have these opaque consequences.
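  The feedback loop described above can be sketched in a few lines of code. This is only an illustration of the general mechanism, not any company’s actual system: the topics, headlines, and ranking rule here are all hypothetical, and real ranking systems weigh far more signals than click counts.

```python
from collections import Counter

def rank_feed(stories, clicks, top_n=3):
    """Order stories by how often the reader has clicked each topic.

    `stories` is a list of (headline, topic) pairs; `clicks` counts
    past clicks per topic. Ties keep the original order (stable sort).
    """
    return sorted(stories, key=lambda s: -clicks[s[1]])[:top_n]

# Hypothetical stories and a reader with no click history yet.
clicks = Counter()
stories = [
    ("Tax bill advances", "politics-right"),
    ("Union rally draws crowds", "politics-left"),
    ("Local team wins", "sports"),
    ("New budget proposal", "politics-right"),
]

# The reader always clicks the top story in the feed...
for _ in range(5):
    feed = rank_feed(stories, clicks)
    clicks[feed[0][1]] += 1

# ...so after a few rounds, the top of the feed converges on one topic.
top_topics = [topic for _, topic in rank_feed(stories, clicks)]
print(top_topics)
```

  The point of the sketch is that no editorial malice is required: the narrowing emerges purely from the reader’s own clicks being fed back into the ranking.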

  As with Socrates’ reluctance to embrace the written word, critics like Carr and Stoll are onto something, but they’re attacking the wrong thing. It isn’t the written word that has stopped most of us from memorizing the epic Odyssey; rather, it is our choice not to memorize it anymore, and to read books instead.

  Anthropomorphized computers and information technology cannot take responsibility for anything. The responsibility for healthy consumption lies with human technology, in the software of the mind. It must be shared between the content provider and the consumer, the people involved.

  There Is No Such Thing as Information Overload

  Once we begin to accept that information technology is neutral and cannot possibly rewire our brains without our consent or cooperation, something else becomes really clear: there’s no such thing as information overload.

  It’s the best “first world problem” there is. “Oh, my inbox is so full,” or, “I just can’t keep up with all the tweets and status updates and emails” are common utterances of the digital elite. Though we constantly complain of it—of all the news, and emails, and status updates, and tweets, and the television shows that we feel compelled to watch—the truth is that information is not requiring you to consume it. It can’t: information is no more autonomous than fried chicken, and it has no ability to force you to do anything as long as you are aware of how it affects you. There has always been more human knowledge and experience than any one human could absorb. It’s not the total amount of information, but your information habit that is pushing you to whatever extreme you find uncomfortable.

  Even so, we not only blame the information for our problems, we’re arrogant about it. More disturbing than our personification of information is the presumption that the concept of information overload is a new one, specific to our time.

  In 1755, French philosopher Denis Diderot noted:

  “As long as the centuries continue to unfold, the number of books will grow continually, and one can predict that a time will come when it will be almost as difficult to learn anything from books as from the direct study of the whole universe. It will be almost as convenient to search for some bit of truth concealed in nature as it will be to find it hidden away in an immense multitude of bound volumes.”[23]

  Diderot was on target with the continuous growth of books, but he also made a common mistake in predicting the future. He presumed that technology would stand still. In this short passage, he didn’t anticipate that with an increasing number of books, new ways to classify and organize them would arise.

  A century after Diderot wrote, we had the Dewey Decimal system to help us search for those bits of truth “hidden away in an immense multitude of bound volumes.” Two and a half centuries later, the pages are bound not to bookbindings, but to electronic formats. It has never been faster and easier than with Amazon to find and buy a book in either a print or electronic version. And Google would be delighted if every word of every book were searchable—on Google.

  To say, therefore, that the Internet causes our misinformation ignores history. In the modern arms race between fact and fiction, it’s always been a close fight: we’re no better at being stupid or misinformed than our grandparents were. It’s the ultimate ironic form of generational narcissism. History is filled with entire cultures ending up misinformed and misled by ill-willed politicians and deluded masses.

  Just like Stoll and Carr, Diderot was onto something, but he was lured into the trap of blaming the information technology itself.

  The field of health rarely has this problem: one never says that a lung cancer victim dies of “cigarette overload” unless a cigarette truck falls on him. Why, then, do we blame the information for our ills? Our early nutritionist, Banting, provides some prescient advice. He writes in his Letter on Corpulence:

  “I am thoroughly convinced, that it is quality alone which requires notice, and not quantity. This has been emphatically denied by some writers in the public papers, but I can confidently assert, upon the indisputable evidence of many of my correspondents, as well as my own, that they are mistaken.”[24]

  Banting’s letter gives us an idea of what the real problem is. It’s not information overload, it’s information overconsumption. The idea of overload implies that we must somehow manage the intake of vast quantities of information in new and more efficient ways. Overconsumption means we need to find new ways to be selective about our intake. It is very difficult, for example, to overconsume vegetables.

  In addition, the information overload community tends to rely on technical filters—the equivalent of trying to lose weight by rearranging the shelves in your refrigerator. Tools tend to amplify existing behavior. The mistaken concept of information overload distracts us from paying attention to behavioral changes.

  The Information Overload Research Group, a consortium of “researchers, practitioners and technologists,” is a group set up to help “reduce information overload.” Its website offers a research section with 26 research papers on the topic, primarily focused on dealing with electronic mail and technology used to manage distractions and interruptions. If they mention user behavior at all, they’re focused on a person’s relationship with a computer and the tools within it.

  Now, don’t get me wrong. I appreciate a good spam filter as much as the next person, but what we need are new ways of thinking and of coping.

  Just as Banting triggered a wave of concern about diet as we shifted from a land of food scarcity to abundance, we have to start taking responsibility ourselves for the information that we consume. That means taking a hard look at how our information is being supplied, how it affects us, and what we can do to reduce its negative effects and enhance its positive ones.

  Chapter 3. Big Info

  “For 200 years the newspaper front page dominated public thinking. In the last 20 years that picture has changed. Today television news is watched more often than people read newspapers, than people listen to radio, than people read or gather any other form of communication. The reason: people are lazy. With television you just sit-watch-listen. The thinking is done for you.”

  —Anonymous memo, Nixon Presidential Archives; largely attributed to Roger Ailes, Nixon campaign staffer and now Fox News chairman[25]

  Choice Lessons

  The New Media

  Even more than on television, Fox routinely tweaks the news on the Web to make it more palatable to its audience. Even when it takes content from other sources like the Associated Press and puts it on its website, the organization rewrites the headlines to appeal to its conservative audience. The AP’s story “Economic Worries Pose New Snags for Obama” turned into “Obama Has a Big Problem with White Women.” “Obama to Talk Economy, Not Politics, in Iowa” turned into “White House Insists Obama’s Iowa Stop for Economy, Not 2012.” And “Malaysia Police Slammed for Cattle-Branding Women” turned into “Malaysian Muslims Cattle-Brand Prostitutes.”