How Filter Bubbles Distort Reality: Everything You Need to Know

The Basics

Read the headline, tap, scroll, tap, tap, scroll.

It is a typical day and you are browsing your usual news site. The New Yorker, BuzzFeed, The New York Times, BBC, The Globe and Mail, take your pick. As you skim through articles, you share the best ones with like-minded friends and followers. Perhaps you add a comment.

Few of us sit down and decide to inform ourselves on a particular topic. For the most part, we pick up our smartphones or open a new tab, scroll through a favored site and click on whatever looks interesting. Or we look at Facebook or Twitter feeds to see what people are sharing. Chances are high that we are not doing this intending to become educated on a certain topic. No, we are probably waiting in line, reading on the bus or at the gym, procrastinating, or grappling with insomnia, looking for some form of entertainment.

We all do this skimming and sharing and clicking, and it seems so innocent. But many of us are uninformed about or uninterested in the forces affecting what we see online and how content affects us in return — and that ignorance has consequences.

The term “filter bubble” refers to the results of the algorithms that dictate what we encounter online. According to Eli Pariser, those algorithms create “a unique universe of information for each of us … which fundamentally alters the way we encounter ideas and information.”

Many sites offer personalized content selections, based on our browsing history, age, gender, location, and other data. The result is a flood of articles and posts that support our current opinions and perspectives, to ensure that we enjoy what we see. Even when a site is not offering specifically targeted content, we all tend to follow people whose views align with ours. When those people share a piece of content, chances are it will be something we are also interested in.

That might not sound so bad, but filter bubbles create echo chambers. We assume that everyone thinks like us, and we forget that other perspectives exist.

Filter bubbles transcend web surfing. In important ways, your social circle is a filter bubble; so is your neighborhood. If you're living in a gated community, for example, you might think that reality is only BMWs, Teslas, and Mercedes. Your work circle acts as a filter bubble, too, depending on whom you know and at what level you operate.

One of the great problems with filters is our human tendency to think that what we see is all there is, without realizing that what we see is being filtered.

Eli Pariser on Filter Bubbles

The concept of filter bubbles was first identified by Eli Pariser, chief executive of Upworthy, activist, and author. In his influential book The Filter Bubble, Pariser explained how Google searches bring up vastly differing results depending on the history of the user. He cites an example in which two people searched for “BP” (British Petroleum). One user saw news related to investing in the company. The other user received information about a recent oil spill.

Pariser describes how the internet tends to give us what we want:

Your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click.

Pariser terms this reflection a filter bubble, a “personal ecosystem of information.” It insulates us from any sort of cognitive dissonance by limiting what we see. At the same time, virtually everything we do online is being monitored — for someone else's benefit.

Each time we click, watch, share, or comment, search engines and social platforms harvest information. Much of this information serves to generate targeted advertisements. Most of us have experienced the odd sensation of déjà vu when a product we glanced at online suddenly appears everywhere we go on the web, as well as in our email inboxes. Often this advertising continues until we succumb and purchase the product.

Targeted advertisements can help us to find what we need with ease, but costs exist:

Personalization is based on a bargain. In exchange for the service of filtering, you hand large companies an enormous amount of data about your daily life — much of which you might not trust your friends with.

The internet has changed a great deal from the early days, when people worried about strangers finding out who they were. Anonymity was once king. Now, our privacy has been sacrificed for the sake of advertising revenue:

What was once an anonymous medium where anyone could be anyone—where, in the words of the famous New Yorker cartoon, nobody knows you’re a dog—is now a tool for soliciting and analyzing our personal data. According to one Wall Street Journal study, the top fifty Internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons each. Search for a word like “depression” on Dictionary.com, and the site installs up to 223 tracking cookies and beacons on your computer so that other Web sites can target you with antidepressants. Share an article about cooking on ABC News, and you may be chased around the Web by ads for Teflon-coated pots. Open—even for an instant—a page listing signs that your spouse may be cheating and prepare to be haunted with DNA paternity-test ads. The new Internet doesn’t just know you’re a dog; it knows your breed and wants to sell you a bowl of premium kibble.

The sources of this information can be unexpected. Companies gather it from places we might not even consider:

When you read books on your Kindle, the data about which phrases you highlight, which pages you turn, and whether you read straight through or skip around are all fed back into Amazon’s servers and can be used to indicate what books you might like next. When you log in after a day reading Kindle e-books at the beach, Amazon can subtly customize its site to appeal to what you’ve read: If you’ve spent a lot of time with the latest James Patterson, but only glanced at that new diet guide, you might see more commercial thrillers and fewer health books.

One thing is certain: the personalization process is neither crude nor random. It operates along defined guidelines that are being refined every day, both in aggregate and for each individual:

Most personalized filters are based on a three-step model. First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune to get the fit just right. Your identity shapes your media. There’s just one flaw in this logic: Media also shape identity. And as a result, these services may end up creating a good fit between you and your media by changing … you.
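
Pariser's three-step model is easy to caricature in code. The sketch below is a minimal, hypothetical illustration (the class and field names are invented; real recommender systems are vastly more sophisticated) of how a click-driven filter infers an identity, ranks content to fit it, and narrows with every interaction:

```python
from collections import Counter

# A toy caricature of the three-step model: profile, fit, tune.
# All names are hypothetical; real systems are far more complex.

class PersonalizedFeed:
    def __init__(self):
        self.profile = Counter()  # Step 1: figure out what the user likes

    def record_click(self, article):
        self.profile[article["topic"]] += 1  # every click sharpens the profile

    def rank(self, articles):
        # Step 2: serve the content that best fits the inferred identity.
        return sorted(articles, key=lambda a: self.profile[a["topic"]], reverse=True)

    # Step 3 is the loop itself: clicks on ranked results feed back into
    # the profile, so the "fit" keeps improving -- and keeps narrowing.

feed = PersonalizedFeed()
feed.record_click({"title": "Stocks to watch", "topic": "finance"})

articles = [
    {"title": "Oil spill update", "topic": "environment"},
    {"title": "Markets rally", "topic": "finance"},
]
print([a["title"] for a in feed.rank(articles)])
# ['Markets rally', 'Oil spill update'] -- one click, and finance now leads
```

After a handful of clicks, the environment story may never reach the top of the feed again. That is the narrowing Pariser describes: the media shaped the identity, and the identity now shapes the media.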

In The Shallows, Nicholas Carr also covers online information collection. Carr notes that the more time we spend online, the richer the information we provide:

The faster we surf across the surface of the Web—the more links we click and pages we view—the more opportunities Google gains to collect information about us and to feed us advertisements. Its advertising system, moreover, is explicitly designed to figure out which messages are most likely to grab our attention and then to place those messages in our field of view. Every click we make on the Web marks a break in our concentration, a bottom-up disruption of our attention—and it’s in Google’s economic interest to make sure we click as often as possible.

Anyone who has spent time on the web knows how addictive the flow of stimulating information can be. No matter how disciplined we are otherwise, we find it hard to resist clicking related articles or scrolling through newsfeeds. There is a reason for this, as Pariser writes:

Personalized filters play to the most compulsive parts of you, creating “compulsive media” to get you to click things more.

In an attention economy, filter bubbles assist search engines, websites, and platforms in their goal to command the maximum possible share of our online time.

The Impact of Filter Bubbles

Each new technology brings with it a whole host of costs and benefits, many of which are realized only as time passes. The invention of books led people to worry that memory and oral tradition would erode. Paper caused panic as young people switched from slates to this newfangled medium. Typewriters led to discussions of morality as female typists entered the workforce and “distracted” men. The internet has been no exception. If anything, the issues it presents are unique only in their intensity and complexity.

In particular, the existence of filter bubbles has led to widespread concern. Pariser writes:

Democracy requires citizens to see things from one another's point of view, but instead we're more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead we’re being offered parallel but separate universes.

… Personalization filters serve a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown.

Pariser quotes Jonathan Chait as saying:

Partisans are more likely to consume news sources that confirm their ideological beliefs. People with more education are more likely to follow political news. Therefore, people with more education can actually become mis-educated.

Many people have debated the impact of filter bubbles on the 2016 US presidential election and the Brexit vote. In both cases, large numbers of people were shocked by the outcome. Even those within the political and journalistic worlds expected the opposite result.

“We become, neurologically, what we think.”

— Nicholas Carr

In the case of the Brexit vote, a large percentage of those who voted to leave the European Union were older people who are less active online, meaning that their views are less visible. Those who voted to remain tended to be younger and more active online, meaning that they were in an echo chamber of similar attitudes.

Democracy requires everyone to be equally informed. Yet filter bubbles are distorting our ideas of the world. In a paper published in the Proceedings of the National Academy of Sciences, Robert Epstein and Ronald E. Robertson revealed the extent of the influence on our voting:

The results of these experiments demonstrate that (i) biased search rankings can shift the voting preferences of undecided voters by 20% or more, (ii) the shift can be much higher in some demographic groups, and (iii) search ranking bias can be masked so that people show no awareness of the manipulation. We call this type of influence, which might be applicable to a variety of attitudes and beliefs, the search engine manipulation effect. Given that many elections are won by small margins, our results suggest that a search engine company has the power to influence the results of a substantial number of elections with impunity. The impact of such manipulations would be especially large in countries dominated by a single search engine company.

Filter bubbles do not occur only on the internet. Epstein and Robertson provide an example from a decade ago of TV shifting the results of elections:

It is already well established that biased media sources such as newspapers, political polls, and television sway voters. A 2007 study by DellaVigna and Kaplan found, for example, that whenever the conservative-leaning Fox television network moved into a new market in the United States, conservative votes increased, a phenomenon they labeled the Fox News Effect. These researchers estimated that biased coverage by Fox News was sufficient to shift 10,757 votes in Florida during the 2000 US Presidential election: more than enough to flip the deciding state in the election, which was carried by the Republican presidential candidate by only 537 votes. The Fox News Effect was also found to be smaller in television markets that were more competitive.

However, Epstein and Robertson believe the internet has a more dramatic effect than other forms of media:

Search rankings are controlled in most countries today by a single company. If, with or without intervention by company employees, the algorithm that ranked election-related information favored one candidate over another, competing candidates would have no way of compensating for the bias. It would be as if Fox News were the only television network in the country. Biased search rankings would, in effect, be an entirely new type of social influence, and it would be occurring on an unprecedented scale. Massive experiments conducted recently by social media giant Facebook have already introduced other unprecedented types of influence made possible by the Internet. Notably, an experiment reported recently suggested that flashing “VOTE” advertisements to 61 million Facebook users caused more than 340,000 people to vote that day who otherwise would not have done so.

In the US election and the Brexit vote, filter bubbles caused people to become insulated from alternative views. Some critics have theorized that the widespread derision of Trump and Leave voters led them to be less vocal, keeping their opinions within smaller communities to avoid confrontation. Those who voted for Clinton or to Remain loudly expressed themselves within filtered communities. Everyone, it seemed, agreed with each other. Except, they didn’t, and no one noticed until it was too late.

A further issue with filter bubbles is that they are something we can only opt out of, not something we consent to. As of March 2017, an estimated 1.94 billion people have a Facebook account, of which 1.28 billion log on every day. It is safe to assume that only a small percentage are informed about the algorithms. Considering that 40% of people regard Facebook as their main news source, this is worrying. As with cognitive biases, a lack of awareness amplifies the impact of filter bubbles.

We have minimal concrete evidence of exactly what information search engines and social platforms collect. Even SEO (search engine optimization) experts do not know for certain how search rankings are organized. We also don’t know if sites collect information from users who do not have accounts.

Scandals are becoming increasingly common as sites and services are found to be harvesting details without consent. For example, Evernote came under fire when a policy change revealed that staff members could read users' notes, and Unroll.me drew criticism for selling details of users' email habits. Even when this information is listed in user agreements or disclaimers, the confusing jargon can make it difficult for users to ascertain how their data are being used, by whom, and why.

In his farewell speech, President Obama aired his personal concerns:

[We] retreat into our own bubbles, … especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions. … And increasingly, we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there.

Filter bubbles can cause cognitive biases and shortcuts to manifest, amplifying their negative impact on our ability to think in a logical and critical manner. A combination of social proof, availability bias, confirmation bias, and bias from disliking/liking is prevalent. As Pariser writes:

The filter bubble tends to dramatically amplify confirmation bias—in a way, it’s designed to. Consuming information that conforms to our ideas of the world is easy and pleasurable; consuming information that challenges us to think in new ways or question our assumptions is frustrating and difficult. This is why partisans of one political stripe tend not to consume the media of another. As a result, an information environment built on click signals will favor content that supports our existing notions about the world over content that challenges them.

Pariser sums up the result of extensive filtration: “A world constructed from the familiar is the world in which there's nothing to learn.”

Filter Bubbles and Group Psychology

We have an inherent desire to be around those who are like us and who reinforce our worldview. Our online behavior is no different. People form tribes based on interests, location, employment, affiliation, and other details. These groups — subreddits, Tumblr fandoms, Facebook groups, Google+ circles, etc. — have their own rules, conventions, in-jokes, and even vocabulary. Within groups (even ones whose members never meet each other), beliefs intensify. Anyone who disagrees may be ousted from the community. Sociologists call this behaviour “communal reinforcement” and stress that the ideas perpetuated may bear no relation to reality or empirical evidence.

“When you’re asked to fight a war that’s over nothing, it’s best to join the side that’s going to win.”

— Conor Oberst

Communal reinforcement can be positive. Groups geared towards people with mental health problems, chronic illnesses, addictions, and other issues are often supportive and assist many people who might not have another outlet.

However, when a group is encased within a filter bubble, it can lead to groupthink. This is a psychological phenomenon wherein groups of people experience a temporary loss of the ability to think in a rational, moral and realistic manner. When the members of a group are all exposed to the same confirmatory information, the results can be extreme. Symptoms include being excessively optimistic, taking risks, ignoring legal and social conventions, regarding those outside the group as enemies, censoring opposing ideas, and pressuring members to conform. As occurred with the US election and the Brexit vote, those experiencing groupthink within a filter bubble see themselves as in the right and struggle to consider alternative perspectives.

For example, imagine a Facebook group for Trump supporters in the months prior to the election. Members share pro-Trump news items, discuss policies, and circulate confirmatory information among themselves. Groupthink sets in as the members selectively process information, fail to evaluate alternative viewpoints, ignore risks, shun any members who disagree, and even dismiss the possibility of a negative outcome. From the outside, we can see the issues with a combination of filter bubbles and groupthink, but they can be hard to identify from the inside.

How Can We Avoid Filter Bubbles?

Thankfully, it is not difficult to pop the filter bubble if we make an effort to do so. Methods for doing this include:

  • Using ad-blocking browser extensions. These remove the majority of advertisements from websites we visit. The downside is that most sites rely on advertising revenue to support their work, and some (such as Forbes and Business Insider) insist on users' disabling ad blockers before viewing a page.
  • Reading news sites and blogs that aim to provide a wide range of perspectives. Pariser’s own site, Upworthy, aims to do this. Others, including The Wall Street Journal, The New Yorker, the BBC, and the Associated Press, claim to offer a balanced view of the world. Regardless of the sources we frequent, a brief analysis of the front page will provide a good idea of any biases. In the wake of the US election, a number of newsletters, sites, apps, and podcasts have been working to pop the filter bubble. An excellent example is Colin Wright's podcast, Let’s Know Things (http://letsknowthings.com/), which examines a news story in context each week.
  • Switching our focus from entertainment to education. As Nicholas Carr writes in The Shallows: “The Net’s interactivity gives us powerful new tools for finding information, expressing ourselves, and conversing with others. It also turns us into lab rats constantly pressing levers to get tiny pellets of social or intellectual nourishment.”
  • Using Incognito browsing, deleting our search histories, and doing what we need to do online without logging into our accounts.
  • Deleting or blocking browser cookies. For the uninitiated, many websites plant “cookies” (small text files) each time we visit them; those cookies are then used to determine what content to show us. Cookies can be deleted manually, and browser extensions are available that remove them. In some instances cookies are useful, so removal should be done with discretion. (A minimal sketch of how a tracking cookie works follows this list.)
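
To make the cookie mechanism concrete, here is a minimal sketch using Python's standard library; the cookie name and value are invented for illustration, and real trackers set many cookies and beacons at once:

```python
from http.cookies import SimpleCookie

# On your first visit, a site's response can include a Set-Cookie header
# carrying a unique ID for your browser (name and value are made up here).
cookie = SimpleCookie()
cookie["visitor_id"] = "a1b2c3"
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365  # persist for a year
print(cookie.output())
# Set-Cookie: visitor_id=a1b2c3; Max-Age=31536000

# On every subsequent visit, your browser sends the cookie back, so the
# site -- and any third-party tracker whose cookies it embeds -- can
# recognize you and link today's visit to all your earlier ones.
incoming = SimpleCookie("visitor_id=a1b2c3")
print(incoming["visitor_id"].value)  # 'a1b2c3' -- you have been recognized
```

Deleting that cookie, or browsing in a mode that discards it, breaks the link between visits, which is exactly what the suggestions above aim to do.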

Fish don’t know they are in the water, and we don’t know we are in a filter bubble unless we make the effort to (as David Bowie put it) leave the capsule — if you dare.

In shaping what we see, filter bubbles show us a distorted map, not the terrain. In so doing, they trick our brains into thinking that the map is reality. As technology improves and a publisher like The New York Times, say, gains the ability to show the same story to 100 different people in 100 different ways, the filter bubble deepens. We lose track of what is filtered and what is not as the news becomes tailored to cement our existing opinions. After all, everyone wants to read a newspaper that agrees with them.

Systems — be they people, cultures, or web browsers, to name a few examples — naturally have to filter information, and in doing so they reduce our options. Sometimes people make these decisions, sometimes cultures make them, and increasingly algorithms make them. As the speed of information flowing through these systems increases, filters will play an even more important role.

Understanding that what we see is not all there is will help us realize that we're living in a distorted world and remind us to take off the glasses.

For more information on filter bubbles, consider reading The Filter Bubble by Eli Pariser, So You’ve Been Publicly Shamed by Jon Ronson, The Shallows by Nicholas Carr, or The Net Delusion by Evgeny Morozov.

Trust Me, I’m Lying: Why Sites Like Gawker Manipulate You

“A newspaper is a business out to make money through advertising revenue. That is predicated on its circulation and you know what the circulation depends on. …”

— Raymond Chandler, The Long Goodbye

***

Ryan Holiday's book Trust Me, I'm Lying: Confessions of a Media Manipulator offers a penetrating look at the incentives of the media.

Holiday himself is a practitioner of the dark arts of media manipulation and uses these techniques to make a living.

“Usually, it is a simple hustle,” Holiday writes. “Someone pays me, I manufacture a story for them, and we trade it up the chain — from a tiny blog to Gawker to a website of a local news network to the Huffington Post to the major newspapers to cable news and back again, until the unreal becomes real. Sometimes I start by planting a story. Sometimes I put out a press release or ask a friend to break a story on their blog. Sometimes I ‘leak’ a document. Sometimes I fabricate a document and leak that. Really, it can be anything, from vandalizing a Wikipedia page to producing an expensive viral video. However the play starts, the end is the same: The economics of the Internet are exploited to change public perception — and sell product.”

For me, the most interesting part of the book was the history of the press, which begins with the Party Press, moves on to the Yellow Press and ends with the Modern Press (aka Subscription Press). Holiday gives us this history lesson to explain how news outlets sold their product over the years.

The Party Press

The earliest forms of newspapers were a function of political parties. These were media outlets for party leaders to speak to party members, to give them the information they needed and wanted. … These papers were not some early version of Fox News. They usually were one-man shops. The editor-publisher-writer-printer was the dedicated steward of a very valuable service to that party in his town. The service was the ability to communicate ideas and information about important issues. …

This first stage of journalism was limited in its scope and impact. Because of the size and nature of its audience, the party press was not in the news business. They were in the editorial business. It was a different time and style, one that would be eclipsed by changes in technology and distribution.

The Yellow Press

Newspapers changed the moment that Benjamin Day launched the New York Sun in 1833. It was not so much his paper that changed everything but his way of selling it: on the street, one copy at a time. He hired the unemployed to hawk his papers and immediately solved a major problem that had plagued the party presses: unpaid subscriptions. Day’s “cash and carry” method offered no credit. You bought and walked. The Sun, with this simple innovation in distribution, invented the news and the newspaper. A thousand imitators followed.

These papers weren’t delivered to your doorstep. They had to be exciting and loud enough to fight for their sales on street corners, in barrooms, and at train stations. Because of the change in distribution methods and the increased speed of the printing press, newspapers truly became newspapers. Their sole aim was to get new information, get it to print faster, get it more exclusively than their competition. It meant the decline of the editorial. These papers relied on gossip. …

… He (James Gordon Bennett) knew that the newspaper’s role was “not to instruct but to startle.” His paper was anti-black, anti-immigrant, and anti-subtlety. These causes sold papers—to both people who loved them for it and people who hated them for it. And they bought and they bought.

… The need to sell every issue anew each day creates a challenge I call the “One-Off Problem.” Bennett’s papers solved it by getting attention however they could.

The first issue of Bennett’s Herald looked like this: First page—eye-catching but quickly digestible miscellany; Second page—the heart of the paper, editorial and news; Third page—local; Fourth page—advertising and filler. There was something for everyone. It was short, zesty. He later tried to emphasize quality editorial instead of disposable news by swapping the first two pages. The results were disastrous. He couldn’t sell papers on the street that way.

The One-Off Problem shaped more than just the design and layout of the newspaper. When news is sold on a one-off basis, publishers can’t sit back and let the news come to them. There isn’t enough of it, and what comes naturally isn’t exciting enough. So they must create the news that will sell their papers. When reporters were sent out to cover spectacles and events, they knew that their job was to cover the news when it was there and to make it up when it was not.

Speaking of the markers of yellow journalism (the One-Off Problem), media historian W. Joseph Campbell, author of Yellow Journalism, wrote:

As practiced more than 100 years ago, yellow journalism was a robust, enterprising genre characterized by these practices and features:

  • the frequent use of multicolumn headlines that sometimes stretched across the front page.
  • a variety of topics reported on the front page, including news of politics, war, international diplomacy, sports, and society.
  • the generous and imaginative use of illustrations, including photographs and other graphic representations such as locator maps.
  • bold and experimental layouts, including those in which one report and illustration would dominate the front page. Such layouts sometimes were enhanced by the use of color.
  • a tendency to rely on anonymous sources, particularly in dispatches of leading correspondents.
  • a penchant for self-promotion, to call attention eagerly to the paper’s accomplishments. This tendency was notably evident in crusades against monopolies and municipal corruption.

As defined above and as practiced more than a century ago, yellow journalism could not be called predictable, boring, or uninspired — complaints of the sort that are not infrequently raised about U.S. newspapers in the early twenty-first century.

Does any of that sound familiar? It should. Just take a look at Gawker and The Huffington Post. It's the modern version of the One-Off Problem. Instead of trying to sell you a copy of the newspaper by shouting on a street corner, today's media want page views. In yellow journalism, as in today's page-view media, headlines and promotion mattered more than content.

So what happened after the yellow press? Holiday continues:

The Modern Stable Press

… Adolph S. Ochs, publisher of the New York Times, ushered in the next iteration of news. Ochs, like most great businessmen, understood that doing things differently was the way to great wealth. In the case of his newly acquired newspaper and the dirty, broken world of yellow journalism, he made the pronouncement that “decency meant dollars.”

He immediately set out to change the conditions that allowed Bennett, Hearst, Pulitzer, and their imitators to flourish. He was the first publisher to solicit subscriptions via telephone. He offered contests to his salesmen. He gave them quotas and goals for the number of subscribers they were expected to bring in.

He understood that people bought the yellow papers because they were cheap—and they didn’t have any other options. He felt that if they had a choice, they’d pick something better. He intended to be that option. First, he would match his competitors’ prices, and then he would deliver a paper that far surpassed the value implied by the low price.

It worked. When he dropped the price of the Times to one cent, circulation tripled in the first year. He would compete on content. He came up with the phrase “All the News That’s Fit to Print” as a mission statement for the editorial staff two months after taking over the paper. The lesser-known runner-up says almost as much: “All the World’s News, But Not a School for Scandal.”

Of course, the transition to the modern press wasn't immediate. The subscription model, however, better aligned the incentives of the reader and newspaperman. Subscriptions change everything because readers who are misled unsubscribe. Content, not headlines, ruled the day.

“With Ochs’s move,” Holiday writes, “reputation began to matter more than notoriety.” This was the era of the professionalization of journalism. “For the first time, it created a sense of obligation, not just to the paper and circulation, but also to the audience.”

While subscription journalism meant you didn't have to peddle papers on the street, that didn't make it a perfect system.

As the character Philip Marlowe observed in Raymond Chandler’s novel The Long Goodbye:

Newspapers are owned and published by rich men. Rich men all belong to the same club. Sure, there’s competition—hard tough competition for circulation, for newsbeats, for exclusive stories. Just so long as it doesn’t damage the prestige and privilege and position of the owners.

We've had a good run. For a long time, journalism was primarily sold via subscriptions (the stable-press model), but now we're moving quickly toward online à la carte offerings. Journalism is no longer selling a package. Now each story is like a mini paper hawked on a street corner in the 1840s, trying to be heard over all the other stories.

Eli Pariser wrote in The Filter Bubble:

Our bodies are programmed to consume fat and sugars because they’re rare in nature. Thus, when they come around, we should grab them. In the same way, we’re biologically programmed to be attentive to things that stimulate: content that is gross, violent, or sexual and that gossip which is humiliating, embarrassing, or offensive. If we’re not careful, we’re going to develop the psychological equivalent of obesity. We’ll find ourselves consuming content that is least beneficial for ourselves or society as a whole.

… Each article ascends the most-forwarded lists or dies an ignominious death on its own…. The attention economy is ripping the binding, and the pages that get read are the pages that are frequently the most topical, scandalous, and viral.

Think about how you consume media today. You don't read one newspaper or blog; you read an assortment of many newspapers and blogs. And you don't pay for any of it. The trust relationship is fractured. Competition centers on who can create the most-read story. That means journalism becomes about what spreads, not what's good.

MIT Media Studies Professor Henry Jenkins gives publishers and companies the following advice: “if it doesn't spread, it's dead.” Spreading is traffic. Traffic is money.

So what spreads?

Jonah Berger and Katherine Milkman of the Wharton School examined over 7,000 articles that made it onto the New York Times most-emailed list. They conclude:

Virality is partially driven by physiological arousal. Content that evokes high-arousal positive (awe) or negative (anger or anxiety) emotions is more viral. Content that evokes low-arousal, or deactivating, emotions (e.g., sadness) is less viral.

Basically, virality is determined by how strongly an article arouses the reader's emotions. Not all extreme emotions spread: sadness doesn't spread, but anger does. If anger spreads and financial incentives are aligned to page views, more of our journalism will move toward what spreads.

Something analogous to Gresham's Law can be found in the One-Off Problem. If each story has to find its own audience (i.e., it's no longer sold as part of a bundle) and compensation is derived from page views, we can expect incentives to favor a lot of low-cost articles with catchy headlines. In the end, we're likely to get the stories we want to read, not the ones we should read. Unless it is part of a subscription, we'll likely lose the in-depth reporting we've come to expect from the old media guard.

(Side note: I found the first part of Holiday's book — where he explains how page-view media works and how he manipulated the system — pretty good. The book is not without its “controversy,” though. Overall, it's a great read for anyone interested in the economics of new media.)

(Update: a previous version of this post (ironically) cited a site that accused Holiday of misquoting a study. Holiday contacted me, and it appears that he used an older version of the study, which did, in fact, contain the quote he referenced in his book. The other site was wrong, and my verification process was lax. My bad.)

***

“If you’re not paying for something,
you’re not the customer; you’re the product being sold.”

— Andrew Lewis

Still curious? If you want to know more about how the media is manipulated, read Trust Me, I'm Lying: Confessions of a Media Manipulator.

The Filter Bubble — What the Internet is Hiding From You

Just “googling it” might not be such a great idea after all.

The Filter Bubble, by Eli Pariser, puts forth an argument that we're increasingly trapped inside an algorithm that filters our news based on what it thinks is best for us.

Computers and the algorithms they run are increasingly aware of the things we seem to like. These algorithms even tailor results so we get more of what we like and less of what we don't like. That means two people googling the same thing are likely to see different results.
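
To make that concrete, here is a toy sketch; the data and the scoring rule are invented for illustration, and no real search engine works this simply. It shows how two users issuing the identical query can see different orderings once their histories differ:

```python
# A toy illustration, not any real engine's algorithm: the same query,
# re-ranked per user by word overlap with each user's browsing history.

RESULTS = {
    "bp": [
        "BP share price and investment outlook",
        "BP oil spill and environmental damage assessed",
    ],
}

def personalized_search(query, history):
    def score(result):
        # Count words the result shares with each page in the user's history.
        words = set(result.lower().split())
        return sum(len(words & set(page.lower().split())) for page in history)
    return sorted(RESULTS[query], key=score, reverse=True)

investor = ["share price rally", "investment tips"]
activist = ["oil spill cleanup", "environmental damage report"]

print(personalized_search("bp", investor)[0])  # the investing story leads
print(personalized_search("bp", activist)[0])  # the spill story leads
```

The query is identical; only the histories differ, yet that alone hands the two users different front pages. It is Pariser's BP example in miniature.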

The problem with this, Pariser argues, is that you're not making a conscious choice to have your results filtered — it happens without your knowledge or consent. And that causes a whole host of issues, of which Pariser is primarily concerned with the social and political implications.

When technology's job is to show you the world, it ends up sitting between you and reality, like a camera lens.

“If we want to know what the world really looks like,” Pariser writes, “we have to understand how filters shape and skew our view of it.” It's useful to borrow from nutrition to illustrate this point:

Our bodies are programmed to consume fat and sugars because they’re rare in nature. Thus, when they come around, we should grab them. In the same way, we’re biologically programmed to be attentive to things that stimulate: content that is gross, violent, or sexual and that gossip which is humiliating, embarrassing, or offensive. If we’re not careful, we’re going to develop the psychological equivalent of obesity. We’ll find ourselves consuming content that is least beneficial for ourselves or society as a whole.

Consider for a moment where we are headed. If Google knows that I'm a Democrat or a Republican, it could filter my news to show me only the stories I'm predisposed to agree with. Based on its guess as to my education level, it might then tailor an article's words and language to maximize the impact on me. In this world, I see only things I agree with, written so that I comprehend them easily, and that's a problem. Google might know that I don't read anything about Republican tax cuts or Democratic spending, so it might simply filter those articles out. Reality, as I see it, becomes what the lens shows me.

If you're not paying for something, you're not the customer; you're the product being sold.

— Andrew Lewis

When asked about the prospects for important but unpopular news, MIT Media Lab's Nicholas Negroponte smiled. On one end of the spectrum, he said, is sycophantic personalization: “you're so great and wonderful, and I'm going to tell you exactly what you want to hear.” On the other end is the parental approach: “I'm going to tell you this whether you want to hear it or not, because you need to know.” Currently, he argues, we're headed in the sycophantic direction.

Whether you believe the book's conclusions are wholly convincing or not, it is worth thinking about. If nothing else, it is thought-provoking.

(If you want a search engine that won't track you, try DuckDuckGo.com.)