As bad as Facebook’s experiment on its members seems, the reality was worse

What’s the worst thing about the news that Facebook hosted what The Atlantic aptly called a “secret mood manipulation experiment,” conducted on 689,000 unwitting members of its network? That’s hard to say. There’s so much bad to choose from.

First, the nature of the experiment: It was cruel. Researchers from Facebook itself, Cornell, and the University of California, San Francisco, were looking into whether emotional states could be spread via news shared by online media. That is, without direct personal contact.

The vehicle for this “emotional contagion” was to be Facebook’s News Feed: By tinkering with the feed, researchers altered the balance of positive and negative items people were exposed to. Then, by peeking at the messages that those same people subsequently posted—some 3 million during the week in January 2012 when the experiment was conducted—and categorizing them as positive or negative, the researchers could determine to what degree the members had been influenced.

Not surprisingly, researchers found what they called “experimental evidence for massive-scale contagion via social networks.”

Now, this isn’t a very nice thing to do to people—deliberately goad them into an emotional response just to show that you can. Worse still, half of the lab rats in this exercise were given an overabundance not of joyous news to make them happy, but of stories featuring dreary, gloomy, negative things. That feels very much like what qualifies as intentional infliction of emotional distress.

(Finding evidence that people share such misery, that it constitutes a “contagion,” apparently interests scholars. Why? I can’t imagine. Everybody’s been saddened by dismal news and told others about it. But some social scientists build careers by confirming the obvious.)

Second, not only was this cruel, it was deceptive. The whole appeal of Facebook’s News Feed is that it’s supposed to constitute a filtered selection of news that has been endorsed by the member’s community of “friends.” It’s meant to be what the people you care about seem to care about.

This feed wasn’t that at all. It wasn’t what your friends said was worth knowing. If you attached importance to it because you believed your network of like-minded souls had commended it to your attention, you were being tricked. Other …

Freed POW Bergdahl is an attractive candidate for scapegoat in a war the U.S. public has abandoned


No accounting has been demanded for the U.S. wars in Iraq and Afghanistan. The architects of those disasters have shuffled off-stage now to their pensions and honoraria; some occasionally return to the public forum as learned commentators, as if their lethal stupidities of a decade ago now qualify them as experts on the bad choices to come.

Meanwhile the real leaders—Cheney, Bush, Rumsfeld—are hanging back, presumably hoping the next wave of revisionism will restore to them the reputations for wisdom and courage they imagine they deserve. Hey, why not? Their victims are either dead or ignored. Even Nixon was remembered for statesmanship by the time he died.

In this country, we don’t have truth and reconciliation commissions, which elsewhere invite the victims of social calamities to talk publicly, and which try to restore dignity to those who suffered and to lay the seeds for futures in which such debacles won’t recur.

Instead, the ingrained U.S. response to catastrophes—such as laying waste to other lands without cause—is to change the channel. In Scarlett O’Hara’s words, “Tomorrow’s another day.” Or as Kinky Friedman put it, “Let Saigons be bygones.”

When the urge for retrospection arises, chances are it won’t be to demand accountability. It’ll be to seek scapegoats. Hence the Bowe Bergdahl affair, an unfolding chapter in how the longest war in U.S. history is being imagined.

Bergdahl was the only U.S. prisoner of war left in Afghanistan. He was freed last month after being held by the Taliban under often harrowing conditions for almost five years. When he was captured he was a 23-year-old private first class (he’s only a sergeant now because he was promoted while in Taliban hands). Actually, he was a home-schooled grunt from Idaho who had a longing for adventure, a flair for wordplay, and a loathing for the miseries of war.

The initial jubilance surrounding his release quickly subsided. The first buzz-killer was the exchange that freed him, in which five ex-Taliban officials were released from Guantanamo. Obama critics condemned them immediately as “some of the worst outlaws in the U.S. war on terror,” or, as Sen. John McCain declared, “the five biggest murderers in world history,” killers with U.S. blood on their hands.

That’s quite a stretch. Actually three of the five had been in U.S. hands since November 2001 and the other two since …

On Google, the curse of the Permanent Record, and the right to be forgotten


Once, we came of age under the shadow of something called a Permanent Record. Nobody ever actually saw one, but it was a scary thing, and as youngsters we understood we had to keep our own permanent records clean, since any stains on them could do lifelong damage.

Plainly, the idea of an authoritative, ineradicable ledger on individual behavior is a powerful one. Widespread too. You see it in everything from the divine Book of Life to the gift list kept by Santa, who knows if you’ve been naughty or nice.

That permanent record meant somebody was paying attention, which was good, but it was also a brooding and oppressive background presence, since it enabled even trivial sins to curse our futures.

A good thing it was largely mythic. Back then, actual record-keeping was spotty and forgetful, and technology had zero ability to corral into some all-knowing compendium the manifold traces we leave as we make our way through life.

No longer. Welcome to the digital age. Its mighty search engines have spawned a virtual permanent record for millions of individuals. It’s updated constantly, lasts forever, and is in full-time public view.

What gets in it and with what prominence—those are mysteries, depending on the alchemy of particular search engines. Generally, it seems, they suck up most anything about someone that was published or resides in Internet-accessible public records. (The search engines don’t scour social media like Twitter and Facebook, yet.)

That means the fraternity house dustup that led to a sleepover in jail, or the rude remark at a political rally, or any of a thousand missteps and embarrassments that in a pre-modern age would have faded into oblivion—the debris of what Justice John Paul Stevens called the “practical obscurity” we used to inhabit—remain vivid, alive and, potentially, toxic.

Hence the importance of last month’s ruling by Europe’s highest court. It authorizes people to demand that links to material that threatens their privacy be scrubbed from search results.

The case involves Google, the California-based colossus that handles roughly 90 percent of Europe’s Internet searches. It was brought by a Spaniard who challenged a link to a 1998 item in a Catalan newspaper about the auction of his home, which was repossessed to pay off debts he owed. He reasoned that the matter had been resolved ages ago and there was no reason people who googled …

Has U.S. journalism lost its nerve?

Much of normal news, the routine patter that fills our screens and spills from the airwaves, is a chummy co-production of authorized sources and compliant scribes. The rituals of normal news ensure that the public agenda is dominated by the concerns and perspectives of the powerful, whose priorities typically lead the newscasts.

Fortunately, that’s not all our news media do. We also have parallel traditions, among them a journalism of defiance. That’s when reporters ferret out and make public newsworthy realities that people in power would rather see ignored, and sometimes even make it illegal to expose.

We’re in an era of spectacularly audacious disclosures of official secrets—commensurate with the most audacious expansion of official secrecy in the history of this or any country. Since Wikileaks, the online anti-secrecy network, posted in 2010 the classified gunsight footage of Iraqi civilians being slaughtered by a U.S. helicopter, news media worldwide have showcased stunning disclosures of U.S. secrets and the shadowy infrastructure through which the unprecedented post-9/11 regime of surveillance and data collection has been sustained.

The counterattack has been ferocious: The soldier who was Wikileaks’ source, Chelsea Manning, is doing 35 years in federal prison, and the mastermind who brokered the release to the news media, Julian Assange, is under de facto house arrest in London.

Meantime, top media continue to feast on secrets served up by Wikileaks’ successor, ex-U.S. intelligence analyst Edward Snowden. They include astonishing revelations about government data sweeps, penetration of the tech industry and the overreach of National Security Agency electronic snooping.

Amazing stuff. Political leaders continue to denounce Snowden as a spy, but the public isn’t convinced. One of the few surveys of broad opinion on the matter, a Quinnipiac University poll last year, found that by a huge 55-34 percent margin voters regard Snowden as a whistleblower, not a traitor, and that by 45-40 percent people believe official anti-terrorism efforts go too far in restricting civil liberties.

So you might think that U.S. journalists would feel emboldened: After all, here’s news of vast import, purloined in the name of civic purpose with evidence of public support. Seems like a sturdy basis on which to challenge the tired rituals of normal news, to re-energize that parallel tradition of defiance and independent truth-seeking.

A surprising new survey suggests this isn’t how today’s journalists see things. It’s the latest in a series of polls conducted every 10 years since 1971 by Indiana University researchers. What it found was a demoralized profession, one that has lost its nerve. Respondents are convinced the news industry is generally heading in the wrong direction and that its biggest problem is “declining profits.”

Most remarkable are signs of a dramatically growing rejection of the very reporting techniques that have nourished the journalism of defiance in recent years.

Consider this question: Might, “on occasion,” a reporter be justified in using “confidential business or government documents without authorization”? That means newsworthy information you’re not supposed to have.

Fewer than 58 percent of the 1,080 respondents in the 2013 poll approved, a major decline from nearly 82 percent in 1992.

Net Neutrality Debate a Reminder that Content Was Never King

Media consumers are usually too busy paying attention to content to consider the channels through which it arrives. Yet the nature of those channels and the rules governing them have historically had a huge, unacknowledged role in creating and shaping what we read, watch and listen to.

The motion picture business was founded by people who owned tiny movie houses; the future moguls knew nothing about making films, but they owned the exhibition outlets and needed content to sell tickets. So they learned, and they shot, and they founded the studios. Channel preceded content, and gave birth to Hollywood.

Channel control has long prefigured media development. In the electronic age, wherever the creative artists went, the engineers had gotten there first.

Broadcasting started out as the late 1920s brainchild of people who made and sold radio sets. They wanted to give customers a reason to buy their receivers, so they began making programs and transmitting them over the air. First came the distribution channels; content followed.

FM radio languished for 30 years until the 1960s, when regulators told station owners they could no longer fill the high-quality FM band with the same programs they were putting out on scratchy AM. Suddenly huge bandwidth opened up, perfect for audio engineered for clarity — and the revolution in alternative rock was born.

And the feds’ 1962 insistence that all TV sets be equipped to receive signals broadcast in the UHF range—another 60-some channels on top of the four or five that most consumers received—broke the network stranglehold on TV broadcasting and started the industry down the road to the multi-channel cable explosion.

That brief history lesson goes some way toward explaining why today’s controversy over so-called net neutrality matters.

Net neutrality is the policy that has barred the companies that furnish Internet connections from playing favorites. It means Internet …

Saving Sources: Time to Stand Up

Introductory Remarks, 8th Annual Reva and David Logan Symposium on Investigative Reporting, organized by the Investigative Reporting Program of the Graduate School of Journalism, University of California, Berkeley, April 26, 2014

Welcome to Berkeley, where we’re observing the 50th anniversary of the Free Speech Movement, which is a natural segue into the theme of this year’s symposium, which focuses on the beleaguered condition of news sources.

This is something I’ve been talking and writing about for the past few years, so I’m very glad to be able to take a few minutes to tee this up before I hand off to the organizer and star of this gathering, Lowell Bergman.

I have three main points:

1. Press freedom is meaningless without source freedom.

2. Neither the media, nor the courts, nor even our frayed system of civic education has ever assigned sources the importance and respect they deserve.

3. And finally, the media need to step up institutionally for their sources.

1. To my first point, which should be obvious but apparently isn’t: press freedom …

Paying Reporters for Online Traffic Could Take a Toll

How news reporters are paid has always been a matter of some puzzlement, not least among reporters themselves. That’s partly because the value of what they do is hard to put a price on, and partly because it’s hard to determine to what extent their work requires subject knowledge, experience, sweat and even artistry.

And it’s partly because of a fundamental business uncertainty: How much does a given reporter contribute to the commercial success of the enterprise?

In the old days the model was straightforward: Publishers figured a more robust editorial product would mean more readers and a more hospitable space for advertisers to make their pitches. That meant higher ad rates and, the logic went, more revenue.

As for wages, though, the model wasn’t much help. The newsroom was a mildly anarchic micro-economy. It lavished money on supposed stars without evidence they brought in discernible payback, while the rest of the staff labored under a sluggish and generally stingy compensation system that rewarded longevity over performance.

The digital revolution promised to replace alchemy with science, by offering measurement tools to enable publishers to figure out what particular stories are actually worth—and, working backwards, what individual journalists should be paid.

In theory, the logic of calibrated journalism rests on the fact that both the costs and revenues of online news are potentially knowable: News managers already have a reasonable idea what producing a particular story costs. Now they could also forecast the revenue it’s likely to generate, since ads are being placed and priced with increasingly surgical precision, depending on where and alongside what they appear.

Knowing revenues and costs, the result would be a fairly reliable profit-and-loss projection for the assignment the editor was contemplating. The budget-minded editor would direct coverage accordingly.

We’re not quite there yet, but now the news business is taking a further step in that direction through a surge of payment schemes that reward reporters for the vigor of their online engagement. A few weeks ago The Oregonian, the respected Portland daily, created a stir by introducing a compensation plan to pay journalists for the traffic their postings draw.