Could the surge in mobile destroy what’s best about the Internet?

Back in 2004 I heard a presentation by an eminent media analyst at a conference of broadcasters in New York. He talked about “the capacity explosion”—the wild proliferation of broadband channels into and out of homes—and then offered remarkable observations about storage: The capacity to keep movies, video, and data of all kinds at home was rising 72 percent per year. What it cost to store 57 movies then would cover more than 2,000 in five years.

That sounded like fun, but my interest was in the political implications of his analysis, which were vast: A degree of communication independence for ordinary people that would have been historically unprecedented.

With immense in-house storage capabilities and blazing upload speeds, the distinction between media consumer and producer would vanish. Each of us would possess our own hope chests of content and become autonomous centers of intelligence and initiative, drawing freely from resources that lay entirely within our control, captains of our destiny, free to convene our own networks of communicants.

Too bad this forecast was almost entirely wrong. It didn’t foresee the advent of the Cloud. Now we have apps based God-knows-where that own our family photo albums and music playlists, and that induce businesses to entrust even back-office operations that were always handled in-house to invisible helpmates deriving heaven-knows-what shadowy benefits from the assistance they proffer.

The upshot is that we are infinitely less, not more, independent of our informational masters than we were a decade ago.

Now, a penetrating research paper presented this month at an academic gathering in Montreal suggests that perhaps the most powerful force in what can only be called the subordination of the vast majority of Internet users—their continuing downward transformation into mere consumers with only meager access to the creative potential of the digital network—is the rampant growth of mobile technology.

The paper, titled “The Emerging Mobile Internet Underclass,” by Philip Napoli of Rutgers and Jonathan Obar of the University of Ontario Institute of Technology, opens with a surprising observation from Sergey Brin, co-founder of Google.

In a 2013 TED talk, Brin said he found his smartphone “emasculating.”

The argument is that though limber and endlessly convenient, mobile technology offers dramatically inferior Internet access to that available through PCs: slower speeds, reduced functionality, far less content availability (as of 2012, only an estimated 10 percent of the Web was mobile-ready).

Continue reading

The sliming of George Zimmerman

If it had happened to anybody but George Zimmerman, more people might have cared. But when your own lawyer calls you America’s most hated man, you can’t expect a groundswell of sympathy, even when an immensely powerful broadcaster slimes you.

Zimmerman was the neighborhood watch zealot in Florida who picked a fight the evening of Feb. 26, 2012, with 17-year-old Trayvon Martin, whom he mistook for a prowler. Martin rose to the bait and they fought; Zimmerman shot him dead and claimed self-defense.

Initially police in Sanford, Fla., did nothing, reasoning that the state’s “stand your ground” law authorized Zimmerman to shoot rather than back off the confrontation he’d provoked. After a public furor—the town has a legacy of racism, Martin was African-American and Zimmerman isn’t—he was charged with second-degree murder, only to be acquitted at trial.

I thought the acquittal was preposterous. But my intention here isn’t to rehash the unavenged wrong Zimmerman did. It’s to examine the lesser-known wrong that was then done to him—and how it exemplifies the license that U.S. courts have given news media to mangle facts and defame powerless individuals.

On five occasions, NBC and its Miami station WTVJ broadcast excerpts from a recording of …

Continue reading

As bad as Facebook’s experiment on its members seems, the reality was worse

What’s the worst thing about the news that Facebook hosted what The Atlantic aptly called a “secret mood manipulation experiment,” conducted on 689,000 unwitting members of its network? That’s hard to say. There’s so much bad to choose from.

First, the nature of the experiment: It was cruel. Researchers from Facebook itself, Cornell, and the University of California, San Francisco, were looking into whether emotional states could be spread via news shared by online media. That is, without direct personal contact.

The vehicle for this “emotional contagion” was to be Facebook’s News Feed: By tinkering with the feed, researchers altered the balance of positive and negative items people were exposed to. Then, by peeking at the messages that those same people subsequently posted—some 3 million during the week in January 2012 when the experiment was conducted—and categorizing them as positive or negative, the researchers could determine to what degree the members had been influenced.
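To make the mechanics concrete, here is a minimal sketch in Python of the scoring logic described above. It is an illustration only, not the researchers’ actual code: the word lists, sample posts, and function names are invented for this sketch, and the published study scored posts with the LIWC word-count lexicon rather than toy lists like these.

```python
# Toy sketch of the experiment's basic logic (illustrative, not the study's code):
# classify each post as positive or negative by counting emotion words, then
# compare the posting behavior of the altered-feed group against a control group.

# Hypothetical word lists; the actual study used the LIWC lexicon.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def classify(post: str) -> str:
    """Crudely label a post by comparing counts of positive and negative words."""
    words = post.lower().split()
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def positivity_rate(posts: list[str]) -> float:
    """Fraction of a group's posts that classify as positive."""
    if not posts:
        return 0.0
    labels = [classify(p) for p in posts]
    return labels.count("positive") / len(labels)

# If members shown fewer positive items go on to post measurably fewer positive
# words than controls do, that gap is the claimed "emotional contagion" effect.
control_posts = ["what a wonderful day", "love this song", "traffic was awful"]
treated_posts = ["so sad about the news", "terrible week", "hate mondays"]
print(positivity_rate(control_posts) - positivity_rate(treated_posts))
```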

Not surprisingly, researchers found what they called “experimental evidence for massive-scale contagion via social networks.”

Now, this isn’t a very nice thing to do to people—deliberately goading them into an emotional response just to show that you can. Worse still, half of the lab rats in this exercise were given an overabundance not of joyous news to make them happy, but of stories featuring dreary, gloomy, negative things. That feels very much like intentional infliction of emotional distress.

(Finding evidence that people share such misery, that it constitutes a “contagion,” apparently interests scholars. Why? I can’t imagine. Everybody’s been saddened by dismal news and told others about it. But some social scientists build careers by confirming the obvious.)

Second, not only was this cruel, it was deceptive. The whole appeal of Facebook’s News Feed is that it’s supposed to constitute a filtered selection of news that has been endorsed by the member’s community of “friends.” It’s meant to be what the people you care about seem to care about.

This feed wasn’t that at all. It wasn’t what your friends said was worth knowing. If you attached importance to it because you believed your network of like-minded souls had commended it to your attention, you were being tricked. Other …

Continue reading

Freed POW Bergdahl is an attractive scapegoat for a war the U.S. public has abandoned


No accounting has been demanded for the U.S. wars in Iraq and Afghanistan. The architects of those disasters have shuffled off-stage now to their pensions and honoraria; some occasionally return to the public forum as learned commentators, as if their lethal stupidities of a decade ago now qualify them as experts on the bad choices to come.

Meanwhile the real leaders—Cheney, Bush, Rumsfeld—are hanging back, presumably hoping the next wave of revisionism will restore to them the reputations for wisdom and courage they imagine they deserve. Hey, why not? Their victims are either dead or ignored. Even Nixon was remembered for statesmanship by the time he died.

In this country, we don’t have truth and reconciliation commissions, which elsewhere invite the victims of social calamities to talk publicly, and which try to restore dignity to those who suffered and to lay the seeds for futures in which such debacles won’t recur.

Instead, the ingrained U.S. response to catastrophes—such as laying waste to other lands without cause—is to change the channel. In Scarlett O’Hara’s words, “Tomorrow’s another day.” Or as Kinky Friedman put it, “Let Saigons be bygones.”

When the urge for retrospection arises, chances are it won’t be to demand accountability. It’ll be to seek scapegoats. Hence the Bowe Bergdahl affair, an unfolding chapter in how the longest war in U.S. history is being imagined.

Bergdahl was the only U.S. prisoner of war left in Afghanistan. He was freed last month after being held by the Taliban under often harrowing conditions for almost five years. When he was captured he was a 23-year-old private first class (he’s only a sergeant now because he was promoted while in Taliban hands). Actually, he was a home-schooled grunt from Idaho who had a longing for adventure, a flair for wordplay, and a loathing for the miseries of war.

The initial jubilance surrounding his release quickly subsided. The first buzz-killer was the exchange that freed him, in which five ex-Taliban officials were released from Guantanamo. Obama critics condemned them immediately as “some of the worst outlaws in the U.S. war on terror,” or, as Sen. John McCain declared, “the five biggest murderers in world history,” killers with U.S. blood on their hands.

That’s quite a stretch. Actually, three of the five had been in U.S. hands since November 2001 and the other two since …

Continue reading

On Google, the curse of the Permanent Record, and the right to be forgotten


Once, we came of age under the shadow of something called a Permanent Record. Nobody ever actually saw one, but it was a scary thing, and as youngsters we understood we had to keep our own permanent records clean, since any stains on them could do lifelong damage.

Plainly, the idea of an authoritative, ineradicable ledger on individual behavior is a powerful one. Widespread too. You see it in everything from the divine Book of Life to the gift list kept by Santa, who knows if you’ve been naughty or nice.

That permanent record meant somebody was paying attention, which was good, but it was also a brooding and oppressive background presence, since it enabled even trivial sins to curse our futures.

A good thing it was largely mythic. Back then, actual record-keeping was spotty and forgetful, and technology had zero ability to corral into some all-knowing compendium the manifold traces we leave as we make our way through life.

No longer. Welcome to the digital age. Its mighty search engines have spawned a virtual permanent record for millions of individuals. It’s updated constantly, lasts forever, and is in full-time public view.

What gets in it and with what prominence—those are mysteries, depending on the alchemy of particular search engines. Generally, it seems, they suck up most anything about someone that was published or that resides in Internet-accessible public records. (The search engines don’t yet scour social media like Twitter and Facebook.)

That means the fraternity house dustup that led to a sleepover in jail, or the rude remark at a political rally, or any of a thousand missteps and embarrassments that in a pre-modern age would have faded into oblivion—the debris of what Justice John Paul Stevens called the “practical obscurity” we used to inhabit—remain vivid, alive and, potentially, toxic.

Hence the importance of last month’s ruling by Europe’s highest court. It authorizes people to demand that links to material that threatens their privacy be scrubbed from search results.

The case involves Google, the California-based colossus that handles roughly 90 percent of Europe’s Internet searches. It was brought by a Spaniard who challenged a link to a 1998 item in a Catalan newspaper about the auction of his home, which was repossessed to pay off debts he owed. He reasoned that the matter had been resolved ages ago and there was no reason people who googled …

Continue reading

Has U.S. journalism lost its nerve?

Much of normal news, the routine patter that fills our screens and spills from the airwaves, is a chummy co-production of authorized sources and compliant scribes. The rituals of normal news ensure that the public agenda is dominated by the concerns and perspectives of the powerful, whose priorities typically lead the newscasts.

Fortunately, that’s not all our news media do. We also have parallel traditions, among them a journalism of defiance. That’s when reporters ferret out and make public newsworthy realities that people in power would rather see ignored and sometimes even make it illegal to expose.

We’re in an era of spectacularly audacious disclosures of official secrets—commensurate with the most audacious expansion of official secrecy in the history of this or any country. Since Wikileaks, the online anti-secrecy network, posted in 2010 the classified gunsight footage of Iraqi civilians being slaughtered by a U.S. helicopter, news media worldwide have showcased stunning disclosures of U.S. secrets and the shadowy infrastructure through which the unprecedented post-9/11 regime of surveillance and data collection has been sustained.

The counterattack has been ferocious: The soldier who was Wikileaks’ source, Chelsea Manning, is serving a 35-year sentence in military prison, and the mastermind who brokered the release to the news media, Julian Assange, is under de facto house arrest in London.

Meantime, top media continue to feast on secrets served up by Wikileaks’ successor, ex-U.S. intelligence analyst Edward Snowden. They include astonishing revelations about government data sweeps, penetration of the tech industry and the overreach of National Security Agency electronic snooping.

Amazing stuff. Political leaders continue to denounce Snowden as a spy, but the public isn’t convinced. One of the few surveys of broad opinion on the matter, a Quinnipiac University poll last year, found that by a huge 55-34 percent margin voters regard Snowden as a whistleblower, not a traitor, and that by 45-40 percent people believe official anti-terrorism efforts go too far in restricting civil liberties.

So you might think that U.S. journalists would feel emboldened: After all, here’s news of vast import, purloined in the name of civic purpose with evidence of public support. Seems like a sturdy basis on which to challenge the tired rituals of normal news, to re-energize that parallel tradition of defiance and independent truth-seeking.

A surprising new survey suggests this isn’t how today’s journalists see things. It’s the latest in a series of polls conducted every 10 years since 1971 by Indiana University researchers. What it found was a demoralized profession, one that has lost its nerve. Respondents are convinced the news industry is generally heading in the wrong direction and that its biggest problem is “declining profits.”

Most remarkable are signs of a dramatically growing rejection of the very reporting techniques that have nourished the journalism of defiance in recent years.

Consider this question: Might a reporter, “on occasion,” be justified in using “confidential business or government documents without authorization”? That means newsworthy information you’re not supposed to have.

Fewer than 58 percent of the 1,080 respondents in the 2013 poll approved, a major decline from nearly 82 percent in 1992.

Continue reading

Net Neutrality Debate a Reminder that Content Was Never King

Media consumers are usually too busy paying attention to content to consider the channels through which it arrives. Yet the nature of those channels and the rules governing them have historically had a huge, unacknowledged role in creating and shaping what we read, watch and listen to.

The motion picture business was founded by people who owned tiny movie houses; the future moguls knew nothing about making films, but they owned the exhibition outlets and needed content to sell tickets. So they learned, and they shot, and they founded the studios. Channel preceded content, and gave birth to Hollywood.

Channel control has long prefigured media development. In the electronic age, wherever the creative artists went, the engineers had gotten there first.

Broadcasting started out as the late-1920s brainchild of people who made and sold radio sets. They wanted to give customers a reason to buy their receivers, so they began making programs and transmitting them over the air. First came the distribution channels; content followed.

FM radio languished for 30 years until the 1960s, when regulators told station owners they could no longer fill the high-quality FM band with the same programs they were putting out on scratchy AM. Suddenly huge bandwidth opened up, perfect for audio engineered for clarity—and the revolution in alternative rock was born.

And the feds’ 1962 insistence that all TV sets be equipped to receive signals broadcast in the UHF range—another 60-some channels on top of the four or five that most consumers received—broke the network stranglehold on TV broadcasting and started the industry down the road to the multi-channel cable explosion.

That brief history lesson goes some way toward explaining why today’s controversy over so-called net neutrality matters.

Net neutrality is the policy that has barred the companies that furnish Internet connections from playing favorites. It means Internet …

Continue reading