Maybe media shaming isn’t always a bad thing, especially if injustice otherwise goes unanswered

Media shaming is hot. It’s the new spectator sport in which hapless people say or do something that unexpectedly provokes general wrath and get their skin torn off by waves of online abuse from thousands of furious, hateful strangers.

Justine Sacco was among the early headliners. She was a New York public relations practitioner who tweeted, as she boarded a plane for Cape Town in December 2013, an ironic remark about how she probably wouldn’t get AIDS in South Africa because she was white. Within hours Sacco was inundated with ferocious, Twittered scorn and denunciation, and wound up losing her job.

Now comes Walter Palmer, the hapless dentist from Minnesota, who killed a lion in Zimbabwe in July. It was named Cecil, apparently had a devoted following, and was allegedly lured illegally from a wildlife sanctuary so the hunter could put an arrow in it. Palmer was the subject of 670,000 tweets in the first 24 hours after he owned up to shooting Cecil.

These cases and the many others that the feverish pace of online chatter is churning up involve misdeeds, often trivial, for which people are vilified and insulted, sometimes with serious consequences. They aren’t charged formally with wrongdoing that they can deny or explain. They can’t rely on a forum where they can argue they did no real harm, they can’t offer to remedy their wrongdoing, they can’t appeal to a disinterested panel.

If they could do that, they’d be in the realm of guilt. That’s where wrongs are clearly identified and punishment is pronounced after evidence is heard, explanations are offered, and some reflection is given to what might be necessary to set things right.

But this isn’t the republic of guilt, it’s the empire of shame. Both are mechanisms by which people are held accountable for wrongs, but they’re very different. Guilt is a response to a wrongful act, while shame instead blankets the person who appears to have done wrong with moral condemnation.

Guilt can be mitigated by showing that the action didn’t do much harm or that the harm was unintended. Any response by the wrongdoer that reduces the harm is pertinent. Punishment that seems to even the score, makes whoever was hurt whole again, ensures the wrong doesn’t continue or isn’t repeated—those are all part of a venerable formula for redress that seems rational and fair.

Shame is different. It’s about a lack of moral worth. The crook who robs the bank may be guilty, but the one who mugs the bank teller is shamed. In the old days, guilt got you flogged, shame got you shunned.

Shame can’t be relieved; it must be suffered and endured and, at best, can in time be forgiven, through an accumulation of evidence that the person who was shamed has done the prescribed penance.

Shaming rituals have elements about them that are pre-modern, almost tribal, and which are hard to square with fairness and proportionality. They constitute punishment without trial, inflicted by anonymous strangers acting under standards of their own, and they trigger reprisals that may be grossly excessive. (Why was Sacco fired? Should Palmer have been hounded into suspending his practice?)

That said, what should the response be to wrongdoing in instances when the rational machinery of deciding guilt isn’t up to the job?

Maybe, sometimes, shame has its place.

“Scorn or shaming are important in reaffirming the boundaries of what is considered acceptable behavior and helping ensure that people adhere to them,” Continue reading

Online news undergoes a reprofessionalization. Amen.

For more than a decade now, a steady refrain in the online media has been that the traditional practice of journalism was dying, the victim of technological advance and cultural insurgency.

It wasn’t just the economic collapse of the legacy press. The most widely followed online news sites were increasingly populated by articles, pictures, and audio tracks selected not by living editors but by continuous, automated samplings of user behavior and mathematically ingenious hunches about reader interest.

News itself was being reimagined as no longer mainly the job of salaried reporters. It was more and more the work of impassioned civilians equipped with handheld devices and driven by curiosity and a commitment to public illumination.

As a political matter, that meant the day when a newsroom elite superintended civic awareness was over. News consumers would now rely on their social media pals for guidance on what to pay attention to, and the power to create and sustain networks of attention was now in the hands of ordinary people.

This was all thought to be a very good thing, since it not only turned a hugely expanded population of people into news sources, it also spread the net much wider, so that realities that might once have escaped notice became news. “Journalism gets better the more people who do it,” as one writer put it, paraphrasing New York University media theorist Jay Rosen.

The outlook was buoyant, even euphoric, and it was tempting to overlook what might be lost in the rush to the online news millennium—accuracy and taste, for starters. More troubling, where was the quality? Even with a decade of citizen mobilization behind us, it’s hard to point to genuinely good journalism that was truly attributable to this turbo-fed democratization, no matter how lavishly admired it has been.

Now, that’s not to say that great news tips and evocative videos haven’t come from civilians with the right tools, in the right place, at the right time. We’ve seen that in the past year in the awareness of police killings of young black men in this country. Nor is that to ignore instances of exuberantly successful mass mobilization – such as when some 20,000 Britons scoured the personal spending of their parliamentarians in 2009 under the auspices of the Guardian newspaper.

But by and large, the most dramatic impact of the digital explosion on journalism has been to widen the world of sources, not to transform the rituals of newsgathering. Quality journalism has remained, defiantly, a professional practice. The value of meticulous attention to accuracy, of careful confirmation, of sifting competing claims about truth and significance, of respect for privacy, of concern to avoid harm where possible—these cornerstone principles of traditional journalism (however often they’re violated) have not been replaced in the millennial rush to a digital populism.

So it comes as welcome news that some of the most successful web-based news operations are surrendering the algorithms that they’ve been using to make editorial decisions, and will now, as Wired magazine reports, “use real, live humans to curate the news, entertainment, and content they’ll deliver via their platforms.” Continue reading

Anger triggered by Rachel Dolezal affair remains puzzling, but illuminating

At first blush it’s hard to imagine why the racial identity of an obscure civil rights advocate from eastern Washington State deserved the nationwide media high beam: Hours of TV air time, acres of newsprint, a tidal flood of tweets and electronic chatter. All of it devoted to exposing, denouncing, defending, or puzzling over the case of Rachel Dolezal, the daughter of white parents, who over the course of her 37 years and her rise as an activist in the Pacific Northwest had reinvented herself—sometimes by deliberate implication and sometimes explicitly—as a black woman.

The story came to light after Dolezal was attacked in the Spokane, Wash., media for burnishing her application for an unpaid municipal advocacy job by falsely claiming African American parentage. Her biological parents, a Montana couple of European ancestry, said Dolezal had begun to “disguise herself” in 2006 or 2007 after they adopted four black children.

Dolezal herself was brought up blonde and blue-eyed, then was educated at historically black Howard University in Washington, D.C., before building a career in rights advocacy and college teaching in western Idaho and Washington State. She married an African American man and gave birth to two children.

Those children, under the idiosyncratic rules of racial attribution in this country, are black, of course. And there seemed to be nothing questionable about her commitment to racial justice.

Still, Dolezal, media reports suggest, left a somewhat woozy trail of ambiguity and dissembling that made her hard to sympathize with. The Spokane Spokesman-Review reported that authorities in nearby Coeur d’Alene, Idaho, where she had worked as an advocate, were skeptical about whether racist provocations and threats supposedly directed at her were genuine.

Once doubts were raised in Spokane about her claim to be black, she dodged direct questions on camera about the race of her parents. At the college where she taught Africana studies, colleagues assumed she was black, a conclusion she appears to have encouraged.

As the story continued to unfurl, Dolezal became a media luminary, giving numerous network and cable TV interviews, explaining her life, justifying her choices.

The larger question, to me, was less why she did what she did than why everybody seemed to care so much, and why her case provoked so much anger—anger from whites, some of whom no doubt viewed her attempts to take improper advantage of racial preferences as proof that their own ancestral privileges had been upended; anger from blacks, who resented her claiming a make-good share she had done nothing to deserve from a legacy of privation from which she had suffered not at all.

One of the kinder and, in my view, more insightful commentaries on the affair came from a Stanford historian, Allyson Hobbs, author of a history of racial “passing,” the practice of posing as a member of a different race. Hobbs Continue reading

Upending the NSA’s illegal data sweep is a major triumph for the press, but claiming credit would mean crediting Edward Snowden

Week of May 24, 2015

The National Security Agency’s bulk capture of the phone records of millions of U.S. citizens was sweeping and invasive. Now we know it was also illegal, since a federal appeals court has said so. Meanwhile the man who exposed the program, the former NSA contractor Edward Snowden, is still a fugitive in Russia and faces charges of espionage if he returns to this country.

Snowden’s June 2013 leaks won the Pulitzer Prize for The Guardian of London and The Washington Post. Yet the U.S. news media have been slow to recognize—let alone applaud—his contribution, the single most effective and beneficial act of journalistic defiance in recent memory.

The New York Times report on the court’s May 7 ruling declaring the data collection illegal was especially circumspect, with Snowden’s name not even mentioned until paragraph 29. (The Guardian, on the other hand, put him in the headline.)

There would’ve been no legal challenge if not for Snowden. Nor would the House of Representatives ever have voted decisively to halt the program, since Congress wouldn’t even have known about it.

The fate of the program is still the subject of mud-wrestling on Capitol Hill, but there’s no question that Snowden’s impact has been historic. It’s hard to recall a comparable instance where media disclosures produced such a dramatic shift in public attitudes and such a repudiation of official policy.

Press histories extol the Pentagon Papers case, where The New York Times resisted White House pressure and published a secret history of Vietnam-era bungling and deceit. But the war went on for three more years, unimpeded by Continue reading

Overlooked Miami Beach police shooting exposes legal system’s limp response to needless killing

Raymond Hérisse never made it to his hangover. It was waiting for him, the sour remnant of all the Hennessy he’d been imbibing in the early morning hours of May 30, 2011. But he was still drunk that Memorial Day when he was shot to death at the wheel of his borrowed Hyundai Sonata by 12 police officers in Miami Beach, Fla. He was hit 16 times; later, investigators recovered 124 spent rounds.

Hérisse was 22. It was Urban Beach Weekend, an annual bacchanal that draws to the seaside town a huge outpouring of largely black, largely young revelers from throughout South Florida. Hérisse was visiting from his home 60-some miles up the coast in Boynton Beach. In the minutes before he died he had been driving his friend’s car foolishly and dangerously along the city’s main thoroughfare, Collins Avenue, two blocks from the ocean, careening off several other cars, swerving onto the wrong side of the street and even onto the sidewalk, sending pedestrians and bicycle-riding police scurrying for safety—and drawing gunfire from eight cops as his Sonata rolled along.

Finally the car stopped at 13th Street. It’s not known whether Hérisse had been hit by any of the 44 bullets already fired at the car. The Sonata remained motionless for over a minute; through the tinted windows witnesses thought Hérisse was moving. Maybe he was reaching for a gun. When he didn’t respond to police screaming at him to get out, they started shooting again—eight of them, popping away with their Glocks and Sig Sauers, 81 more rounds for investigators to pick up and count.

By the time it was over a total of 12 officers from three different forces enlisted to handle the weekend crowds had fired on Raymond Hérisse. They hit four bystanders, wounding them severely, and they killed Hérisse.

The affair got a bit of notoriety at the time after police seized and destroyed the smartphone of one of several onlookers who had the presence of mind to record the fusillade. He managed to get his video to CNN only by hiding the memory card in his mouth. So for a moment the incident became a press freedom issue.

As it happens, I was asleep about 10 blocks away at the time of the shooting, and as the facts trickled in I was appalled by what seemed to be the grotesque and murderous disproportionality of the police response. What was going on? Why did it take three days for police to announce they’d found a gun in the car, which it turned out was under a seat, wrapped in a shirt, untouched? And in a city that had been convulsed repeatedly in the ’80s and ’90s by racial violence triggered by lethal predation by police, would there be any reckoning?

It took local authorities nearly four years to come up with a determination as to whether anybody, apart from Raymond Hérisse, did anything wrong that pre-dawn morning. Finally on March 16, the Miami-Dade State Attorney’s Office released its findings, which cleared police of any criminal wrongdoing (the civil Continue reading

It’s OK now for someone to shoot photos of you through your apartment window? NY court leaves a dark picture of privacy imperiled

When the first court ruling came down in 2013, The New York Post summed it up with signature restraint: “Judge backs the right of creepy Tribeca artist to photograph people through their windows.” Two weeks ago an appeals court upheld that decision, and The Hollywood Reporter announced: “Artist who spied on neighbors with telephoto lens beats privacy claims.”

Welcome to the affair of Arne Svenson, a fine arts photographer of considerable skill. He spent a year using a 500mm telephoto lens he got from a birdwatching friend to shoot thousands of pictures of his unwitting neighbors through the windows of their plush apartments across the street from his in Lower Manhattan. When some of Svenson’s photos were offered for sale at a high-end New York gallery in 2013, he was sued for, among other things, invasion of privacy.

Although they’re described in some quarters as “voyeuristic,” there’s nothing salacious about his pictures. They’re tasteful, and they’re cool. They’re photos of legs and of dogs, of reclining figures, of shadows and profiles, of children shot from behind, of blurred couples interacting and, perhaps, arguing. By and large, you can’t tell who the people are, and that anonymity, it’s suggested, universalizes the images. They’re framed by the crisp lines of their living-room windows, and the results look very much like art.

“I find the unrehearsed, unconscious aspects of life the most beautiful to photograph, as they are most open to interpretation, to a narrative,” Svenson said in 2013. “A dramatic moment has the single power of action, but tiny, linked moments are how we mark time on this earth. I am much more interested in recording the breath between words than I am the actual words themselves.”

So? How much of that matters? Here those people are, inhabiting “the breath between words,” tending to household pleasures, fussing or napping, scolding their kids, eating their prunes or flirting with their mates or scratching their butts, and some faceless guy across the way using some mid-tech peephole captures these images—and decides how much of what’s captured should be shared with (and peddled to) a vast audience of strangers. Don’t expectations of privacy matter? Does artistic flutter trump personal sanctuary?

The New York courts, applying a century-old law intended to keep an individual’s likeness from being used in advertisements without consent, decided that state privacy protections had to yield to Svenson’s expressive freedom under the First Amendment. That’s because a huge “newsworthy and public concern exemption” to privacy safeguards had been carved out, and although there’s nothing remotely newsworthy about Svenson’s images, it applies to literature, movies and works of art as well.

Now, newspeople are used to basing their right to ignore primordial privacy claims on such matters as where the photographer is shooting the picture from (a sidewalk, e.g.) or where the figures being shot are standing (if they’re in a Continue reading

British daily’s campaign to prevent climate change raises bold questions about role of press advocacy

The Guardian, the London daily that has risen from a respected but fringe player on the British political scene to a major transatlantic voice of liberal thought, did something notable and gutsy a few weeks ago, and just about nobody on this side of the ocean paid any attention.

In an announcement flanked by stories and commentaries, Alan Rusbridger, the editor credited with leading the Guardian’s rise, declared on March 6 that his organization was launching a campaign intended to head off the climate catastrophe that the scientific consensus has concluded is unavoidable without deep changes in public policy and industrial practice.

The Guardian’s objective is to slow the production of fossil fuels by pressing to halt the exploitation of new energy reserves. The world has much more coal, oil and gas in the earth than it can safely burn. “Leave it in the ground,” is the rallying cry.

The immediate goal is to encourage investors to dump their holdings in fossil fuel-based companies, on grounds that pulling money out would slow the use of suicidal fuels and goad the energy giants into investing in environmentally palatable power sources. As a first step, the Guardian would lead a drive to persuade two immensely rich—and by reputation, socially responsible—foundations, UK’s Wellcome and the U.S.-based Gates, to unload the $1.5-plus billion worth of energy shares they own.

The Guardian was allying with an organization called 350.org, associated with the influential U.S. environmentalist Bill McKibben. It takes its name from the prediction that unless the concentration of carbon dioxide in the atmosphere is cut from its current 400 parts per million (ppm) to 350, we’re all in very big trouble.

In a personal note, Rusbridger explained that as he looked back on his two decades as The Guardian’s editorial chief—he retires this summer—his greatest regret was “that we had not done justice to this huge, overshadowing, overwhelming issue of how climate change will probably, within the lifetime of our children, cause untold havoc and stress to our species.”

Now I’m a big fan of the Guardian, and I’ve watched its ascent under Rusbridger with admiration. It stood shoulder-to-shoulder with the world’s top news organizations in the WikiLeaks stories and in the publication of the Edward Snowden leaks, for which it shared a Pulitzer, and I applauded.

And I was moved by Rusbridger’s epiphany that, for all the fights he’s fought and won, none will matter 20 or 30 years from now if climate science is right, and we have lost it all.

So I don’t wonder why Rusbridger took this move, but I do wonder why it has gone unnoticed. I’ve poked around online and can find almost no mention of this campaign—to which The Guardian devoted lavish space, solid reporting, and gorgeous graphics—in the U.S. media. It’s as if, to borrow an English image, the quirky matriarch had now become the batty old aunt in the attic. Continue reading