[This review was originally posted at the Nieman Journalism Lab on July 11, 2014.]
This week's essential reads: The key pieces this week are NYU's Jay Rosen (in an article and an interview) on Facebook's legitimacy and control, journalism professor Alberto Cairo on solving data journalism's current crisis, and Cornell's Tarleton Gillespie on algorithms, journalism, news judgment, and personalization.
Facebook, online research, and control: A week after Facebook's experimental study on News Feed content and user emotions initially prompted an uproar, observers continued to talk about its implications for research, social media, and control. A privacy group filed a complaint with the U.S. Federal Trade Commission late last week, and the journal that published the study issued a formal "expression of concern."
Others pushed back against the outrage at Facebook over the study: Microsoft researcher Duncan Watts argued that our emotions are being manipulated all the time by marketers and politicians, and said we should insist that companies like Facebook "perform and publish research on the effects of the decisions they're already making on our behalf." Likewise, Forbes' Jeff Bercovici said Facebook's study was much more innocuous than it's being made out to be.
Former Facebook data scientist Andrew Ledvina questioned why there was so much outrage about this particular study when Facebook conducts constant experiments on user behavior. The only change Ledvina saw emerging from this episode was not that Facebook would stop doing these types of experiments, but that it would stop making them public. Similarly, Microsoft's Andrés Monroy-Hernández said the incendiary tone of the discussion surrounding this study makes him more reluctant to share his own research publicly.
Blogger and developer Dave Winer argued this incident damages Facebook by destroying its users' illusion of an organically generated News Feed: "Facebook just broke the fourth wall. Of a sudden we see the man behind the curtain. And it's ugly and arrogant and condescending and all around not a nice feeling." On the other hand, Gigaom's Tom Krazit said most users already knew that Facebook manipulates their News Feed — but because we don't know how it's being manipulated, we can no longer consider it a useful source of news.
NYU's Jay Rosen argued late last week that Facebook has a very "thin" form of legitimacy for its experiments because of its lack of attunement to ethical concerns, and in an interview this week, he warned that Facebook needs to be careful not to derive false confidence from its dominance of the social media market. "'Where else are they going to go?' is a long way from trust and loyalty. It is less a durable business model than a statement of power," he wrote.
From a bigger-picture viewpoint, Stanford professor Michael Bernstein said researchers need to rethink how they apply the principle of informed consent to online environments, and Gigaom founder Om Malik looked at the responsibility that needs to come with the unprecedented level of data and automation involved in modern life.
Improving data journalism with education: A couple of conversations this week converged on two important issues facing the news industry: data journalism and journalism education. Miami journalism professor Alberto Cairo diagnosed the underwhelming output at some of the prominent new data-oriented journalism sites, concluding that "Even if data journalism is by no means a new phenomenon, it has entered the mainstream quite recently, breezed over the peak of inflated expectations, and precipitously sank into a valley of gloom."
Cairo made a set of prescriptions for data journalism, including devoting more time, resources, and careful critical thinking to it. At Gigaom, Derrick Harris added that data journalism could use more influence from data science, especially in finding and developing new datasets. And sociology professor Zeynep Tufekci used this week's World Cup shocker to look at flaws in statistical prediction models.
In a weeklong series, PBS MediaShift took a deeper look at what journalism schools are doing to meet the growing demand for skilled, critically thinking data journalists. The series included an interview on the state of data journalism with Alex Howard of Columbia's Tow Center, tips from The Upshot's Derek Willis on "interviewing" data to understand it and find its stories, a look at how journalism schools are teaching data journalism, and practical suggestions for incorporating data into journalism education.
Elsewhere in journalism education, the American Journalism Review's Michael King examined enrollment declines at American journalism schools, noting the tricky question of whether these declines are a leading or lagging indicator — something primarily indicative, in other words, of journalism's last several years or its next several. And David Ryfe, director of the University of Iowa's journalism school, looked at the difficulty of fitting the dozens of skills desired by employers into a relatively small number of journalism courses.
Google raises censorship concerns: One story on Google and censorship came to a head late last week: In response to a May ruling by a European court, Google began removing pages from certain European search results at the request of individuals claiming the information was irrelevant or false. By last week, those removals began to include news articles, most notably a BBC column on Merrill Lynch, but also several others. In a pair of helpful posts, The Guardian's Charles Arthur explained the removals in general, and Marketing Land's Danny Sullivan explained the removals of news articles.
Google reversed its decision to remove several links to Guardian stories as British news organizations criticized its implementation of the law. Gigaom's Mathew Ingram and The Next Web's Paul Sawers both posited that Google, which opposed the ruling, is enforcing it in a deliberately draconian fashion so as to create a controversy about the censorship it entails. But Danny Sullivan countered that it's highly unlikely Google is doing this to sabotage the ruling, since the links it's removing extend far beyond easily outraged media outlets.
A few other bits and pieces from a slow week:
— The U.S. National Security Agency documents leaked by Edward Snowden yielded another finding on surveillance this week: The Washington Post reported that ordinary American Internet users whose communications were intercepted by the NSA far outnumber the legally targeted foreigners. The Intercept also reported on several prominent Muslim American academics and activists who were monitored by the NSA. The Intercept's Glenn Greenwald talked to Wired about why the story is important, while PandoDaily's Paul Carr questioned Greenwald's hesitation in publishing it.
— The Wall Street Journal celebrated its 125th anniversary this week with an archive looking at how it has covered big news events in the past, as well as a special report. The Lab's Joseph Lichterman took a closer look at the Journal's anniversary offerings, and Capital New York talked to Journal managing editor Gerard Baker. The Atlantic's Adrienne LaFrance, meanwhile, had a fascinating look at the Journal's design through the years.
— The Pew Research Center released a study on the declining number of reporters covering U.S. state legislatures and what's being done to fill the gaps. The Lab's Joseph Lichterman wrote a good summary.
— This week's handiest piece: Sarah Marshall's summary of the tips Joanna Geary, head of news at Twitter UK, gave to British journalists about using TweetDeck as a reporting tool.
— Finally, the Lab's Caroline O'Donovan talked to Cornell's Tarleton Gillespie about a wide range of issues surrounding algorithms, including personalized news, news judgment, and clickbait.