[This review was originally posted at the Nieman Journalism Lab on Feb. 28, 2014.]
The Comcast/Netflix deal explained: Two weeks after Comcast announced it would buy Time Warner Cable and a month after a federal court overruled the U.S.’ net neutrality regulations, Comcast signed an agreement with Netflix in which Netflix will pay Comcast for a direct traffic-sharing connection to its network in order to improve the quality of its streaming video. The deal, called “paid peering” or “transit,” is likely to be the first of several for Netflix, as Verizon and AT&T both quickly said they’re negotiating similar arrangements with Netflix as well.
The Lab’s Ken Doctor looked at the business end of the deal: Netflix is clearing hurdles to its video streaming quality as it prepares to introduce additional tiered pricing, and Comcast is removing Netflix as a possible objector to regulatory approval of its purchase of Time Warner Cable. Variety’s Todd Spangler said Netflix has now fixed some of its key costs and is solving its biggest streaming quality problems, though Peter Cohan of Forbes said the deal doesn’t tell us much about how Comcast will treat other video-streaming services, especially after its Time Warner merger.
It’s important to note that this deal would not have been covered by net neutrality regulations. As CNET’s Marguerite Reardon and Consumerist’s Chris Moran explained well, peering isn’t about stopping intentional slowdowns of traffic quality or about giving preferential treatment to some services, both things that net neutrality would be built to stop. Instead, it’s about Netflix being allowed to connect its own content delivery network — most companies pay for third-party networks to deliver their content around the web, but Netflix has built its own to account for its incredibly high volumes of data — directly to Internet service providers like Comcast.
That doesn’t mean it doesn’t raise concerns about the future of the Internet, however. The Washington Post’s Timothy B. Lee argued that deals like this transform the Internet from its classic structure, in which all sites’ content flows together to ISPs through a few big “pipes” — the structure on which the net neutrality ideal was built — to one in which each major content provider uses its own pipe, which can be more easily manipulated individually.
Gizmodo’s Eric Limer said this deal relies on both sides’ size and encourages further consolidation: Comcast had enough leverage to sit back and wait for Netflix to pay up to fix its streaming quality problem, and Netflix was able to solve it by being big enough to build “its own private highway.” Now, he wrote, “established champs who can pay for a separate tube have the advantage of not having to fight with a bunch of other traffic. It’s about to get harder than ever for something like Netflix to come along again.” Free Press pointed to the deal as evidence of the need for stronger anti-consolidation regulatory forces in Washington, and The New York Times’ Vikas Bajaj made a similar point in calling for the FCC to revisit its net neutrality stance.
On the other hand, Wired’s Robert McMillan argued that smaller players may not need or want a direct connection to ISPs like the one Netflix has struck: They already pay third-party networks for that same connection, and they don’t stream nearly enough data for it to become the kind of serious problem Netflix faced. StreamingMedia’s Dan Rayburn said there’s nothing nefarious or threatening to net neutrality about this deal; Netflix is just shifting its costs for connecting to Comcast’s network from a third-party network directly to Comcast. “This is how the Internet works, and it’s not about providing better access for one content owner over another,” he wrote.
Elsewhere in telecommunication, The Wall Street Journal reported on telecom giants’ fight against net neutrality laws in Europe and the U.S., and at In These Times, Jay Cassano and Michael Brooks said net neutrality’s erosion will disproportionately impact the mobile Internet and therefore lower-income people who depend on it.
An FCC news study becomes a political football: The controversy surrounding a proposed U.S. Federal Communications Commission study continued to boil over late last week into this week, prompting the FCC to suspend and revamp the study. The flareup started earlier this month with an op-ed in The Wall Street Journal by FCC commissioner Ajit Pai that raised an alarm about the FCC’s wide-ranging proposed study on “critical information needs” of communities, which included plans to interview journalists about how they select stories and what their news organization’s philosophy is. Pai said those questions represented an inappropriate government intrusion into newsrooms and raised the specter of government policing of journalism.
The concerns were picked up widely across conservative media outlets, and the outcry led the FCC to axe the interviews of journalists, though it still plans to go ahead with the majority of the study involving surveys and interviews of citizens about the news they get. Even after that concession, a Republican congressman said he plans to hold hearings on the study and introduce legislation to block it entirely.
Others weighed in with their views as well: USA Today’s Rem Rieder said the study isn’t an Obama-driven plot to control the press, but a poorly-thought-out attempt to determine whether citizens are getting key information. “The last thing we need is journalism cops flooding into newsrooms to check up on how the sausage is being made,” he wrote. Likewise, The Atlantic’s Conor Friedersdorf said the study hardly portends the return of the Fairness Doctrine, but looks like a waste of public money regardless.
FCC commissioner Mignon Clyburn defended the study, saying it’s simply meant to determine if there are any barriers to market entry keeping communities from receiving important information. Likewise, Wisconsin professor Lewis Friedland, who led the literature review that preceded the study, told the Columbia Journalism Review there was no government monitoring, intimidation, or coercion ever intended with the now-dropped journalist questions — “it was simply to get their point of view of how they understood the information needs of their local communities.” Techdirt’s Karl Bode delved into the study proposal and concluded that “It’s a fairly routine and entirely voluntary field survey designed to gather data. Nothing more.” Bode chided the FCC for kowtowing to conservative pressure to gut the study.
Why Piers Morgan never clicked for CNN: The New York Times’ David Carr reported this week that Piers Morgan, the former editor of the defunct British tabloid News of the World, will have his prime-time CNN show canceled this spring. Morgan was given the key slot once occupied by Larry King, and the 81-year-old King told The Daily Beast’s Lloyd Grove he’d be willing to come back if CNN would have him.
Carr surmised that Morgan’s show never took off because his irrepressible Britishness never fit with an American audience and that his tirades against guns “have clanked hard against the CNN brand, which, for good or ill, is built on the middle way.” Time’s James Poniewozik said Morgan’s show was rife with problems, including his Britishness, his abrasive personality, and his longform style. Slate’s David Weigel argued that Morgan was ultimately a poor interviewer — either too deferential or too bullying — and The Washington Post’s Erik Wemple said that without a coherent overarching perspective, Morgan’s show was left to rely on the devalued commodity of the long-form interview: “In today’s America, there are so many outlets producing interviews, so many outlets for your message — that an hour-long interview program is almost programmed for obsolescence.”
Options to break up the NSA: There were a couple of new revelations on government surveillance this week: Glenn Greenwald at The Intercept published documents from the Edward Snowden leak that detail how a unit of the British spy agency GCHQ plants false information online to ruin the reputation of its targets and infiltrates online discourse to try to drive targeted groups apart. A German paper also reported that the U.S. National Security Agency has stepped up its spying on other German officials after it was told by President Obama not to spy on German Chancellor Angela Merkel.
The Wall Street Journal (paywalled) reported that the Obama administration is considering overhauling NSA surveillance in a variety of ways. Gizmodo has a good, quick summary of the options Obama is considering: letting the phone companies oversee the phone metadata collection, letting a different federal agency hold the data, letting a different third party hold the data, or abolishing the data collection program completely. At CNN, cybersecurity expert Bruce Schneier offered his own plan for breaking up the NSA that includes moving all surveillance of Americans to the FBI to bring it under U.S. law.
Reading roundup: A few other conversations and developments that bubbled up this week:
— Reddit is testing a liveblog-style update format for breaking news stories, something Gigaom’s Mathew Ingram said could be a real boon for the site and for social journalism. PandoDaily’s Nathaniel Mott said technical changes won’t necessarily change the site’s spotty track record for accuracy on news events, but Circa’s Anthony De Rosa said Reddit shouldn’t be dismissed as a potentially valuable link in the online news chain.
— Upworthy ran a correction this week for a faulty video it had run earlier, and it was distinct in that it consisted mostly of complaints from readers interspersed with apologetic GIFs from Upworthy staffers. Poynter’s Craig Silverman looked at the debate surrounding the correction and talked to Upworthy about why it made the correction that way. Upworthy’s Matt Savener also defended the site’s track record and editorial process.
— Politico’s Dylan Byers wrote a thorough piece on the failures of text-based news sites in producing compelling live video, and the Lab’s Joshua Benton looked at the consumers’ side of the problem as well.
— Finally, a few thought-provoking pieces from the week: The Atlantic’s Robinson Meyer and the Lab’s Joshua Benton on the boxy style that’s ubiquitous in newly redesigned news sites, Stack Exchange co-founder Jeff Atwood on the proliferation of dumb apps, and journalism professor Jeff Jarvis with some prescriptions for the relationship between philanthropy and news.
[This review was originally posted at the Nieman Journalism Lab on Jan. 31, 2014.]
What Project X might mean: Ezra Klein officially said goodbye to The Washington Post last weekend and announced his move to Vox Media, the company that owns The Verge, SB Nation, and Polygon. Meanwhile, The Post’s publisher, Katharine Weymouth, defended the paper’s decision to turn down Klein’s proposal, saying it wasn’t guaranteed to be profitable and would have been a distraction, and noting that owner Jeff Bezos wasn’t involved in the decision. The Post also announced it would be hiring for a new data-driven journalism site as part of a broader expansion that includes a redesign and several revamped sections.
Klein’s description of his new site was vague, but touched on the need to add more context, education, and explanation to news. Vox CEO Jim Bankoff talked to CNN’s Brian Stelter and Ad Age’s Tim Peterson about the business side of the venture with relatively few details, though he said the reports of a $10 million investment are “way high.” Klein gave a few more hints in a Q&A with BuzzFeed — it’s not going to be a bigger version of his Post site WonkBook.
Gigaom’s Mathew Ingram said it’s fine that there doesn’t seem to be much structure yet to Klein’s new project, and CUNY professor Jeff Jarvis voiced his excitement at the prospect of Klein specializing in the “explainer” form of news. The Post’s Matt McFarland was also intrigued by the idea, outlined in a job description for the new site, of building “the world’s first hybrid news site/encyclopedia,” wondering why those two forms couldn’t be rejoined. Mark Potts, a former Post staffer, noted that the Post kicked around an idea for that kind of hybrid back in the late ’90s.
In a perceptive column, The New York Times’ David Carr saw Klein’s move as an indicator that digital publishing has come into its own, rather than serving as an additional platform for traditional media. “In digital media, technology is not a wingman, it is The Man.” He followed that up with a look at a few varieties of the new breeds of digital media operations. Likewise, media analyst Ken Doctor explained what seems distinct about the form of digital journalism Klein is embarking on, and laid out the economic reasons it’s becoming easier to start a new site. NYU’s Jay Rosen argued that Klein is leaving the Post’s supply-side logic to start something based on the “keep me informed” logic of the demand side.
The New Yorker’s George Packer was skeptical of the distinctiveness and quality of this new brand of digital journalism as Carr explained it, but BuzzFeed’s Charlie Warzel said that many media critics like Packer are missing the fact that tech isn’t just a smokescreen or accessory for a venture like Klein’s, but “the difference between a successful new media venture and a flop.” The Columbia Journalism Review’s Dean Starkman pointed out that we don’t know if Vox’s ad-based business model can translate from niches like sports and tech to the drier topic of public policy. Jack Shafer of Reuters also raised some cautions about the venture, arguing that with low costs of entry and a fluid talent pool, it’s not very well protected from competition.
Inside and Facebook get into mobile news aggregation: Two new entries into the now-crowded field of mobile news aggregation services were announced this week: Jason Calacanis, the tech entrepreneur who founded Silicon Alley Reporter, Weblogs Inc., and Mahalo, launched a new app called Inside, and Facebook announced it’ll launch a social news reader called Paper next week.
At the Lab, Staci Kramer has all the details on Inside in a thorough interview with Calacanis. The app is built around updates of 300 characters — about as much as can fit on the typical smartphone screen — summarizing a single source (with a link) on a news story. The updates will be human-written by a full-time staff of 15 and a crew of freelancers, and Calacanis stressed the value of human judgment in finding high-quality sources of original reporting and summarizing them intelligently. The app has enough money from the sunsetting Mahalo to run without ads for the first two years, but when it does add them, it’ll most likely go the native-ad route.
Gigaom’s Mathew Ingram highlighted a few of these points, and talked to Inside editor Gabriel Snyder (formerly of Gawker and The Atlantic’s Wire) about the opportunity in mobile news that “feels akin to 2002 and the web, when everyone knew the web was going to be huge but it wasn’t clear how people would get the news.” The Verge’s Nathan Ingraham described the app’s mechanics and pointed out that Inside faces the aggregation conundrum of saving people time but also delivering clicks to their sources. Kramer also reviewed the app, calling it promising but inconsistent, particularly in its organization.
Poynter’s Sam Kirkland noticed the word “curate” in Inside’s App Store description and explored the aversion that Inside, Circa, and other news reading services have to calling what they do “aggregation.” Kirkland argued that it’s time to reclaim aggregation as a term: “Aggregation isn’t always bad, but automatically framing unoriginal reporting as curation helps these news middlemen avoid debate about whether we should be troubled by their methods.”
Facebook’s Paper is on the other end of the aggregation spectrum from Inside: It’s an automated and human-selected feed of news and content from your Facebook friends and the loads of public content that organizations post on Facebook. The Verge’s Dieter Bohn looked at the app’s interface, concluding that its more relaxed feel “stands in direct opposition to the high-volume, high-noise vertical feeds we’re used to on Twitter and Facebook.”
Josh Constine of TechCrunch argued that Paper’s combination of content selection through editors, automation, and your friends leads to a strong sense of serendipity, and Re/code’s Mike Isaac said the app taps into a broader network of discovery than the standard Facebook experience. Mashable’s Lance Ulanoff noted that Facebook still isn’t creating its own content here. Gigaom’s Mathew Ingram, meanwhile, noted that the market for social reading apps is becoming packed, and Facebook runs the risk of Paper feeling “like a tabloid stapled together from items they’ve already seen in their news feed.”
Reading roundup: A few other stories that grabbed some attention this week:
— This week’s U.S. National Security Agency surveillance happenings: The Obama administration reached an agreement with tech companies allowing them to disclose some vague information about the number of user data requests they get from the government; it was an improvement over the current information blackout, but many privacy advocates still aren’t too pleased. The NSA was also reported to be using “leaky” smartphone apps, like Google Maps and Angry Birds, to collect user information, and U.S. Director of National Intelligence James Clapper obliquely referred to journalists as leaker Edward Snowden’s “accomplices.”
— CNN announced a partnership with Twitter and the social analytics company Dataminr that will allow it to spot breaking news stories on Twitter more quickly and efficiently. Gigaom’s Mathew Ingram explained why it makes sense on Twitter’s end, and Alex Weprin of Capital New York explained what’s in it for CNN.
— A growing backlash against the preoccupation with long-form journalism crystallized a bit last weekend in a New York Times column by Jonathan Mahler using the failings of Grantland’s Dr. V story to critique the fetishization of long-form. Instapaper founder Marco Arment explained why he eschewed the long-form push to focus on substance rather than length per se. BuzzFeed’s Ben Smith broke down the backlash and explored some of the differences between long-form done well and done poorly.
— The other shoe dropped at Patch, where new owner Hale Global had AOL lay off hundreds of staffers — possibly two-thirds of the editorial staff. Meanwhile, as the Lab’s Ken Doctor explained, GoLocal24 is doing its own national ramp-up of online local news.
— Finally, a few resources and pieces to think on: Poynter’s Craig Silverman released a handbook for verifying digital content, particularly in breaking-news situations; NYU’s Jay Rosen gave some thoughts on how a networked beat structure might be built; and at Journalism.co.uk, Alastair Reid wondered if journalists are properly equipped to handle the massive amounts of data being released by organizations and governments around the world.