The signal and the noise (photo by stribs)
Last week I attended South by Southwest (SXSW) for the first time. As a “South By” virgin, I was determined to make the most of my experience. I went expecting to have to sift through many talks to find some gems that matched my particular interests. Instead, I found myself confronting a tsunami of interesting panels, presentations, and interviews. I was often faced with choice paralysis. Fortunately, I managed to attend many excellent presentations, though it took some serious, daily research and planning to do so. And I managed to avoid attending too many stinkers (and there were some). Here’s just a handful of the subjects and themes from presentations that really resonated with me.
Transparency & Versioning
New York Times public editor Margaret Sullivan and Eric Price, an MIT grad student, hosted a discussion on “Version-Controlling the News.” In the real-time arena of digital news-gathering, it’s easy to see how a story posted moments ago could quickly become outdated and need to be revised as facts stream in. However, most news organizations (and content publishers in general) aren’t particularly effective at communicating the hows and whys behind those sometimes myriad changes. Sullivan and Price argued for more transparency in versioning such content and for granting users access to the previous versions of stories, so they can evaluate them for themselves.
Additional considerations: What happens when a story changes so much that it's no longer the same story? Shouldn't both be displayed? Sometimes a story starts out with one set of authors, but the assigned authors change. Even the direction, tone, and theme of a story can change significantly after it's been published. With that in mind, Price developed NewsDiffs, which tracks and archives changes in articles after publication, currently within The Times, CNN.com, Politico and bbc.co.uk. Increasingly, then, we need to provide readers with access to previous versions of content, highlighting the rationale at least for the more significant changes. (Spelling errors, missing commas, grammatical errors, etc., are not as vital, though these can still be catalogued as a second, less significant but nonetheless visible category of changes.) Apply this concept to content everywhere. As readers better understand versioning, it's not hard to see why they'd demand it, and how content versioning (and usable experiences coupled to it) will become key to maintaining brand transparency.
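To make the idea concrete: the core of what a tool like NewsDiffs surfaces is a diff between stored revisions. Here's a minimal sketch (not NewsDiffs' actual implementation) using Python's standard difflib, assuming you've saved two versions of an article as plain text:

```python
import difflib

def diff_versions(old, new):
    """Return a unified diff between two revisions of an article's text."""
    return list(difflib.unified_diff(
        old.splitlines(), new.splitlines(),
        fromfile="v1", tofile="v2", lineterm=""))

# Hypothetical revisions of a breaking story
old = "Quake strikes city.\nDamage unknown."
new = "Quake strikes city.\nOfficials confirm minor damage."

for line in diff_versions(old, new):
    print(line)
```

Lines prefixed with `-` were removed and lines prefixed with `+` were added, which is exactly the kind of change log a reader-facing version history could be built on.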
Crafting Stories from Data
“Big data” was a big theme at SXSW this year, and one very engaging panel, “Journalism by #s,” examined the role of data in journalism. Participants described how journalists can discover new stories without leaving their desks. The Wall Street Journal’s Kevin Helliker, for example, explained how his brother broke a groundbreaking Nature story about the core temperature of plants, simply by examining existing data, available to anyone on the Internet. James Grimaldi (also of the WSJ) drove home the importance of knowing the tools at your disposal: Every journalist should be familiar with data-oriented applications like Excel and Access. (You can certainly count me among those writers who have come to appreciate the redeeming organizational features of an ostensibly boring application like Excel.)
On the other hand, Grimaldi said, we also bear a responsibility for the context within which we present data and the accompanying details. When the New York paper The Journal News decided to publish a list of gun owners with their addresses, it did no one any favors. Grimaldi called the effort a “disaster” and “a data dump with little analysis.” Data exposed like this no longer lives in a moral vacuum.
Sure, this session was conducted by journalists, but it’s not difficult for content strategists to imagine how any company we’re working with could have a wealth of data (about the company, about consumers, about users) that is rich with stories, just waiting to be discovered. It’s up to us to dig for them, or at least to train our clients how to dig for them.
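The digging often starts with something as simple as spotting the number that doesn't fit. As an illustrative sketch (the data and names here are invented), a few lines of standard-library Python can flag outliers in a dataset, which is exactly the kind of anomaly a story lead grows from:

```python
from statistics import mean, stdev

def find_outliers(records, key, threshold=1.5):
    """Flag records whose value sits more than `threshold` standard
    deviations from the mean -- candidate story leads worth a closer look."""
    values = [r[key] for r in records]
    mu, sigma = mean(values), stdev(values)
    return [r for r in records if abs(r[key] - mu) > threshold * sigma]

# Hypothetical monthly complaint counts by city district
data = [{"district": d, "complaints": c} for d, c in
        [("A", 12), ("B", 15), ("C", 11), ("D", 14), ("E", 95)]]

print(find_outliers(data, "complaints"))  # district E stands out
```

Why does district E have six times the complaints of its neighbors? That question is the story; the spreadsheet just points at it.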
Saving Languages With Content Strategy
Participants on the “Indigenous Tweets, Visible Voices & Technology” panel were all, in one way or another, engaged in enterprises that give an online voice to endangered languages. They discussed strategies for saving or at least preserving languages via content and social platforms — blogs, podcasts, video, etc. — basically, whatever creative online means made sense for a particular community. Kevin Scannell, founder of the site IndigenousTweets.com, tracks the use of endangered languages on Twitter, including languages limited to as few as a single voice. He also explained how neither Google nor Facebook is allowing new languages and translations of their platforms to be added, so he’s leading crowd-sourced, unofficial translations of those platforms into endangered languages, which can be implemented via Greasemonkey scripts. Similarly, Kara Andrade has worked to develop localized content management systems, which allow Guatemalans to create and disseminate content in their own language. This panel served as a salient reminder of the real-world good we can do as content strategists, when we apply our skills creatively to such issues around us.
Treating Twitter as a Source
Members of the “Global News After the Twitter Revolutions” panel shared how that social platform can serve as a source of valuable information, stressing, of course, the need to ensure individual sources are reliable. NPR’s Andy Carvin explained how he actually searches on expletives in order to find sources close to a breaking event: as in “What the fuck was that?” getting tweeted after an earthquake. He also described his method of building Twitter lists in advance of trackable events like hurricanes, so he’ll have a stream of reliable information from first responders, based on geolocation where possible, as the event unfolds. And he warned of people parroting news terms they don’t fully understand, like “breaking news,” “confirmed” or “reports,” which can lead to the spread of misinformation and even hoaxes.
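Carvin's expletive trick is essentially a two-part filter: does the text read like a raw eyewitness reaction, and was it posted near the event? A toy sketch of that logic (using invented tweet records, not the real Twitter API) might look like this:

```python
import math

# Assumed, simplified eyewitness vocabulary for the sketch
EXPLETIVES = {"wtf", "what the fuck", "holy shit"}

def km_between(a, b):
    """Approximate great-circle distance in km between (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(h))

def likely_eyewitnesses(tweets, epicenter, radius_km=50):
    """Keep tweets containing a raw-reaction phrase posted near the event."""
    return [t for t in tweets
            if any(e in t["text"].lower() for e in EXPLETIVES)
            and t.get("geo") and km_between(t["geo"], epicenter) <= radius_km]

tweets = [
    {"text": "What the fuck was that?!", "geo": (37.78, -122.42)},
    {"text": "Nice sunset tonight", "geo": (37.80, -122.40)},
    {"text": "WTF was that shaking", "geo": (40.71, -74.00)},  # far from the event
]
print(likely_eyewitnesses(tweets, epicenter=(37.77, -122.41)))
```

In practice this surfaces candidates, not confirmed sources; Carvin's point about vetting each account still applies to everything the filter returns.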
CNN’s Meredith Artley discussed strategies for tweeting at high-traffic times of day. For example, CNNbrk developed the “lunch break tweet”: an interesting story outside their typical breaking news, timed so Twitter users can enjoy it on their lunch break, giving CNN a little boost of traffic.
There’s a lot more I could share. Data genius Stephen Wolfram proved fascinating in his explanation of how “computation is going to become central to every field.” Architecture for Humanity founder Cameron Sinclair showed how “resiliency [in damaged communities] is not by chance,” but by design — certainly a principle we can apply to our own work. And I saw an excellent panel on “Copyright & Disruptive Technologies,” wherein Cheezburger Network CEO Ben Huh proclaimed, “Don’t sue me for what other people do on my service,” and litigator Andrew Bridges demonstrated the incredible disparity between punishments for copyright violations and those for other crimes.
If you’re interested in learning more, here are my notes from each of these sessions:
- Connected for Reconstruction, Architecture for Humanity
- Copyright & Disruptive Technologies
- Global News After the Twitter Revolutions
- Indigenous Tweets, Visible Voices & Technology
- Journalism by #s
- Stephen Wolfram: The Computational Future
- Version Controlling the News