Neil Heyside’s Oct. 18 story on PBS MediaShift about how newspapers should analyze their content by source type – staff-produced, wire/syndicated or free from citizen journalists – got me thinking about other ways content should be analyzed to craft audience-driven hyper-local and paid-content strategies.
Most news sites have navigation that mimics traditional media products – News, Sports, Business, Opinion, and Entertainment. However, those types of broad titles don’t work well with digital content because people consume news and information in bits and pieces rather than in nicely packaged 56-page print products and 30-minute TV programs.
Each chunk, each piece of content – story, photo, video, audio, whatever – should be tagged or classified with a geographic area and a niche topic so a news org can determine how much content it has for each of its highest priority audience segments – and how much traffic each type of content is getting.
By geographic area I mean hyper-local. East Cyberton, not Cyberton. Maybe even more hyper – East Cyberton north of Main Street, for example, or wherever there’s a distinct audience segment that has different characteristics and thus different news needs and news consumption patterns.
Similarly, news orgs need hyper-topic codes, especially for hyper-local topics. The Cyberton community orchestra – not Classical Music, Music, or Entertainment. If a news org is looking at web traffic data for “Music” it should know whether that traffic is for rock music or classical, and whether the content was about a local, regional, national or international group.
Oh, and there’s one more aspect to this hyper-coding. Content should be coded across the site. Ruthlessly. For example, to really understand whether it needs to add or cut coverage in East Cyberton, a news org needs to add up those East Cyberton stories in Local News, plus those East Cyberton Main St. retail development stories in Business, and those editorials and op-eds in Opinion about how ineffective the East Cyberton neighborhood council is, and….
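That cross-section roll-up is just a group-by on the geo code, regardless of which section a piece ran in. A sketch, with made-up articles and pageview counts standing in for real CMS and analytics data:

```python
# Sketch of a cross-section roll-up by hyper-local geo code.
# Articles and pageview numbers are invented for illustration.
from collections import defaultdict

articles = [
    {"section": "Local News", "geo": "east-cyberton", "topic": "neighborhood-council", "pageviews": 310},
    {"section": "Business",   "geo": "east-cyberton", "topic": "retail-development",   "pageviews": 540},
    {"section": "Opinion",    "geo": "east-cyberton", "topic": "neighborhood-council", "pageviews": 120},
    {"section": "Sports",     "geo": "west-cyberton", "topic": "high-school-football", "pageviews": 880},
]

by_geo = defaultdict(lambda: {"items": 0, "pageviews": 0})
for a in articles:
    bucket = by_geo[a["geo"]]       # group by geo, ignoring section
    bucket["items"] += 1
    bucket["pageviews"] += a["pageviews"]

# by_geo["east-cyberton"] now combines Local News, Business and Opinion items.
```

Summing across sections is what keeps the Main St. retail story in Business and the neighborhood-council op-ed in Opinion from being invisible when you ask "how much East Cyberton coverage do we have?"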
Sometimes these hyper-codes are in content management systems but not in web analytics systems like Omniture or Google Analytics. Knowing what you’ve got is great – but knowing how much traffic each hyper-coded chunk of content is getting is equally if not more important.
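When the analytics system doesn't carry the hyper-codes, one workable stopgap is joining a CMS tag export against an analytics pageview export by URL. A sketch, with hypothetical URLs and counts; the leftover "untagged" bucket is itself useful, because it shows where the coding is incomplete:

```python
# Sketch: join a CMS hyper-code export with an analytics pageview export
# by URL, so traffic can be reported per hyper-code.
# URLs, tags, and pageview counts are all hypothetical.
cms_tags = {
    "/news/ec-council-vote": {"geo": "east-cyberton", "topic": "neighborhood-council"},
    "/biz/main-st-retail":   {"geo": "east-cyberton", "topic": "retail-development"},
}
analytics_pageviews = {
    "/news/ec-council-vote": 310,
    "/biz/main-st-retail": 540,
    "/untagged/old-story": 95,
}

traffic_by_topic = {}
untagged = 0
for url, views in analytics_pageviews.items():
    tags = cms_tags.get(url)
    if tags is None:
        untagged += views  # surfaces gaps in the hyper-coding itself
        continue
    topic = tags["topic"]
    traffic_by_topic[topic] = traffic_by_topic.get(topic, 0) + views
```

A cleaner long-term fix is pushing the hyper-codes into the analytics tool itself (for example, as custom dimensions), so reports can be segmented without the offline join.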
Having the hyper-codes, and thus the data, only makes a difference if a news org is willing to take a hard, nontraditional look at itself. The data may suggest it needs to radically change what it covers and the way it allocates its news resources so it can produce “relevant, non-commodity local news that differentiates” it, as Neil Heyside’s PBS MediaShift story points out.
Heyside’s study of four U.K. newspaper chains has some interesting ideas about how a news org can cut costs but still maintain quality by changing the ways it uses staff-produced, wire, and free citizen-journalist content. The news orgs in the study “had already undergone extensive staff reductions. In the conventional sense, all the costs had been wrung out. But newspapers have to change the way they think in order to survive. If you’ve wrung out all the costs you can from the existing content creation model, then it’s time to change the model itself.”
If a news org doesn’t know, in incredibly painful detail, what type of content it has and how much traffic each type is getting, then it’s not arming itself with everything it can to mitigate the risks of making radical changes such as investing what it takes to succeed in hyper-local news and in setting up paywalls. Both are pretty scary, and it’s going to take a lot of bold experimentation – and data – to get it right.