Google’s new mobile, social media and cloud storage products have made it more ubiquitous than ever. That’s why I was surprised to hear Brian Schmidt, Google’s Americas sales director, say that Google still considers itself a search company.
This seemed a little disingenuous at first, just something Schmidt would say to soothe me and the other Online News Association conference field trippers at Google’s Boston office last week.
But then Schmidt explained that everything Google does is “dependent on users opting in and finding value in the experience.” With so much information available on so many different devices (see my New York Times Room for Debate forum commentary on paying for device-based convenience), Google’s success is based on people making a conscious choice to use Google products to find and link them to what they want.
“Search” means more than people typing keywords in a little box. “Search” now means “I’m coming to you – and not someone else – to make my life better by finding what I need and connecting me to it.”
Thus, news organizations should be search companies, too. But they’re not going to get there by counting page views and unique visitors. For some news orgs it seems to be a point of pride that the bulk of the visits to their sites start with their all-things-to-all-people home pages. How about defining success with metrics that indicate whether people are finding what they want?
“People produce timely answers, correctly if possible, whereas computers produce correct answers, quickly if possible. Computers are also extraordinarily tenacious: such a machine has nothing better to do and it never gets bored.”
This spring the nine hardy students who took my USC Annenberg web analytics class came up with wonderful insights that could never have come just from reading a report straight from Google Analytics, Omniture or any other computer. Part of their grade was based on whether the (equally hardy) participating news and nonprofit organizations were actually going to use their analyses for decision-making. This meant each student had to really understand the organization’s strategies, goals and personalities before he/she dug into the data. Here are some of the things we learned.
Content is indeed king, but only if it’s coded
None of the organizations coded site content with enough detail to make decisions about what to do with their sites. Data coming straight out of Google Analytics or Omniture was coded only by date published and by broad categories such as “News.” This is the equivalent of marking a box of books “MISC” – or typing “stuff” into a search engine.
For example, let’s say an organization believes it can build and engage audiences by adding “more local politics and government coverage.” To track whether it did indeed produce “more,” and which coverage actually resulted in increased visits and engagement, the org needs to track how many politics stories it has, by topic and local geographic area, and how much traffic each topic and/or area gets.
Each student developed a taxonomy of codes the organization could use to classify its content, and then manually (I told you they were hardy!) coded sample data pulled from the computers, er, Google Analytics or Omniture.
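To make the idea concrete, here’s a minimal sketch of that kind of coding in Python. The taxonomy entries, URLs and pageview counts below are all invented for illustration; each student’s real taxonomy grew out of the organization’s own strategies and goals.

```python
from collections import defaultdict

# Hypothetical taxonomy: URL prefix -> (topic, area) codes.
TAXONOMY = {
    "/news/city-council": ("politics", "downtown"),
    "/news/school-board": ("politics", "eastside"),
    "/sports/preps":      ("sports",   "countywide"),
}

def code_page(url):
    """Return (topic, area) for a page, or ('MISC', 'MISC') if uncoded."""
    for prefix, codes in TAXONOMY.items():
        if url.startswith(prefix):
            return codes
    return ("MISC", "MISC")

# Sample rows as they might be exported from an analytics tool: (url, pageviews).
rows = [
    ("/news/city-council/budget-vote", 420),
    ("/news/school-board/bond-measure", 180),
    ("/sports/preps/friday-scores", 900),
    ("/weather/today", 75),
]

# Aggregate traffic by topic and area instead of by "MISC".
traffic_by_topic = defaultdict(int)
for url, pageviews in rows:
    traffic_by_topic[code_page(url)] += pageviews

for (topic, area), views in sorted(traffic_by_topic.items()):
    print(topic, area, views)
```

With codes like these in place, “did we produce more local politics coverage, and did anyone read it?” becomes a one-line query instead of a guess.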
Track traffic by day, week and by topic, to determine if the site is getting the traffic it should
Many organizations look at monthly data, and celebrate traffic spikes. Hidden in monthly data, however, are clues to where the site can build targeted audiences and attract advertisers. Health/fitness section traffic, for example, should increase the second week in January, perhaps fall off after Valentine’s Day (!), and increase before swimsuit season.
More content = more traffic
Looking at visits by day of week, we saw radical drops in visits on the weekends. This seemed to be due to little unique local content being posted on Saturdays or Sundays. In this age of the 24/7 newsroom and increased Internet access through mobile, can news orgs afford to make resource decisions based on non-audience-based, chicken-and-egg logic (“We don’t get much traffic on the weekends so we can’t justify adding weekend staff.”)?
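Spotting the weekend drop takes nothing more than grouping daily visits by day of week. A quick sketch with invented visit counts:

```python
from datetime import date
from collections import defaultdict

# Hypothetical daily visit counts (date -> visits), as exported from an analytics tool.
daily_visits = {
    date(2010, 5, 3): 12000,  # Monday
    date(2010, 5, 4): 11500,
    date(2010, 5, 5): 11800,
    date(2010, 5, 6): 11200,
    date(2010, 5, 7): 10900,
    date(2010, 5, 8): 4100,   # Saturday
    date(2010, 5, 9): 3900,   # Sunday
}

# Roll daily totals up by weekday name.
weekday_totals = defaultdict(int)
for day, visits in daily_visits.items():
    weekday_totals[day.strftime("%A")] += visits

weekend = weekday_totals["Saturday"] + weekday_totals["Sunday"]
weekdays = sum(v for k, v in weekday_totals.items() if k not in ("Saturday", "Sunday"))
print(f"Average weekend day: {weekend / 2:,.0f} visits")
print(f"Average weekday:     {weekdays / 5:,.0f} visits")
```

Run over a few months of real data, a table like this makes the chicken-and-egg question explicit: is weekend traffic low because audiences are away, or because nothing new is posted?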
Sometimes you should have separate sites for each audience segment….
Josh Podell, an MBA student, focused on analyzing the e-commerce donation functions on the nonprofit sites. He observed that it’s hard to understand what works and what doesn’t when donors are coming to the site to find out more about the organization but residents are coming to find out about programs and services. Josh’s suggestion: Have a completely separate site – and Google Analytics account – for donors. An org could have much more focused content for each audience, and metrics such as visits per unique visitor, page views per visit and the percent of people who left the site after looking at just the home page (home page bounce rate) would give much clearer indicators for both sites.
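The ratio metrics Josh proposed are simple arithmetic on counts any analytics tool already reports. A sketch with hypothetical monthly numbers for a donor-only site (the metric definitions follow the post, e.g. home page bounce rate = share of home page entrances that left after that one page):

```python
# Hypothetical summary counts for one month on a donor-only site.
visits = 5200
unique_visitors = 3250
page_views = 18200
home_page_entrances = 2400
home_page_single_page_exits = 960

visits_per_unique = visits / unique_visitors
page_views_per_visit = page_views / visits
home_page_bounce_rate = home_page_single_page_exits / home_page_entrances

print(f"Visits per unique visitor: {visits_per_unique:.2f}")
print(f"Page views per visit:      {page_views_per_visit:.2f}")
print(f"Home page bounce rate:     {home_page_bounce_rate:.0%}")
```

Because each site serves one audience, movement in any of these ratios can be read as that audience responding, rather than as two audiences canceling each other out.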
….but sometimes you shouldn’t.
One of the organizations had its main site on one Google Analytics account, and its blog on another. Dan Lee, a graduate Strategic Public Relations student, noticed extremely high home page bounce rates from returning visitors compared to new visitors.
With the question of why burning in his head, Dan looked in detail at the site content and structure, and hypothesized that returning visitors were most likely to go to the home page, see the teaser about the latest blog entry, and immediately “leave” the site to go to the blog. The Google Analytics account for the blog did indeed show that its top referring site was the main site.
Google Analytics was “correct,” but only a human could have produced the right answer for the organization.
I exercise regularly and don’t smoke, but I avoid reading Prevention.com. It’s just too annoying to be reminded of all of the other “smart ways to live well.” When I see “by the editors of Prevention.com,” I have this picture in my mind of a bunch of really healthy people popping out cheerful stories like “5 Vitamins Your Bones Love” and “10 Reasons You’re Always Exhausted.”
Now I have another (annoying) vision, thanks to MinOnline‘s story about how well Prevention is using web analytics. Of course a staff that is so pragmatic and probably always mentally alert would resist “going for the cheap link grab and traffic spike” – the junk food of web analytics.
Prevention stays “on its own brand message and [courts] the kinds of audiences that it and its advertisers really want. ‘We got back to engaging with our customer in the ways we knew they wanted us to engage with them,’ says [vp/digital Bill] Stump. Fishing for any and all eyeballs and courting simple traffic spikes in the search-driven universe doesn’t pay off in the end. ‘You get waves of traffic, but the tide goes back out and what are you left with?’ Instead, by keeping to the needs of the ‘core customer’ in everything that goes out to syndication or into the e-mail newsletters, prevention.com is courting the people who tend to stay.

“Now, each big wave raises the sea level for all of prevention.com’s metrics, says Stump. In the last two years, overall page views climbed 60%. In the last year, the number of visits per user went up 12%. But it is the engagement metrics of which Stump is proudest. ‘The number that warms my heart,’ he says, ‘is page views per visitor that are up 49%.’ That means the new visitors are sticking with the site and drilling much deeper than they ever have before. ‘In general, advertisers want an engaged audience. They want the metrics that show that people value your brand and come to you for something that is unique. We own natural health and fitness and beauty. We are the authentic voice.'”
In the late 1990s, “America Online” was the shiny new company everyone watched, feared and tried to copy. Just “AOL” now, it’s hardly as fresh or inspiring. With its new CEO, new logos and its use of web analytics to select the stories it covers and evaluate its reporters, has AOL once again become a news organization to watch?
AOL’s announcement that it will employ “judicious use of Web-analytics software” sparked the expected flutter of coverage. It’s admitted to using data to inform (dictate?) news decisions, so you could be led to believe that AOL is adopting a true audience-based approach. However, after reading the Feb. 22 story in BusinessWeek and the reactions gathered by Media Post News, it seems like AOL is still using a traditional advertising-based mass media strategy. It’s still trying to be all things to all people. It’s just using web analytics to decide what those things are.
“Audience growth and audience engagement have to be the things that we judge the most off of our journalist investments,” AOL CEO Tim Armstrong is quoted as saying. So far, so good.
Armstrong also said that “brand ads should be a lot bigger on the Internet today,” talking about how online advertising revenue should pick up. But there was no mention about AOL’s own brand strategy, something that would answer the question of “What is AOL?” for audiences and advertisers once and for all. On which niches will it focus? How much of its content will be unique and compelling enough to those niche audiences so that they’ll come back regularly?
“The right approach to the content business is to KNOW YOUR AUDIENCE, or the people that come to your site, and create a product for THEM. AOL’s approach is clearly not centered on this….it’ll drive up page views and therefore, revenue but that’s not likely to last as the industry becomes more analytics savvy. Today, a million uniques with zero session times, high bounce rate and no repeat visitors isn’t seen as a sign of a lack of audience but in the not too distant future it will.”
I haven’t researched AOL myself, so I don’t know if all of the details in the BusinessWeek and Media Post News stories really reflect what AOL is doing. So I’ll just note some things news orgs should think about when using web analytics to inform news decisions and evaluate journalists.
Evaluating success (either a site’s or a journalist’s) by total page views doesn’t work. A large number of page views may just indicate visitors got there by mistake or clicked around trying to find something. Plus, dynamic content (Flash, etc.) will not be counted as page views. Page views can be a useful metric, but only when combined with other metrics – such as ratios – that give context.
Engagement can’t be determined by web traffic or behavioral data alone. Attitudinal research is essential to find out why people do or don’t come to a site regularly, what they want and what they’re not finding.
If journalists are going to be held accountable for web traffic and audience engagement, will they also have control over the factors that drive traffic, such as design, navigation and marketing? Or will they just submit their stories and hope for the best?
“AOL is even considering sharing a portion of quarterly profits with staffers whose work fetches the most page views.” (BusinessWeek)
How will traffic goals be set? If journalists will be rewarded for generating “traffic” (however it’s defined), will they be fired if they don’t? Will the benchmarks or starting points – and the time journalists have to reach the required traffic levels – be based on whether a topic is already established or whether it’s one a news org wants to nurture and grow because the topic is essential to achieving its strategy?
“Tacked to the newsroom walls in AOL’s downtown Manhattan headquarters are pages and pages of Web traffic data.” (BusinessWeek)

Uh, this would cause even me to shut down. It’s definitely not “judicious use of Web-analytics software.” Does AOL have a few key performance indicators that everyone understands and on which they can focus as a team?
Software and reports don’t make decisions; people do. Successful use of web analytics depends on the decision-makers understanding and using the information correctly. If news orgs believe the success of their websites depends on being truly audience-focused then they must also ensure the analytical resources and processes are there as well.
AOL may stumble again but at least it’s trying something different. I look forward to learning from AOL whether it succeeds or fails.
YouTube‘s become a verb and a household name, but I’ll always see it as an organization that’s brought metrics into the lives of the common people (those who have broadband Internet, anyway). The “Most Popular” and “Featured Videos” are seen worldwide, sometimes garnering millions of views. “Hey, did you see….” is usually accompanied by something like “…and it has x million views on YouTube!”
Number of views is good for little other than bragging rights. It’s one of the “famous” metrics (web analytics guru Avinash Kaushik‘s term) that “are staring you in the face when you crack open any analytics tool” but “barely contain any insight.”
Yep, for anyone in the content business, number of views is right up there with hall of famers number of page views and monthly unique visitors.
YouTube has pushed all of its account holders – no matter how amateur – to use meaningful metrics. In March 2008 it launched Insight, its “video analytics tool for all users,” along with some almost-preachy instructions on how to use metrics to get more people to watch your videos and, of course, come more often to YouTube.
The Insight tool allows you to track “community engagements” (there’s that word again) in terms of ratings, comments, and favorites. YouTube doesn’t want you to settle for people just watching your video. People have to show, in a measurable way, that they not only watched it but also reacted to it.
At the very least people should give a star rating (one is bad, five is good). Rating is easy, quick and anonymous. Tagging a video as a favorite is the next rung. And if they’re really engaged, they’ll leave comments.
But, as anyone who’s ever spent any time at all on YouTube knows, many comments are spam, obscene and irrelevant – just noise. The value of social media metrics lies in looking beyond what James Kobelius in Information Management points out is an “often low and laughable” signal-to-noise ratio.
Kobelius notes that “if you crawl, correlate, categorize, mine, and explore it with the right tools….[this unstructured information] can yield unexpected insights….The intelligence value of any individual tweet [or comment] in isolation is negligible….Intelligence emerges from the aggregate.”
If you can stomach a few obscenities, look at this thought in Encyclopaedia Dramatica about YouTube view fraud and how the ratio of VPC, or views per comment, “is the most accurate way to determine if anyone” cares. “A high VPC usually means view fraud has been committed.”
The example in ED shows that a video with 136,097 views and 3,529 comments has a VPC of 38.6, a low number that indicates this is a video “that people actually find funny.” The video with 296,413 views and 541 comments, and thus a VPC of 547.9, is probably something nobody really cares about.
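The ratio itself is trivial to compute; here it is run over the two example videos, with a helper function of my own devising:

```python
def views_per_comment(views, comments):
    """VPC: lower means more of the audience cared enough to comment."""
    return round(views / comments, 1)

# The two videos cited in the Encyclopaedia Dramatica entry.
print(views_per_comment(136097, 3529))  # low VPC: viewers actually engaged
print(views_per_comment(296413, 541))   # high VPC: views nobody cared about
```

The hard part isn’t the division, of course; it’s deciding what counts as “high” for a given topic, which is why benchmarking across many videos comes first.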
I calculated some VPCs from this week’s “Most Popular” videos and came up with some numbers that I don’t know what to do with yet. To see if VPC can be used as a key performance indicator, I’ll need to calculate VPCs and crawl through the cacophony of a variety of news videos. VPC may never be “famous,” but it might be insightful.
It's almost Valentine's Day, so let's muse again about what it means to be "engaged." In this day of figuring out whether people will pay for web news, defining success by measuring engagement is more important than ever.
It doesn't matter whether you love or hate a news organization if you're engaged with it – as demonstrated by behaviors such as going to the site frequently, contributing content, e-mailing a story, rating a video or paying a monthly subscription fee.
Many worthy people have come up with all kinds of complicated mathematical formulas for measuring and tracking engagement. Nothing's stuck. In other words, just because a number was produced ("Disaster! Our engagement rating was 14 last month but our goal was 19!") doesn't mean site traffic and other key performance indicators move in conjunction with it. A metric is just a number if it doesn't move up or down as a result of some action or mistake on your part.
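For illustration only, here’s the shape such a composite formula typically takes. The weights and inputs below are entirely made up, which is precisely the problem the paragraph describes: nothing validates them.

```python
# A toy composite "engagement index" of the kind the post describes.
# The weights are arbitrary -- and unless the index moves when you act
# (or err), it's just a number.
WEIGHTS = {
    "visits_per_unique": 5.0,
    "comments_per_visit": 50.0,
    "stories_emailed_per_visit": 30.0,
    "subscription_rate": 100.0,
}

def engagement_index(metrics):
    """Weighted sum of per-visit behaviors; units are meaningless by design."""
    return sum(WEIGHTS[name] * value for name, value in metrics.items())

last_month = {
    "visits_per_unique": 1.6,
    "comments_per_visit": 0.02,
    "stories_emailed_per_visit": 0.01,
    "subscription_rate": 0.05,
}
print(round(engagement_index(last_month), 1))
```

Double the comment weight and the “disaster” becomes a triumph, without a single reader behaving differently. That is why no formula of this kind has stuck.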
Although measuring engagement still eludes us, I hope news orgs will still adopt an engagement philosophy and an audience-focused culture that will guide the decisions that do lead to measurable results.
A philosophy still needs some definition. I like this quote from Dave Smith, CEO of Mediasmith, a digital advertising agency. The interview is in "Digital Engagement," a book by Leland Harden and Bob Heyman.
"Engagement is an unconscious tick of the mind that causes you to think differently about and notice a brand differently in the future."
In the same interview Smith also quotes Erwin Ephron, perhaps the "founder of modern media planning," as saying that "Media engagement and advertising engagement are very different things….Historically, media are measured by audience delivery. Advertising is measured by response. Engagement-based ratings would measure media by response."
In other words, it's not enough now just to put content out there and hope your audiences will like it. Traditional audience research that produces various numbers for loyalty and satisfaction isn't enough either. Audiences can't just tell you how they feel. They have to show you.
The video, one of The Tribune’s “Stump Interrupted” series, uses pop-up bubbles and illustrations to add context and value to a normally boring but important story. The pop-ups are entertaining without being silly.
When KBH is saying, “Our taxes have gone up too much in the last ten years,” the pop-up points out that “But…since 2003, Texas still had the 14th lowest per capita tax increase in the country.”
KBH: “I think that we are seeing too much power in one person, the power of appointment.”
Pop-ups: A large hand illustrating someone being appointed glides in from the left, followed by the fact that “Governor [Rick] Perry has made about 5,530 appointments since first taking office.”
On the site, people can also see the sources The Tribune used for the pop-ups.
The metrics angle: Counting how many times a video was viewed doesn’t give any info on whether the viewer was engaged. The more relevant measure is how much of the video was viewed, and whether the video was viewed from beginning to end.
I would also look at video metrics by topic, and set goals accordingly. I would imagine (no, really?) that the number of complete views of a Dallas Cowboys video is usually much higher than that of anything having to do with politics, even in Texas.
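Percent-watched and completion metrics are easy to derive once per-view watch times are available (most video platforms expose something along these lines). A sketch with invented watch-time data for a hypothetical three-minute video:

```python
# Hypothetical per-view records: seconds watched of a 180-second video.
VIDEO_LENGTH = 180
seconds_watched = [180, 180, 45, 180, 12, 180, 90, 180, 180, 30]

views = len(seconds_watched)
complete_views = sum(1 for s in seconds_watched if s >= VIDEO_LENGTH)
avg_pct_watched = sum(seconds_watched) / (views * VIDEO_LENGTH)

print(f"Views:           {views}")
print(f"Completion rate: {complete_views / views:.0%}")
print(f"Avg. % watched:  {avg_pct_watched:.0%}")
```

Two videos with identical view counts can have wildly different completion rates, which is exactly the information a raw view count hides.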
The Texas Tribune got California-born-and-bred me to watch a KBH video from beginning to end. I’m now more interested in both Texas politics and in how The Tribune covers it. Imagine how engaged a Texas resident who has a stake in this would be.
Actually, The Tribune doesn’t have to completely guess at this. In addition to commenting and e-mailing the story, people can rate a story as a “must read.”
I’m really intrigued about what The Tribune will do next. It’s a nonprofit news org that, according to WebNewser, didn’t cover the Fort Hood shootings because it’s “dedicated to covering ‘the politics and policy of Texas state government.'”
I love this focus on identifying a niche audience and topic, and sticking to serving the needs of that audience. WebNewser reported that editor Matt Stiles said that the Fort Hood story just “wasn’t our story. Should we have just been one more news organization rushing to Fort Hood? I don’t think so.”
The Tribune’s a great example of a truly audience-focused news organization with unique and compelling content that provides value. Despite being staffed by “newspaper refugees,” it’s refreshingly not content-focused. It doesn’t build the content first and then hope the audience will come.
I’ve just gotten back from the 140 Characters Conference in LA where the message, loud and clear and 10 minutes per speaker at a time, was that it’s the quality of your followers that matters, not the quantity.
The first #140conf in New York in June was all about the unique communities that Twitter inspired. The dominant sponsor was Hootsuite, personified by a large owl walking around hugging people. Ann Curry duked it out with Rick Sanchez. Wyclef Jean showed up, late of course, but illustrating the importance of authenticity. Attendees bonded over the duct-taped power outlets.
Five months later, it appears that Twitter has…matured. The speakers in LA weren’t giddy. The lead sponsor was Kodak, represented by CMO Jeffrey Hayzlett, a glossy brochure touting Kodak’s “convergence media tactics” and coupons for 15 percent off Kodak products. You can’t have either duct tape or power outlets in the Kodak Theatre (where the Academy Awards are held) so the crowd was often bigger in the lobby than in the auditorium.
I still had fun at #140conf LA – it is Twitter, after all – but the biz talk was pervasive: strategy, goals, objectives, processes, systems, results, the four Ps and the four Es, one of which was, of course, engagement.
Because it’s easy to gather and it looks like circulation and readership, the number of monthly unique visitors continues to be a key indicator of online success for news orgs. This is really dangerous, especially if used to develop news business models.
The total number of monthly UVs just doesn’t give any information about how engaged audiences are. Let’s say you have 100 million monthly uniques, as paidContent.org reports the new Steve Brill Journalism Online venture is aiming for.
This number doesn’t tell you whether those 100 million visitors visited once or 10 times, or whether they went to one page or to 20.
You really need to know the level of engagement to sell online advertising. And, you really need to know how engaged people are if your business model depends on paid subscribers or content.
According to paidContent.org, Journalism Online is counting on about 10 percent of its news affiliates’ audiences to pay for content. Sounds like a realistic, reasonable number, right?
No, it’s faulty business logic. Simply assuming a small percent of any total audience will do anything is really dangerous, and something that savvy entrepreneurs know or learn in Marketing 101. “There are 100 million people living in this area of the U.S. If I build a better mousetrap that costs $1, and if only 1 percent of those 100 million buy my mousetrap, I’ll have a million dollars!”
First, not all 100 million care about trapping mice. Others won’t pay even $1 for a trap. Still others don’t live near a store where it would be sold, and wouldn’t order it online or any other way.
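The mousetrap math only works if you multiply those filters instead of skipping them. A toy funnel (every rate below is invented for illustration):

```python
# The "1 percent of everyone" pitch vs. a hypothetical funnel:
# each stage filters the market, and the factors multiply.
market = 100_000_000
naive_buyers = market * 0.01  # "if only 1 percent of 100 million buy..."

funnel = {
    "has a mouse problem":       0.05,
    "would pay $1 for a trap":   0.20,
    "can actually get the trap": 0.50,
}

reachable = market
for stage, rate in funnel.items():
    reachable *= rate

print(f"Naive estimate:  {naive_buyers:,.0f} buyers")
print(f"Funnel estimate: {reachable:,.0f} buyers")
```

Half the naive number, from three mild-looking assumptions. The same multiplication applies to “10 percent of our audience will pay for content.”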
Estimating audiences is an art and a science. Estimating the audiences for paid content involves more art than science, but I hope news orgs will start with understanding what online audiences want. It doesn’t do much good to set these types of numbers based on what the news orgs desperately need to meet their revenue goals.
Maney points out not only that “one-third of all American Internet users rated something online,” but also that “the proliferation in ratings is already changing societal dynamics.”
The story was a reminder that:
Ratings and comments are essential to have on everything in a news organization’s collection of sites, even if it’s just a simple “like this” rating (e.g., “89 people like this story”).
News orgs must candidly assess their standing vs. niche competitors such as TripAdvisor, Yelp and local news blogs. I’m still seeing news orgs comparing themselves only against their own kind, e.g., daily newspapers vs. weeklies and magazines. And it’s interesting – sad? – that Maney’s article doesn’t use a single news org in its examples.
With comments, it’s all about the quality of the comments and the contributors, not the quantity. This means defining and measuring success will be labor-intensive and relatively subjective.
To be truly usable, attitudinal surveys must be highly targeted by category and cover both current users and non-users. I mourn the money and time wasted by those “market studies” that focus mostly on the news org and its position in a broad geographic region rather than on the current and potential audiences for specific niches.

Surveys should not only ask where people go for information to make decisions but also where they submit ratings and comments, and how often. What a person reads vs. what he rates vs. what he comments on will differ by topic.