Believing in time-on-site

A Sept. 12 Pew Center study found that “Americans are spending more time following the news than over the past decade.”  Great news – or is this yet another misleading key performance indicator, as my previous blog post about time spent on site might suggest?

No – I like the Pew Center study.  It’s a study of attitudes and feelings.  Good old-fashioned survey research (with all of its mind-numbing statistical sampling) is an essential component of web analytics.  Web site traffic data is audience behavior – the “what.”  News orgs have to have attitudinal research to understand the “why” so they can attract audiences that aren’t coming to their sites.

The data you get from Google Analytics or Omniture is enticing, isn’t it?  (Work with me, here….)  Oh wow, we can track every click!  Ah, yes, we can track every click – but we can ONLY track clicks on OUR site, not on anyone else’s.

A time-on-site calculation can only be harvested for you if someone clicks on a page in your site and generates a page view that’s counted by your Google Analytics/Omniture account.  Time-on-site is the time in between the first page clicked on YOUR site and the last page clicked on YOUR site.
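To make the mechanics concrete, here’s a minimal sketch of that calculation in Python.  The visit data is invented and real Google Analytics/Omniture processing is far more involved, but the arithmetic is the same: the clock runs only between the first and last page view recorded on your site.

```python
from datetime import datetime

# A minimal sketch of the time-on-site arithmetic (invented data; real
# analytics processing is more involved).  A "visit" is just a list of
# (timestamp, url) page views on YOUR site.
def time_on_site(pageviews):
    """Seconds between the first and last page view in a visit.

    A single-page visit (a bounce) has no second timestamp, so it
    contributes zero -- it simply vanishes from the metric.
    """
    if len(pageviews) < 2:
        return 0.0
    times = sorted(ts for ts, _ in pageviews)
    return (times[-1] - times[0]).total_seconds()

visit = [
    (datetime(2010, 9, 12, 8, 0), "/home"),
    (datetime(2010, 9, 12, 8, 3), "/story-1"),
    (datetime(2010, 9, 12, 8, 10), "/story-2"),  # reader lingers 15 minutes here
]
print(time_on_site(visit))      # 600.0 -- the 15 minutes on the last page are invisible
print(time_on_site(visit[:1]))  # 0.0  -- a bounce never existed, time-on-site-wise
```

Note that the ten minutes between the first and last click are all the tool can see; everything after the final click is lost.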

This means:

1.   If someone clicks on YOUR site and then immediately goes to another site (a bounce), it’s not included by Google Analytics/Omniture in the time-on-site calculation.  It’s like it never existed, time-on-site-wise.

It IS counted as one visit and as one page view.  So that means that all of those people who come to your site regularly (you know, the ones we really like) just to get the latest on a story aren’t counted in time-on-site – and they should be.

2.    If someone is on his/her third page in your site and opens another tab and goes to another site for twenty minutes before returning to your site and clicking on another two pages, those twenty minutes are included in time-on-site – and they shouldn’t be.

3.    The time a person spends on the last page of your site isn’t counted.  If someone clicks through a few pages on your site and spends 15 minutes utterly absorbed in a story before leaving your site to go pay bills online, those 15 minutes aren’t included in time-on-site – and they should be.

So, time-spent-on-site is always either over- or under-counted.  And you’ll never know which – that makes the metric utterly unreliable as an indicator of success or failure.  You can’t make a decision with this data because you can’t know whether your action – a section added, the number of long-form videos reduced – caused time spent to go up or down.

More importantly, these days it really doesn’t matter how much time people actually spend on a news site.  What matters much, much more is whether people are engaged with the news – whether they believe news sites are an essential component of their lives, so much so that they come back repeatedly, rate a story with five stars, leave comments, click on an ad, and otherwise use the site.  It doesn’t matter whether they spend three seconds or three hours.

That’s what makes the Pew Center finding so exciting (surely you’re still with me on how great web analytics is?).  People actually said they’re spending more time with the news now than they did a decade ago.  It doesn’t matter whether they actually are (!) – they believe they are.

I wish every news org could afford its own Pew Center-like attitudinal research study so it could track how engaged its own targeted audiences are (or aren’t), and to understand how to get and retain new audiences.  The information wouldn’t always be pretty, but at least it would be data that could make a difference.

Number wars

What does the war in Iraq have in common with the war news organizations are fighting for their survival?

Nothing at all, unequivocally.

However, I was struck by Owen Bennett-Jones’ closing of his August 27 BBC story about the disparities in the reports of the numbers of civilians killed in Iraq since the war started in 2003:

“It remains true that people tend to cite a number that reflects not their view of the quality of the research but rather their view of the war.”

In other words, getting someone to use your numbers – your definition of success – is just as hard as getting them to change their thinking.

In the world of web analytics, people “tend to present the metric that’s most likely to work in their favor.  They’re tracking the wrong way or they don’t want to look at a particular set of data. They are wrongly using analytics as what [Robert Rose of Big Blue Moose] calls a Weapon of Mass Delusion….Worst of all, they are not learning to apply insight to action.”

— from “A Web Analytics Trap,” by Jim Ericson, Information Management, August 13, 2010


Page views: bad metric #3

Why do news organizations persist in using total page views as a measure of success?  Perhaps because if you're afraid of numbers then you're even more afraid of bad numbers, or numbers that tell you that your site isn't as successful as you want.

As with unique visitors and time on site, page views is a deeply flawed metric for understanding how a news organization is growing and retaining audiences.  

If the number of page views goes up, it could be a good thing.  Or, it could be bad.

If page views go down, it could be a bad thing.  Or – you guessed it – it could be good.

We would all like to think that a soaring number of page views means lots of people are eagerly pawing through our sites reading everything that's written.  However, how many times have you gone to a site and clicked on, say, 12 pages, fruitlessly looking for something? 

This is counted as:

1.    One unique visitor
2.    One visit
3.    12 page views

And one dissatisfied person who may not come back.
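A quick sketch shows how that frustrating session rolls up into the three counts above.  The log format here is made up – real analytics tools work from tagged page views, not a tidy list – but the aggregation is the same:

```python
# Sketch of how one fruitless 12-click session rolls up (invented log
# format: each row is (visitor_cookie, visit_id, url)).
log = [("cookie-abc", "visit-1", f"/page-{i}") for i in range(1, 13)]

unique_visitors = len({cookie for cookie, _, _ in log})
visits = len({visit_id for _, visit_id, _ in log})
page_views = len(log)

print(unique_visitors, visits, page_views)  # 1 1 12
```

Twelve page views look like engagement on a dashboard; nothing in the numbers records that the visitor never found what she wanted.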

The page views metric rewards the bad design and navigation that many news sites have (sorry).  Most news sites persist in using section titles that are the same as their legacy media product (e.g., "Local News," "Life"), leaving audiences – if they're so inclined – to have to click numerous times before landing on a story about a particular city or activity like gardening.   

Or, a site breaks up a story into multiple pages, which can be annoying to a reader and reduces the possibility the reader will read the entire story and rate it, e-mail it or leave a comment.  What could be counted as one page view with a comment is counted as, say, five page views. 

If a site is redesigned and readers can find what they want with fewer clicks, total page views will – should – go down. 

And dynamic content – content that changes on the page without the reader clicking anything – isn’t counted at all.  Streaming stories, videos, widgets, Flash applications, podcasts?  One page view.

To truly grow, a news org must understand every action its audiences are taking on its sites.  These are challenging times that require news sites to experiment and try many different things.  Not all things will work, which means sometimes the numbers will be bad.  But – you guessed it – that's a good thing.  We have to know if something's not working so we can fix it. 


“A trend is a friend”

Someday I’ll give up web analytics and move on to something real, like pottery, but until then I’ll keep fighting the good fight to get news orgs to stop using monthly unique visitors as an indicator of success.

It’s tempting, I know, to count UVs because the number of monthly subscribers is the standard for print, and the number of people Nielsen says watched a program is the standard for TV.

But technology has made everything different.  Strategy is more important than ever, and understanding audiences, not just counting them, is essential.

UV overcount

UVs are counted by counting the number of cookies, or computers, that go to a site.  This means UVs are always significantly overcounted or undercounted.

If one person uses three computers, it’s counted as three unique visitors.

UV undercount

Conversely, a number of people going to one computer – for example, at a school or library – means that UVs will be undercounted.
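Both failure modes are easy to see in a sketch.  Assuming the tool simply counts distinct cookies – which is roughly what happens – and with invented visitors:

```python
# Sketch of the cookie-counting problem (invented visitors).  Each page
# view carries the cookie of the browser/computer it came from.
pageview_cookies = (
    ["alice-laptop", "alice-phone", "alice-work-pc"]  # one person, three cookies
    + ["library-pc"] * 4                              # four people sharing one computer
)

reported_uvs = len(set(pageview_cookies))  # what an analytics tool counts
actual_people = 1 + 4

print(reported_uvs, actual_people)  # 4 cookies counted vs. 5 real people
```

One person on three machines inflates the count by two; four people on a shared library machine deflate it by three.  The reported number is wrong in both directions at once.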

Neil Mason, a web analytics guru who does some pretty thorough research for his clients, noted in a recent ClickZ column that UVs are usually overcounted.

(So that’s the reason why news orgs use total UVs – would they use this number if it were consistently undercounted?  Don’t think so…)

Mason notes that while the UV metric is “particularly important for those sites that are dependent on advertising revenues as a major source of income,” it “must always be treated with caution and never taken at face value.”

Believe me, it’s rare to hear a web analytics expert use the word “never.”

The advice from the ever-pragmatic Mason:  “A trend is a friend.”  Analyzing significant increases or decreases over time will give news orgs the information needed to build audiences.

(Trends are indeed friends, but don’t even think about using counts of Facebook friends and Twitter followers!)
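In that spirit, here’s a toy example of trend analysis.  The monthly UV counts are invented; the point is only that the month-over-month direction can be meaningful even when the absolute counts aren’t:

```python
# "A trend is a friend": month-over-month change on invented monthly UV
# counts.  The absolute level is unreliable, but the direction over time
# is still informative.
monthly_uvs = [120_000, 126_000, 118_000, 140_000]

pct_changes = [
    round((curr - prev) / prev * 100, 1)
    for prev, curr in zip(monthly_uvs, monthly_uvs[1:])
]
print(pct_changes)  # [5.0, -6.3, 18.6]
```

If the counting bias is roughly consistent from month to month, it cancels out of the percentage change – which is why the trend is more trustworthy than the total.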


Mobile muddle

How do you measure mobile?  It’s a mess, even for web analytics gurus like Judah Phillips at Monster.com, who said as much in an interview with IQWorkforce:

“The mobile space is interesting to me too, but it’s very much like traditional web analytics on smaller screens with some absolutely crazy data collection, sessionization, and visitorization challenges.”

Huh?  Let’s start at the beginning.

Like millions of other people, I have a mobile phone.  However, I just discovered that I don’t have a “phone.”  I have a “device,” or, simply, “a mobile.”  That’s right – “mobile” is now a noun.

I guess the definition of “phone” is now limited to something on which you only make or take calls.  And, it turns out, even the lowest end mobiles – or lean mobiles – at least have texting capabilities.  (If you want to sound like a techie, use SMS, or short message service.  Sheesh.)

Up until last year I had a lean mobile with a camera.  I loved using the camera but didn’t send photos to anyone because it would have cost me $15, just like that.  I didn’t send any texts because each one made or received cost $0.20, which meant every time I got an unsolicited text from a company or an unknowing friend I was a little annoyed.  It wasn’t the cost.  It was just the principle of it all.  I neither wanted nor needed these texts, and I had no control over receiving them.

I’m on a smartphone now, an iPhone 3GS, for no particular reason other than it sounded fun.  I’m paying $5/month extra to make or get 200 texts, and I’ve found texting pretty useful – so useful that if I find myself doing more than 200 texts/month I’ll probably pay the AT&T rip-off unlimited-text fee of $20/month.  I’ve also been doing everything else everyone else does – reading news stories, tweeting, updating my Facebook status, checking in for flights, buying things.

And I have no idea whether I’ve been using mobile apps or the mobile web.

It turns out “There are ‘three worlds’ in mobile: apps, mobile Web and SMS. In the case of smartphone owners, they will use all three to varying degrees.”  (From Internet2Go.net, March 2009.)

You know what “three worlds” means.  Three different sets of metrics.

And, guess what? Apps are device-specific, which means there are different sources of metrics for iPhone apps (which, contrary to popular belief, haven’t taken over the world), BlackBerry, Palm Pre, etc.

And….

Mobile web browsers (e.g., Safari on an iPhone) are also device-specific.

And….

All of those mobile usage numbers from comScore, Ground Truth (a mobile measurement firm) and the like only measure one mobile world, the world of mobile web browsers.  They don’t measure usage from apps. And how many people use mobile apps rather than the mobile web, especially for Facebook and Twitter?  I dunno – a lot?

The mobile usage numbers may all be flawed, but they all do point to mobile’s continuing rapid growth.  So, unfortunately, that means we’ve got to understand what new nouns like “sessionization” and “visitorization” mean.


An ounce of Prevention

I exercise regularly and don’t smoke, but I avoid reading Prevention.com. It’s just too annoying to be reminded of all of the other “smart ways to live well.”  When I see “by the editors of Prevention.com,” I have this picture in my mind of a bunch of really healthy people popping out cheerful stories like “5 Vitamins Your Bones Love” and “10 Reasons You’re Always Exhausted.”

Now I have another (annoying) vision, thanks to MinOnline’s story about how well Prevention is using web analytics.  Of course a staff that is so pragmatic and probably always mentally alert would resist “going for the cheap link grab and traffic spike” – the junk food of web analytics.

I haven’t been sleeping well because I think too much (the top reason people don’t get enough sleep and are therefore exhausted), so I’ll just plop in these two paragraphs verbatim from Steve Smith’s MinOnline story, “At the Building Prevention.Com, Only The Abs Are Flat.”

Prevention stays “on its own brand message and [courts] the kinds of audiences that it and its advertisers really want. ‘We got back to engaging with our customer in the ways we knew they wanted us to engage with them,’ says [vp/digital Bill] Stump. Fishing for any and all eyeballs and courting simple traffic spikes in the search-driven universe doesn’t pay off in the end. ‘You get waves of traffic, but the tide goes back out and what are you left with?’ Instead, by keeping to the needs of the ‘core customer’ in everything that goes out to syndication or into the e-mail newsletters, prevention.com is courting the people who tend to stay.

“Now, each big wave raises the sea level for all of prevention.com’s metrics, says Stump. In the last two years, overall page views climbed 60%. In the last year, the number of visits per user went up 12%. But it is the engagement metrics of which Stump is proudest. ‘The number that warms my heart,’ he says, ‘is page views per visitor that are up 49%.’ That means the new visitors are sticking with the site and drilling much deeper than they ever have before. ‘In general, advertisers want an engaged audience. They want the metrics that show that people value your brand and come to you for something that is unique. We own natural health and fitness and beauty. We are the authentic voice.’”


Wasting time

Time spent on a site or a visit ranks right up there with total page views and monthly unique visitors as widely quoted metrics masquerading as indicators of success for news organizations.

No, it's not a crime to misuse a metric, but isn't it a shame to waste your time on something that's not absolutely essential to your site's success? 

Plus, the way that time spent is calculated is flawed.  All web metrics are flawed somewhat, but time spent is really misleading.

More on the ugly methodology later – let's tackle time spent's uselessness first.  In other words, if the methodology were acceptable, would time spent still be a key performance indicator?

Advertisers have always made decisions based on the level of engagement a news org's audiences have with its brand and content.  But both content and the ways people use and interact with content are different – and thus the way engagement is measured is different, too. 

In the past, time spent was an important measure of engagement for news orgs and advertisers.  People spent whole chunks of time with one medium or another.  Readership surveys measured time spent per day or per week.

Because these were surveys, time spent was based on self-reported information.  It was what people said they did vs. what they actually did.

But it didn't matter whether what people said matched what they did.  What mattered was how engaged people felt.  People who reported they spent an hour a day with Monday's newspaper but actually only spent twenty minutes believed they spent a large chunk of time and attention with a news org.

In stark contrast, web advertising decisions depend on knowing actual behavior as reported via rows upon rows of numbers ruthlessly pouring out every second.   Among many other things, advertisers track the number of times their ads come up and are clicked upon. Sites and audiences are more niche and are highly segmented.   The algorithms for and definitions of "engagement" vary for every site and every company.  

Time spent just isn't a good indicator of engagement.  Someone who spends five minutes a day on a site, goes to five different stories each visit and adds comments twice a week is clearly more engaged than someone who comes onto a site for 30 minutes a week and clicks idly on a few pages while talking on the phone.    

How many times have you spent 30 minutes or so on a site, flipping and flapping through what seems like a million page views in a fruitless attempt to find something?  Maybe you spent 30 minutes in such a visit once – and never went back.

A news org's success in the long-term will be based not on how much time people spend on a site but what they do once they're there.

————–

How Time Spent on a Site is Calculated
