I'm getting Twit-fatigued from all of the phenom Tw-stats, but I can't resist pointing out how Nielsen's May 2009 report on Twitter usage illustrates the problems with using average time on site as a rough gauge of engagement.
Nielsen reported that the average time per person on Twitter in May 2009 was a little over 17 minutes, an increase from about six minutes in May 2008.
This is an average, and averages hide the distribution: you don't know how many people spent 20 hours a day on Twitter, and how many spent zero.
We do know, from a recent Harvard Business Review report, that:
- the top 10 percent of "prolific" Twitter users produce over 90 percent of all Tweets, and that
- the median number of lifetime Tweets per user is only one!
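A quick synthetic sketch shows how a distribution like this behaves. The numbers below are invented for illustration (they are not Nielsen's or HBR's data), but they reproduce the same shape: a respectable-looking mean, a median of one, and a top 10 percent that accounts for over 90 percent of output.

```python
import statistics

# Hypothetical population: 90 users who tweet once, 10 "prolific"
# users who tweet 100 times each. All numbers are made up.
tweets = [1] * 90 + [100] * 10

mean = statistics.mean(tweets)      # pulled up by the heavy users
median = statistics.median(tweets)  # what the "typical" user looks like

# Share of all tweets produced by the top 10% of users
top_10pct_share = sum(sorted(tweets)[-10:]) / sum(tweets)

print(f"mean={mean}, median={median}, top 10% share={top_10pct_share:.0%}")
# → mean=10.9, median=1.0, top 10% share=92%
```

The mean (10.9 tweets) says nothing useful about the typical user, whose median is 1 — exactly the trap with Twitter's average time on site.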
My advice: Don't use time on site as an indicator of success unless you're willing to really dig into your data and segment out heavy users vs. light users.
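What that segmentation might look like, as a minimal sketch: the per-user minutes and the 60-minute "heavy" cutoff below are assumptions for illustration, not real data.

```python
# Hypothetical per-user monthly minutes on site; values are invented.
minutes_per_user = [0, 0, 1, 2, 3, 5, 8, 40, 300, 1200]

HEAVY_CUTOFF = 60  # assumed threshold separating heavy from light users

heavy = [m for m in minutes_per_user if m >= HEAVY_CUTOFF]
light = [m for m in minutes_per_user if m < HEAVY_CUTOFF]

overall_avg = sum(minutes_per_user) / len(minutes_per_user)
print(f"overall average: {overall_avg:.0f} min")
print(f"heavy users: {len(heavy)} averaging {sum(heavy)/len(heavy):.0f} min")
print(f"light users: {len(light)} averaging {sum(light)/len(light):.0f} min")
```

Here the overall average (about 156 minutes) describes neither group: two heavy users average 750 minutes while the other eight average about 7. The headline number is a fiction until you split it.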
Here are two more observations – bashes, really – on using time on site.
1. The HBR report notably doesn't use time on site. It studied the number of Tweets per person. The number of Tweets – that is, what users actually produce – is a much better indicator of what really matters.
Because Twitter is a social media site that depends on the number of people contributing and sharing Tweets, it just doesn't matter how much time someone spends on the site. A Twitter user who spends five minutes on Twitter and Tweets or reTweets five times is a much more engaged user than the Twitter lurker who spends 30 minutes on the site but doesn't post anything.
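The comparison above can be made concrete by ranking the same two users on each metric (the user names and numbers are invented to match the example in the text):

```python
# Two hypothetical users illustrating the comparison in the text.
users = {
    "active_user": {"minutes": 5, "tweets": 5},
    "lurker": {"minutes": 30, "tweets": 0},
}

# Ranking by time on site crowns the lurker; ranking by output does not.
by_time = max(users, key=lambda u: users[u]["minutes"])
by_tweets = max(users, key=lambda u: users[u]["tweets"])

print(f"most 'engaged' by time on site: {by_time}")     # → lurker
print(f"most engaged by Tweets produced: {by_tweets}")  # → active_user
```

Same two users, opposite conclusions — which is the whole problem with treating minutes as engagement on a contribution-driven site.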
2. Isn't reporting time on site down to the second false precision? The Nielsen numbers: 17 minutes AND 21 seconds in May 2009, up from six minutes AND 19 seconds in May 2008. Wow, those 21 and 19 seconds really add something to our understanding of Twitter usage, don't they….