Directing Traffic

THE MAY/JUNE ISSUE of the Columbia Journalism Review has a sidebar that lists political magazine websites and their traffic numbers. For some reason this bit of text has been given a lot of attention as various sites bicker about the numbers and who’s got more traffic and blah blah blah. (Everyone seems to have ignored the point of the main article: that Jonah Goldberg is a really smart guy who runs a really great website over at NRO.)

Anyway, the sidebar reported that, according to Jupiter Media Metrix, we get 247,000 unique visitors a month here at weeklystandard.com. This isn’t quite right. The truth, such as we can divine it, is that since we re-launched last fall, we’ve never had a month with as few as 247,000 unique visitors, and currently we’re getting about 570,000 unique visitors who call up about 9.3 million page views a month. CJR made a good-faith effort to get this stuff right. (One of its reporters called me asking for our numbers and I didn’t want to give them out at the time; being somewhat new to the web, I thought it might be gauche. Little did I know.) The fact that CJR wound up confusing things only points to how gelatinous all of this talk about web statistics is. Let me try to explain.

Years ago there was no consensus on what measurement to use for web traffic. Most people relied on “hits.” But a hit is just a file being called out from a server. If you have a page with 25 image files on it, every time someone comes to that page you get 25 hits. I know one web lass who’s constantly bragging about how her site gets a million hits a month, but she has 49 files on her homepage, so the truth is that even at the theoretical best, her site only gets 20,408 readers a month (that is, if each user comes to her home page only once and never clicks on anything else).
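For readers who want to see that arithmetic spelled out, here is a minimal Python sketch of the same back-of-the-envelope conversion. The function name is mine, and the figures are simply the ones from the example above.

```python
def hits_to_readers(hits_per_month, files_per_page):
    """Best-case readers implied by a raw hit count: every page load
    pulls down all of the page's files, so divide hits by the file count."""
    return hits_per_month // files_per_page

# A million monthly hits spread across a 49-file homepage works out to
# roughly 20,408 page loads at the theoretical best.
print(hits_to_readers(1_000_000, 49))  # 20408
```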

Those aware of these distinctions don’t use hits anymore. They look at “visits,” “unique visitors,” and “page views.” A “visit” is any time a user comes to a site. If the same user comes to a site 10 times in a day, he logs 10 visits, even if he views 50 pages during each visit. A “unique visitor” is a measure that counts the different people who come to a site. But for a variety of technical reasons that I won’t bore you with, this isn’t an absolute measure: uniques can easily be over-counted, depending on whether visitors log on from multiple computers and on the protocols their ISPs use. Finally, a “page view” is the number of pages that have been requested from a server. The page view is the simplest measure, and therefore probably the most reliable, but it, too, is imperfect. All pages on a site count equally, from the “search” page to the actual content, so a big multi-page site (like Salon) will rack up proportionally more page views per visitor than a smaller, single-page site (like a blog).
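To make those three definitions concrete, here is a small, hypothetical Python sketch that counts visits, unique visitors, and page views from a toy list of visit records. The data are invented; real traffic analysis is messier, but the counting logic is the same.

```python
# Each tuple is one visit: (visitor id, pages requested during that visit).
# The figures are invented purely to illustrate the three definitions.
visits = [
    ("alice", 50), ("alice", 50),  # the same person visiting twice
    ("bob", 3),
    ("carol", 1),
]

total_visits = len(visits)                          # every visit counts
unique_visitors = len({who for who, _ in visits})   # distinct people, in theory
page_views = sum(pages for _, pages in visits)      # every page requested

print(total_visits, unique_visitors, page_views)    # 4 3 104
```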

But even though we’ve figured out the best units of measurement, the act of measuring is still suspect. The servers that dish out websites do two things: (1) They supply requested files to users. (2) They keep a log file that records all of the above stats (plus some others that I haven’t discussed). The person who runs the website then uses a program that interprets those logs to get the traffic numbers (like many others, we use WebTrends). But in times of extraordinarily high traffic, the server simply stops writing to the log so that it can devote all of its power to sending out requested files.
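A log analyzer like WebTrends is, at bottom, a program that walks through those server log lines and tallies them up. Below is a deliberately simplified Python sketch of that interpretation step, run over a couple of invented entries in the common Apache log format. It illustrates the idea only; it is not WebTrends’ actual method.

```python
import re

# Two hosts, three invented requests in the common log format:
# host, identity, user, timestamp, request, status, bytes.
LOG_LINES = [
    '10.0.0.1 - - [01/Jun/2002:09:15:00 -0400] "GET /index.html HTTP/1.0" 200 5120',
    '10.0.0.2 - - [01/Jun/2002:09:16:30 -0400] "GET /cover.gif HTTP/1.0" 200 20480',
    '10.0.0.2 - - [01/Jun/2002:09:16:31 -0400] "GET /article.html HTTP/1.0" 200 8192',
]

LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[.*?\] "GET (\S+)')

hits = 0
page_views = 0
hosts = set()
for line in LOG_LINES:
    match = LOG_PATTERN.match(line)
    if not match:
        continue
    host, path = match.groups()
    hits += 1                    # every requested file is a hit
    hosts.add(host)              # a rough, imperfect proxy for unique visitors
    if path.endswith(".html"):
        page_views += 1          # count pages, not images or other files

print(hits, page_views, len(hosts))  # 3 2 2
```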

Anyway, when CJR asked websites to provide numbers, the other sites in the sidebar presumably gave them stats from their server logs. But when we declined to respond, CJR went to Jupiter Media Metrix. JMM is a firm that monitors traffic on the web, not through actual counting, but through statistical sampling–like the Nielsens. They collect what they think is a representative sample of web users, hook them up to a black box and keep track of where they go. (I’m speaking metaphorically; for a real explanation, click here.)
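The black box is a caricature, but the arithmetic behind panel measurement is easy enough to sketch. The Python example below, with invented figures, shows the basic extrapolation: take the share of a recruited panel that visited a site and scale it up to an estimate of the whole online population. It is a rough illustration of the sampling idea, not Jupiter Media Metrix’s actual methodology.

```python
def panel_estimate(panel_size, panel_visitors, online_population):
    """Scale the share of panelists who visited a site up to an estimate
    for the whole online population: the core move in panel measurement."""
    return round(panel_visitors / panel_size * online_population)

# Invented figures: a 60,000-person panel, 120 of whom visited the site in a
# month, projected onto an assumed 130 million internet users.
print(panel_estimate(60_000, 120, 130_000_000))  # 260000
```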

Now, I’m not saying that JMM’s system is better or worse than the log file system. My point is that both systems are flawed, maybe even deeply flawed. But so what? All media monitoring systems are flawed–the Nielsens, the newspaper and magazine circulation bureaus, the whole lot of them. And that’s fine because the truth is, while it’s impossible to measure absolute numbers, you can, more or less, establish relative performance. Which is really all that matters.

The problem with the CJR sidebar is that when we didn’t provide our numbers, CJR filled the gap with figures from a different accounting method than the one it used for the other sites, and nobody stopped to distinguish between the two. The result was a distorted picture not just of the absolute numbers, but of the relative ranking (which is the only thing these numbers are any good for anyway).

So, for what it’s worth, WebTrends says we have about 570,000 different people who come to us and ask for about 9.3 million page views per month. That’s where we are now, and we’ve been trending up since our launch 8 months ago.

Jonathan V. Last is online editor of The Weekly Standard.
