What’s So Unique About A Unique?



Darren Rowse had an interesting write-up today about a service called Izea Rank.  This is yet another “blog only” ranking system designed to help bloggers answer that age-old question of “where am I?”

Just for the record, I won’t be participating for a number of reasons, but Darren’s article and some of the comments already made to it … worth a read, by the way … prompted me to get busy and finish up this post, which has been languishing in my “Drafts” folder for some time.

The Unique Question

This will certainly not be a “how to” post, and it also will certainly not answer any cosmic questions … but it asks one, and I think the on-line world in general and the blogging community in particular have been ignoring that question for far too long.

We’re Kidding Ourselves

Frankly, I feel it’s time we stopped ignoring the 500-pound gorilla in the room that everyone just tiptoes around.

The gorilla’s name?  TRAFFIC … everything business-oriented online rises or falls in relation to the traffic it generates.  And yet we have no decent way at all to measure traffic, compare traffic, or even discuss it in scientific terms.  Right now the Internet is like a huge international highway system, with roads that range from barely noticeable footpaths to mega-freeways, and yet we don’t have a true, defensible standard for counting the cars that pass through the network.

How Many People Visit?  A Lot … Or Maybe Not — Who Knows?

In particular, we have an absolutely abysmal and even corrupt system for attempting to count one subset of traffic: UNIQUE VISITORS.

If you are reading these words on my website, www.retiredpay.com, then you are indeed a visitor, and I thank you for dropping by.  And certainly, if you are a human visitor, you are unique … aren’t we all?

But the questions for discussion are:

  • How on earth can I know that you are unique?
  • How can I keep statistics that accurately approximate the number of unique visitors I receive?  This influences nearly everything involving this site’s ‘world’ …
      • What should I write about?
      • Should I change topics?
      • What form of advertising should I use?
      • What number of unique visitors may I honestly report to a prospective buyer?
  • … the list of unanswered questions just goes on and on.

Yet huge online deals and decisions with significant financial outcomes for millions of people are made every day based on numbers that are not only demonstrably false, but may even be maliciously in error.

A few typical, yet no less egregious examples:

  • Do you surf from a business, a school, a government agency or even an ISP that uses a gateway router IP address?

A great many of us do … it’s one of the most commonly recommended forms of network security and without it we couldn’t even have the Internet we have today … we would have long ago run out of individual IP addresses.

But this means that when you visit this page and another person behind the same router visits as well, the stats-counting software on my server has no reliable way to determine that the two of you are distinct visitors, and not just two requests from the same user.
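To illustrate, here is a minimal sketch of IP-based “unique” counting against some hypothetical log entries (the IP addresses and paths are made up for the example).  Two real people behind the same gateway collapse into one “unique”:

```python
# Hypothetical access-log entries: two people share an office
# router's public IP, a third surfs from home with her own IP.
log_entries = [
    {"ip": "203.0.113.7", "path": "/"},       # person A, behind the office router
    {"ip": "203.0.113.7", "path": "/about"},  # person B, same gateway IP
    {"ip": "198.51.100.4", "path": "/"},      # home user with an individual IP
]

# Naive "unique visitor" count: one unique per distinct IP address.
unique_ips = {entry["ip"] for entry in log_entries}
print(len(unique_ips))  # prints 2 -- but three real people visited
```

The server simply cannot see past the shared address; that information is lost before the request ever reaches the log.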

  • What if you have a specific, unique IP assigned?  The problem then translates into one of timing.  If you surf here today and you also visited last week, you’re clearly the same person … but what tells my stats program that?

Typically, statistics programs arbitrarily count each hit not seen in the previous 24 hours as a “unique” … but if someone comes back in 23 hours and 59 minutes, did they suddenly lose their uniqueness?
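The arbitrariness is easy to see in code.  This is a sketch of that 24-hour rule, not any particular package’s implementation: two visits a minute inside the window count as one unique, while two visits a minute outside it count as two.

```python
# Sketch of the arbitrary 24-hour "uniqueness" window: a visit from
# a given IP counts as a new unique only if the previous visit from
# that IP was more than 24 hours earlier.
WINDOW_SECONDS = 24 * 60 * 60

def count_uniques(visit_times):
    """Count 'uniques' for one visitor, given visit timestamps in seconds."""
    uniques = 0
    last_seen = None
    for t in sorted(visit_times):
        if last_seen is None or t - last_seen > WINDOW_SECONDS:
            uniques += 1
        last_seen = t
    return uniques

HOUR = 3600
print(count_uniques([0, 23 * HOUR + 59 * 60]))  # 1 -- same person, apparently
print(count_uniques([0, 24 * HOUR + 60]))       # 2 -- two "uniques", magically
```

Same person, same browser, same IP … and the count doubles because of one minute on a clock.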

  • And finally, I mentioned at the beginning the caveat “if you are human.”

A huge percentage of the page requests to every web server are from non-human sources.

Typically, such robotic sources will use a name to indicate they are not people … but there is no enforceable Internet standard for this.  If I build a piece of software to repeatedly visit a site while posing as a human, inflating both raw traffic and uniques, and choose to hide the fact that the ‘visitor’ is a robot, what happens?

Not a thing, except for bad data.
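A sketch makes the honor system plain.  Bot filtering typically amounts to looking for telltale substrings in the User-Agent string (the marker list below is illustrative, not any real package’s list), and nothing stops a script from sending a browser string instead:

```python
# Honor-system bot detection: well-behaved crawlers announce
# themselves in the User-Agent header; liars simply don't.
KNOWN_BOT_MARKERS = ("bot", "crawler", "spider")  # honest bots only

def looks_like_bot(user_agent):
    ua = user_agent.lower()
    return any(marker in ua for marker in KNOWN_BOT_MARKERS)

print(looks_like_bot("Googlebot/2.1"))                  # True  -- an honest bot
print(looks_like_bot("Mozilla/5.0 (Windows NT 10.0)"))  # False -- but a lying
                                                        # script sends this too
```

The filter only catches robots that volunteer the truth; a malicious one sails straight into the “human” column.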

So that’s my observation for the day … hopefully you’ll do some further thinking and discussing.  I see no need to run yet another rank-counting program when our entire methodology of counting is built on shifting sand.

Foundations, please?
