In the recent search engine ranking factors report from SEOmoz, the number ten (out of 24) ranking factor was “Keyword Use in the Page Name URL” (i.e. madfishdigital.com/folder/keyword.html).

It appears that it is becoming more important to make URLs descriptive, so that they accurately reflect a web page’s content. After the SEOmoz report was released, I racked my brain as to why something so exploitable could be such a big factor. Spammers could easily create poor-quality websites with keyword-rich URLs, much as they did prior to 2004-2005. In 2005 Google appeared to crack down on this type of website spamming, and those websites still mostly seem to be filtered lower in the results.

So why would keyword rich URLs be so important to a search ranking?

We know that Google has always watched the click-through rates of the websites in the organic rankings. When a human has to choose between two search results, one with a URL of mydomain.com/index.php?q=6483&product_id=22&order=desc&page=1 and one with a URL of mydomain.com/deck-clips.htm, the human will pick the URL that is easiest to understand. My conclusion, therefore, is that keywords are important in a URL not because a search engine weighs so heavily how well the keywords describe the URL’s content, but because URLs that are easier for humans to understand earn higher click-through rates.

The report goes on to analyze which type of URL page name is best. For example, is there more influence from a static URL with the keyword in the filename (i.e. deck-clips.html) than from a dynamic page with the keyword in the query string (i.e. index.php?id=383794&page_name=deck-clips)?

The report says yes: the static URL is the most influential. This means that if you are stuck with a content management system that generates dynamic URLs, your rankings can be improved by implementing static URLs with an .htaccess rewrite.
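As a minimal sketch of what such a rewrite might look like (assuming Apache with mod_rewrite enabled; the parameter name page_name mirrors the hypothetical dynamic URL above and will differ for your CMS):

```apache
# .htaccess sketch: serve a clean, keyword-rich URL like /deck-clips.html
# from the underlying dynamic script, without changing the visible URL.
RewriteEngine On
# Map any /keyword-name.html request to index.php?page_name=keyword-name.
# [L] stops further rewriting; [QSA] preserves any extra query parameters.
RewriteRule ^([a-z0-9-]+)\.html$ index.php?page_name=$1 [L,QSA]
```

With a rule like this, a visitor (or crawler) requesting mydomain.com/deck-clips.html is internally handled by index.php?page_name=deck-clips, so the search results show only the human-readable static URL.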

This drives home the point now more than ever: it’s important to consider the user when optimizing a website. Search engine crawlers are working harder to behave like humans, but they still make mistakes.

Think about it: the more robot-like a search engine crawler is, the more susceptible it is to being exploited by spammers.

The more human the crawlers become, the easier it is for Google, Yahoo!, and Bing to display the cleanest, and most relevant search results for their users.

~Ben Herman 2009