I love how Google always preaches their “build for humans, not for bots (search engines)” philosophy, yet they rely on bots and computerized systems themselves to rank search results. Am I the only one who thinks that’s a strange paradox?
If you want good rankings, you’re forced to write keyword-specific title tags and headings, worry about having those and similar keywords in your copy, and so on, just to get ranked for those specific search terms. Now, any SEO will tell you that if you have enough backlinks pointing to your page with the proper anchor text, you can rank for something you didn’t explicitly write about in your copy. The problem is that you need a fair number of those links; otherwise, other pages will win those spots just by being more literal. But since Google is working on fixing “Googlebombs,” I’m assuming some changes might soon be made to lessen the effect of anchor text (although that might only affect high-volume Googlebomb scenarios).
Anyway, back to the humans-bots-humans paradox. Only humans know what’s useful for humans (and even then there’s a lot of subjectivity and many variables at play). Bots and algorithms attempt to systemize rules and determine what’s relevant, good, or useful. Spammers, smart businessmen, et al. experiment, figure out these rules, then exploit these ranking systems. Then the bots have to readjust – it’s an arms race.
I was hoping that by this time (2007), Google would be much smarter at mapping associations and be able to figure out that something truly good (from a human perspective) can rank for associated terms. However, looking at my site stats, I see that Google is still being quite literal.
You still have to write for bots as well as humans if you want to be ranked for specific keywords. Obviously, humans should always come first, but at times you’ll find yourself sacrificing “being clever” for “being ranked well by a bot” if you care at all about ranking. And in today’s attention economy, ranking means exposure, so you have to play by these rules.
I understand Google’s (seemingly) utopian ideals and their “we’re working toward the goal of really being what we aspire to be,” but as long as a bot and a complicated set of logic rules stand between me and the searcher, that bot will always have a significant role, and it won’t be as easily dismissed as Google hopes.