Google Ranking Needs a Spanking


Over at the Google blog today, Amit Singhal has published the first of two posts promising an introduction to Google ranking.  As usual I’m disappointed in the way Google maintains what is, to me, a pretense of transparency while using some very ruthless and mysterious tactics to downrank sites it claims don’t meet quality guidelines.  Google (correctly) sees itself as warring with spammers for control of the web but (incorrectly) thinks transparency is the wrong approach in this fight.

There were some rumblings last year about contacting webmasters directly about site problems, but my understanding is that this would cover only a tiny fraction of the total sites under penalty.  Of course, with so little transparency in this area, we can’t know the real numbers.

I hope Amit’s second post is a LOT more specific, because I think he’s already practicing the kind of oblique speak that has become commonplace when people from Google talk about ranking:

Amit:
No discussion of Google’s ranking would be complete without asking the common – but misguided! 🙂 – question: “Does Google manually edit its results?” Let me just answer that with our third philosophy: no manual intervention.

That statement is false, and he should not make it.  He does try to clarify later in the post:

I should add, however, that there are clear written policies for websites recommended by Google, and we do take action on sites that are in violation of our policies or for a small number of other reasons (e.g. legal requirements, child porn, viruses/malware, etc).

Action?  Yes, of course he means the *manual intervention* he said above does not happen.  Google has a right to pull sites out of the rankings, though it is annoying how much they talk about NOT manually intervening when they do it.  Because there is no transparency, nobody outside Google knows how often they manually intervene.  Amit makes it sound like it’s only for horrors like child porn or malware, but note that the use of inappropriate “SEO” tactics such as “hidden text” can get you removed and even banned from the Google index.  Unfortunately for small sites – e.g. “Aunt Sally’s House of Knitting website” – Aunt Sally may have no idea her webmaster is using these tactics.  How often does this happen?  My guess is that hundreds of thousands of legitimate sites are improperly ranked due to technical penalties, but with no transparency (and probably no measurement of this at Google) nobody knows.
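To make “hidden text” concrete: the classic trick is stuffing keywords into markup that visitors never see but crawlers do.  As a toy illustration – this is my own naive heuristic and a hypothetical function name, emphatically NOT Google’s actual detection method – a crude check for the most obvious signals might look like:

```python
import re

def naive_hidden_text_flags(html):
    """Flag crude 'hidden text' signals of the sort Google's quality
    guidelines warn against.  A toy heuristic for illustration only."""
    flags = []
    # Inline styles that hide text from visitors but not from crawlers.
    if re.search(r'style\s*=\s*["\'][^"\']*display\s*:\s*none', html, re.I):
        flags.append("display:none inline style")
    if re.search(r'style\s*=\s*["\'][^"\']*font-size\s*:\s*0', html, re.I):
        flags.append("zero font size")
    # Text colored the same as a white background (white-on-white keywords).
    if re.search(r'color\s*:\s*#?fff(fff)?\b', html, re.I) and \
       re.search(r'background(-color)?\s*:\s*#?fff(fff)?\b', html, re.I):
        flags.append("white-on-white text")
    return flags

page = '<div style="display:none">knitting yarn wool knitting</div>'
print(naive_hidden_text_flags(page))  # ['display:none inline style']
```

The real checks are surely far more sophisticated (rendered-page analysis, not regex matching), which is exactly the point: if even this much is easy to automate, a webmaster sneaking it past a site owner is easy too – and the owner pays the penalty.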

The big Google problem is that the policies for algorithmic downranking are not *clear enough*.  Many SEO companies prey on this lack of transparency, ironically often using Google’s mystique to lure unsuspecting businesses into expensive “optimization” schemes that don’t work or can get them seriously penalized.

Part of Google’s search algorithm philosophy is that they don’t share details because spammers would exploit them before honest people could.  Although a weak case can be made for this idea, a better one is that in non-transparent systems dishonest folks do *better*, because they invest more energy in finding the loopholes.  Take inbound linking, a very hot SEO topic last year at SES San Jose: it has complex rules nobody outside Google understands.  Linking between sites in an information network can be advantageous or it can be penalized, depending on whether Google (rather than the community or the webmaster) sees the practice as manipulative of the algorithm or as user-friendly and thus allowable.

Amit – a clear policy is one where the webmaster will know, rather than guess, what they are doing to annoy the Google algorithm or the manual intervention folks.

There is one pretty good source of information about how to approach site architecture for optimal ranking: Matt Cutts’ SEO-related posts here.

Although Matt won’t give out much about the algorithmic penalties that create so much of the Google confusion and frustration for established websites, if you follow Google’s guidelines and Matt’s posts on SEO you are unlikely to have serious problems with ranking.  Of course, unless you work to optimize a new website you will still have the *standard problems* with ranking, since your competition is probably doing basic SEO on their sites.  I’d argue (along with many SEO folks) that the best way to enter the game this late and still hope for good rankings is with a topical *blog* to support your website.  Start with several posts about your general area of business, using a lot of the terminology people would use to find your website, and add posts regularly.

I’ll be covering the SES San Jose search conference and expect to hear a lot more debate about transparency, blogging, and SEO.