Blogging the conference has been a great way to test some ideas about blog ranking and to watch Google struggle to bring the most relevant content into the main search (they’ve done pretty well with blog search, not so well with regular search, which will have all the blog content listed in a week or so, basically too late to be all that helpful to users). More importantly, the content from *this year* will probably still be ranked above the SES San Jose 2009 information that people searching that term next year will actually want. I’d think they could simply increase the weight of ‘freshness’ for listings tagged as event-related.
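To make that “freshness for event listings” idea concrete, here is a minimal sketch of how a time-decayed boost for event-tagged results could work. This is purely my own illustration, not anything Google has published; the `is_event` flag, the half-life, and the boost weight are all hypothetical.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch: give event-tagged pages a freshness boost that decays over time.
# Nothing here reflects Google's actual algorithm; the weights are invented for illustration.

def freshness(published: datetime, half_life_days: float = 30.0) -> float:
    """Return a value near 1.0 for brand-new pages, decaying toward 0.0 with age."""
    age_days = max((datetime.now(timezone.utc) - published).total_seconds() / 86400, 0.0)
    return 0.5 ** (age_days / half_life_days)

def ranking_score(base_relevance: float, published: datetime, is_event: bool) -> float:
    """Blend normal relevance with a freshness boost, but only for event-related pages."""
    if not is_event:
        return base_relevance
    event_boost = 0.5  # how much freshness is allowed to matter for event listings
    return base_relevance * (1.0 + event_boost * freshness(published))

# A page about this year's conference vs. a better-linked page about last year's event:
this_year = datetime.now(timezone.utc) - timedelta(days=3)
last_year = datetime.now(timezone.utc) - timedelta(days=400)
print(ranking_score(1.0, this_year, is_event=True))   # gets nearly the full boost
print(ranking_score(1.2, last_year, is_event=True))   # higher base score, almost no boost
```

The point of the decay curve is simply that a year-old event page loses its advantage automatically, so the current year’s page can surface without anyone having to intervene.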
I had a nice discussion about this “events” ranking challenge with Jonathan from Google at the party. The problem is that, to combat spam, Google does not push blog content into the main index immediately, so a search for “SES San Jose”, especially a month or so ago, was likely to return old, dated content rather than the current SES page you’d normally want to find. Part of this is a linking issue (newer pages have fewer inbound links), but I also think the regular engine is allergic to new content. That’s why the most relevant Google results often sit in blog search when a topic is covered heavily by blogs, such as SES or CES Las Vegas, where I noted the same pattern of “stale content” in the main search alongside “great content” in the blog search.
I remain convinced that some of the challenges in ranking could be solved by a combination of more algorithmic transparency from Google and greater accountability from publishers, who would agree to provide a lot more information about their companies so that Google could get a good handle on the credibility of the online landscape. This kind of webmaster ID is happening now in several ways, but I’d think it could be scaled up to include pretty much everybody publishing anything online (i.e., if you don’t register, you’re subjected to higher scrutiny).
What you’re suggesting is the exact opposite of Google’s policy, and is not necessarily a good idea: the more information we have about exactly how Google ranks pages, the easier it becomes for unscrupulous webmasters to “game” Google like they did before Google came along.
You mean transparency or webmaster ID? Together I think they are a great combination to allow everybody to do a better job with content.
Thanks to the *lack* of transparency, a huge amount of energy is wasted trying to game Google because nobody knows the ranking factors. If you defined them cleverly, in ways that would increase the quality of all content, then it *would not matter* whether people tried to game them or conform to them.
Yes, Google, in its Webmaster Guidelines, is pushing content, content, content. There are also many reports that Google makes available to webmasters that are helpful. If you look at the information available through Google, there are many hints about what you should be doing.
But they are only hints, because if Google were to open up completely, the relevancy of its search results would be seriously compromised. Of course, if you’ve looked at some Google results lately, you often can’t see the results for the ads.
Maybe the best search strategy is to find a search engine with the best ads.
Like your blog a lot.
That’s true, Google doesn’t expose the real rules for getting onto page 1 of the search results for your specific keywords. However, experts in the market have done a lot to reveal secrets that at least come closer to Google’s policies and search placement algorithms, and several informative resources explain the SEO process in detail along with ways to compete in organic search results. For example: http://www.bnbuzz.com, SEOBook.com and many more… one should consult them in addition to the Google guidelines.