Blogging the conference has been a great way to test some ideas about blog ranking and to watch Google struggle to bring the most relevant content into the main search. (They've done pretty well with blog search, not so well with regular search, which will have all this blog content listed in a week or so, basically too late to be all that helpful to users.) More importantly, the stuff from *this year* will probably still rank above the SES San Jose 2009 information, which is what people using that search term will likely be looking for come next year. I'd think they could simply increase the value of "freshness" for listings tagged as event-related.
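To make the freshness idea concrete, here's a toy sketch of what I mean. Everything here is invented for illustration (the weights, the half-life, the scoring formula); it's obviously not how Google actually ranks, just the shape of the tweak: give event-tagged listings a much heavier freshness weight so this year's conference page outranks last year's.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Listing:
    title: str
    relevance: float   # 0..1, topical match to the query
    published: date
    is_event: bool     # tagged as event-related content

def freshness(published: date, today: date) -> float:
    """Decay from 1.0 toward 0.0 as content ages (half-life ~180 days)."""
    age_days = (today - published).days
    return 0.5 ** (age_days / 180)

def score(listing: Listing, today: date) -> float:
    # Event-tagged listings get a much heavier freshness weight,
    # so a highly relevant but year-old page loses to a fresh one.
    freshness_weight = 0.7 if listing.is_event else 0.2
    return ((1 - freshness_weight) * listing.relevance
            + freshness_weight * freshness(listing.published, today))

today = date(2008, 8, 20)
old = Listing("SES San Jose 2007 recap", 0.9, date(2007, 8, 25), True)
new = Listing("SES San Jose 2008 coverage", 0.8, date(2008, 8, 18), True)
ranked = sorted([old, new], key=lambda l: score(l, today), reverse=True)
```

With the heavy event weight, the 2008 page scores around 0.94 against roughly 0.44 for the 2007 recap, even though the old page is nominally more "relevant."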
I had a nice discussion about this "events" ranking challenge with Jonathan from Google at the party. The problem is that, to combat spam, Google does not push blog content out immediately. That means if you had searched for "SES San Jose" a month or so ago, you would likely have gotten old, dated content rather than the current SES page you'd normally want to find. This appears related to linking (newer content has fewer links), but I also think the regular engine is allergic to new content. That's why you'll often find the most relevant Google results in blog search when a topic is covered heavily by blogs, as with SES or CES Las Vegas, where I noted the same pattern of stale content in the main search alongside great content in the blog search.
I remain convinced that some of these ranking challenges could be solved by a combination of more algorithmic transparency from Google and greater accountability from publishers, who would agree to provide a lot more information about their companies so that Google could get a good handle on the credibility of the online landscape. This kind of webmaster ID is happening now in several ways, but I'd think it could be scaled up to include pretty much everybody publishing anything online (i.e., if you don't register, you'll be subject to higher scrutiny).
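The "higher scrutiny" part could be as simple as a trust discount. A minimal sketch of my own, with made-up numbers, just to show the mechanism: registered, verified publishers keep their full score, while unregistered ones are discounted until they establish credibility.

```python
def adjusted_score(base_score: float, registered: bool) -> float:
    """Apply a hypothetical trust factor to a ranking score.

    The 0.6 discount for unregistered publishers is an invented
    illustration of "higher scrutiny," not any real Google behavior.
    """
    trust = 1.0 if registered else 0.6
    return base_score * trust
```

So two otherwise identical pages would rank differently purely on whether the publisher had identified itself, which is the accountability trade-off I'm suggesting.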