Extremism in the defense of the algorithm is no vice?


WordPress surfing led me to another interesting sob story from a penalized webmaster, and my reply got so long it deserved to become a post:

Marshall Sponder wrote:

Take Know More Media’s case – you have 100+ blogs and 2+ years of content – that’s easy, 30,000 to 50,000 blog posts – and Google, with just one or two paid links that pass PageRank, is going to throw the entire blog network out of its index over that?

Yep, it appears that’s it – that’s the reason.  But is it fair?  No.
Strictly from a user’s point of view, I think it is very hard to justify technical penalties on good content.  Few users know or care what “hidden text” is, so if a mom-and-pop webmaster uses this tactic and Google deletes the otherwise informative, relevant website, it is hard to argue that users are served well.  Even if a black-hat SEO created a site filled with illegal tricks but also full of highly relevant, quality content, I think Google’s case against including that site is weak.  As a user I want *quality content*, and I don’t care about the site’s technical construction.  Where Google is simply banning sites for using spammy tactics, I’d agree with Marshall that to be faithful to user-centrism they really have to take it a step further and look at the content they are excluding.  Even if the content involves paid linking and other violations, if it is unique, quality content, Google cannot exclude it without violating their stated “prime directive” of providing the best for the users.

However, Google has to manage about one trillion URLs, so obviously they need shortcuts in ranking, and one of them is a page from AZ Senator Barry Goldwater’s playbook when – many years ago – he tried to justify an escalation of the Vietnam war, perhaps to the nuclear level.  Google’s spin on the famous Goldwater phrase would be: “Extremism in the defense of the algorithm is no vice.”
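For readers unfamiliar with what’s being “defended” here: the link-based scoring at issue can be sketched as a toy PageRank power iteration. This is a hypothetical four-page graph with the textbook 0.85 damping factor, not Google’s actual implementation; the point it illustrates is that a paid link is just another edge, which the raw algorithm cannot distinguish from an editorial vote:

```python
# Toy PageRank via power iteration over a tiny, made-up link graph.
# Illustrative only -- the names and graph are hypothetical.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a base share from random teleportation.
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Hypothetical mini-web in which "seller" sells a link to "buyer".
graph = {
    "hub": ["a", "seller"],
    "a": ["hub"],
    "seller": ["buyer"],   # the paid link
    "buyer": [],
}
ranks = pagerank(graph)
```

Drop the one paid edge (`"seller": []`) and buyer’s score falls, which is exactly why a couple of purchased links can matter to the algorithm out of all proportion to a network’s 30,000 posts of content.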

I don’t think penalties are generally *fair* or *user friendly*, but I’m willing to concede they may be necessary for Google to function as profitably as it does, since it would take a lot of human intervention to help every mom and pop determine what’s wrong with their site.

However, I feel Google continues to fail in its obligation to communicate more effectively with penalized sites, although I think they are s-l-o-w-l-y catching on to the fact that most webmasters of penalized sites remain unclear as to why the site has been penalized or downranked.  Removal offers you a shot at “reinclusion” and (very rarely) possible Webmaster Tools staff feedback.  Downranking is algorithmic, and Google will not generally offer any advice to help downranked sites.  In that case you generally want to re-read the webmaster guidelines and experiment with different approaches in an effort to boost rankings.

My view is that as many thin-content database sites have flooded online, Google is holding new material to a higher standard of quality, especially at a new website.  This helps explain why you can find well-ranked established pages that are inferior to pages at a new website.

There is a solution to all of this, in my opinion, which is for Google to include a lot more community input and feedback in the process than they currently appear to do.  I’d guess the recent discussions to acquire DIGG may have been in part to gain more community-feedback tools and data.  Historically Google has been brilliant at using algorithms to determine ranking and advertising, but has fallen short of brilliance in its ruthlessness in dealing with website practices it doesn’t like, leaving a lot of collateral damage – especially for sites involved in “paid linking” and variations on that complex theme.

At SES San Jose 2009 I hope to get to ask Matt Cutts more about this in person.  Matt is Google’s top spam cop and always very open to conversations about ranking and search.  In fact, the best event of the conference is the Google Party, where engineers are on hand to discuss search-related issues – including complex ranking technicalities that are sometimes brought to Google’s attention as part of the search-conference circuit.

Another shot in the Blog Revolution? Few links if by land and none if by sea.


Louis Gray is rightfully pissed off at the way Mashable, a major tech blog, handled some stories he wrote.  Basically they under-attributed Gray’s reporting of Robert Scoble’s PodTech departure.  I’m not familiar enough with Mashable to know whether Gray is reasonable to suggest they’ve built the whole site on this type of secondary reporting, but I certainly agree that blogs are now doing what mainstream media has done for decades – sacrificing good-quality reporting in the interest of monetization.  Also, I think the great and thoughtful voices of several big blogs have been largely replaced by marginal writers and writing as those sites struggle to become “media companies.”

Another defect of the new web is that linking practices and linking strategy have become critical to success – A-list sites simply don’t link out appropriately because they (correctly) view their links as valuable and (incorrectly) choose not to give that value away.

Matt’s got a good post on this story, noting that attribution is a cornerstone of good journalism and that Mashable and others should do a better job of it, though I’m not clear whether Matt would agree that insufficient linking is more a matter of opportunistic linking strategy than journalistic oversight:

I wrote over there: 

…. but monetization is trumping journalism all over the place and I think the blog community should think about this a lot more than we do.

I don’t know about Mashable’s practices, but often it is marginally paid and marginally talented writers who feed the big blogs that originally had really thoughtful voices.

Also, natural linking has effectively become a “web currency,” and many “A-list” sites are very reluctant to link to sites outside their frames of reference – I believe they see it as too big a favor, where even five years back it would have been done without a second thought.

I see this as a growing problem with many large, heavily monetized tech blogs. They are (slowly) trading journalism and web concerns for profit concerns. An inevitable thing, but a bad one.

Google’s Constitutional Amendment: The Right to Rank as you see fit


Some of the most lively debate and controversy at search conferences surrounds the issue of Google ranking rights.  At Search Engine Strategies in San Jose, the most interesting (and confrontational) session involved Michael Gray taking Matt Cutts to task over Google’s aggressive stand on commercially driven linking.

The stakes of the “right to rank” question may become even higher in the context of a recent Microsoft v. Google case, where MS suggests in its court brief against the Google–DoubleClick merger that the merger will create something like monopoly conditions in the online advertising space because (according to Microsoft’s sources) Google plus DoubleClick serve more than half the world’s online advertising.

Although I don’t think MS is attacking Google’s ranking methods directly here, it’ll be interesting to see whether Google claims that, since its algorithm does not rank the free “organic” listings on a commercial basis, the suit has less merit than it would if they *did* favor sites in the organic listings.

This would, of course, sidestep the key point that Google’s ranking power is now so great that it can make or break companies – offline as well as online – depending on how they rank in the organic “free” listings.  This confers on Google an obligation that, IMHO, they still do not take seriously enough: the obligation to minimize the collateral damage and maximize correct rankings using, if necessary, more human intervention.  In short, I’m saying that until the results are *so good* that only highly subjective opinions are in play, Google needs to do *more* than is currently done, based on the principle that “with great wealth comes great responsibility.”

Ironically, I think Google’s success has to a large extent insulated them from the growing criticism in the webmaster community.  Some of that criticism is self-serving – e.g., from spammers unhappy that their tactics now fail – but much of it comes from users and newly minted webmasters or mom and pops who are frustrated because they can’t seem to get ranked properly for even the most obvious queries.  Google blames the spammers for this, but it’s a dynamic process, and more transparency from Google – perhaps with stronger forms of site and webmaster ID for “official” or clearly white-hat sites – could go a long way toward solving the transparency problems.

Over at Matt Cutts’ blog he makes this point about a recent ASK court decision in favor of a search engine’s right to rank as it sees fit.  This point lies at the heart of the right-to-rank debate:

 Again, it makes sense that search engines get to decide how to rank/remove content in their own index…

I replied over there:

Matt … hmmm … wouldn’t you agree that this has some clear limits?  What would you call crossing the line on this freedom to rank however you see fit?

If Google pulled what Yahoo did some time ago and essentially forced sites to pay for inclusion or be excluded, would that fall within the sensical realm?

MSN is claiming (somewhat ironically and hypocritically, but correctly) that Google’s ad power is coming close enough to a monopoly that remedies are in order.  Historically there has been trouble when a single company or country controlled more than half of a resource – why no problem here?

—– end reply —–