SES San Jose Countdown


In terms of internet search, the really big and influential conference is – without a doubt – SES San Jose.

Although WebmasterWorld PubCon and the new SMX conference series by Danny Sullivan (who more than anybody was the architect of the SES empire) offer similar content, SES remains the key conference venue for search marketing professionals.

I’ll be live blogging the conference and I’ll even try to get a few real-time pix out from Tuesday night’s Google Party, hosted at the GooglePlex.  In many ways the “Google Dance” is the highlight of the search year, when Google hosts conference attendees (including folks who just sign up for free exhibit passes) as well as tons of Google employees.  The food is great and it’s hard to beat free beer, ice cream, sno-cones, and candy, but the real highlight is chatting with Google search engineers who – with a few exceptions like the amazing Matt Cutts – don’t seem to get out much.

Knol Knows Ranking


Google’s new knol project features articles on any topic by anybody who cares to call themselves an expert.  The concept is really intriguing as it relates to bringing higher-authority information online, vetted both by the author and by the community at large.

Search marketing guru Aaron Wall is one of the sharpest fellows in the SEO business, and he’s been testing knol and has concerns about the results, both in terms of knol copies outranking original material and in terms of copyright.

Aaron on knol

The SMX search folks – I think it was Danny Sullivan – have also been testing knol and are suggesting knol material is ranking very well.  Google says they are not giving their own site preferential treatment, but I’m guessing what they mean is that they are applying the same rules to knol as to other sites.  If, for example, those rules that are *applied equally to all sites* happen to place a high value on incoming links from Google sites or other knol pages, then knol content effectively has a big advantage, technically without any special treatment.

In terms of search strategy I think the rule here is to … write some knol pages for topics of interest to you or topics for which you want to improve your rank.  I expect knol to be a huge topic at the upcoming search conference – SES San Jose.

Later: Time to check out this knol thing by creating some content myself.  Here are listings I created for:

Beijing, China | Blogs and Blogging

Extremism in the defense of the algorithm is no vice?


WordPress surfing led me to another interesting sob story from a penalized webmaster, and my reply got so long it deserved to become a post:

Marshall Sponder wrote:

Take Know More Media’s case – you have 100+ blogs and 2+ years of content – that’s easily 30,000 to 50,000 blog posts – and Google, with just one or two paid links that pass PageRank, is going to throw the entire blog network out of its index over that?

Yep, it appears that’s it – that’s the reason.  But is it fair?  No.
Strictly from a user’s point of view I think it is very hard to justify technical penalties on good content.  Few users know or care what “hidden text” is, so if a mom and pop webmaster uses this tactic and Google deletes the otherwise informative, relevant website, it is hard to argue that users are served well.  Even if a black hat SEO created a site filled with illegal tricks but also full of highly relevant quality content, I think Google’s case against including that site is weak.  As a user I want *quality content* and I don’t care about the site’s technical construction.

Where Google is simply banning sites for using spammy tactics, I’d agree with Marshall that to be faithful to user-centrism they really have to take it a step further and look at the content they are excluding.  Even if the content contains paid linking and other violations, if it is unique, quality content Google cannot exclude it without violating their stated “prime directive” of providing the best for the users.

However, Google has to manage about one trillion URLs, so obviously they need shortcuts in ranking.  One of those shortcuts is a page from AZ Senator Barry Goldwater’s playbook from when – many years ago – he tried to justify an escalation of the Vietnam war, perhaps to nuclear level.  Google’s spin on the famous Goldwater phrase would be: “Extremism in the defense of the algorithm is no vice”.

I don’t think penalties are generally *fair* or *user friendly*, but I’m willing to concede they may be necessary for Google to function as profitably as they do, since it would take a lot of human intervention to help every mom and pop determine what’s wrong with their sites.

However, I feel Google continues to fail in their obligation to communicate more effectively with penalized sites, although I think they are s-l-o-w-l-y catching on to the fact that most webmasters of penalized sites remain unclear as to why the site has been penalized or downranked.  Removal offers you a shot at “reinclusion” and (very rarely) some feedback from the webmaster tools staff.  Downranking is algorithmic, and Google will not generally offer any advice to help downranked sites.  In this case you generally want to re-read the webmaster guidelines and experiment with different approaches in an effort to boost rankings.

My view is that as many thin-content database sites have flowed online, Google is holding online material to a higher standard of quality, especially if it’s at a new website.  This helps explain why you can find well-ranked pages at established sites that are inferior to pages at a new website.

There is a solution to all of this in my opinion, which is for Google to include a lot more community input and feedback in the process than they currently appear to do.  I’d guess the recent discussions to acquire Digg may have been in part to gain more community feedback tools and data.  Historically Google has been brilliant at using algorithms to determine ranking and advertising, but their ruthlessness in dealing with website practices they don’t like falls well short of brilliance, leaving a lot of collateral damage – especially for sites involved in “paid linking” and variations on that complex theme.

At SES San Jose 2009 I hope to get to ask Matt Cutts more about this in person.  Matt is Google’s top spam cop and is always very open to conversations about ranking and search.  In fact the best event of the conference is the Google Party, where engineers are on hand to discuss search-related issues – including complex ranking technicalities that are sometimes brought to Google’s attention as part of the search conference circuit.

Microsoft BrowseRank to compete with Google PageRank


CNET profiles a new paper showcasing a Microsoft effort to enhance search by looking at *user behavior* as well as the old standby signals that all the major search engines use, such as links into the page, the content of the page, page titles, and several others.

Google’s initial brilliance was recognizing that the link relationships on the web gave you great insight into the best websites. Google correctly noted that sites with many links to them, especially for a particular keyword, were more likely to match a user’s interest for that keyword. Although many factors have been included in Google ranking for years, PageRank was arguably the most important breakthrough. Initially the system tried to be an online analogy to academic citation: Google’s Larry Page reasoned that websites with more incoming links would tend to be better, and that those incoming links themselves should be weighted according to the importance of the site from which they came.
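That citation idea is simple enough to sketch in a few lines of code. Here’s a toy version of the classic PageRank power iteration – my own illustration with a made-up four-page web, not Google’s actual implementation:

```python
# Toy PageRank: a page's score is a weighted sum of the scores of the
# pages linking to it. The four-page "web" here is made up for illustration.
links = {
    "home":    ["about", "blog"],
    "about":   ["home"],
    "blog":    ["home", "about"],
    "fanpage": ["home"],   # fanpage links out, but nobody links to it
}

damping = 0.85                               # the classic damping factor
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal scores

for _ in range(50):                          # iterate until scores settle
    new_rank = {}
    for p in pages:
        # Each page that links to p passes along its own rank, split
        # evenly among all of its outgoing links.
        inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

for p, r in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{p}: {r:.3f}")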

The system started to show severe signs of wear as search marketers as well as mom and pop businesses began to “game” the PageRank system, creating spurious incoming links from bogus sites and buying links from high-rank websites.

Enter Microsoft’s “BrowseRank”, which will arguably be harder to game because it will monitor the behavior of millions of users, looking for relationships between sites, pages, length of time on a page, and more. It’s a good idea of course, but arguably it is Google that has *by far* the best data set to manage this type of approach. So even if Microsoft’s system starts to deliver results superior to Google’s, one can expect Google to kick their own efforts into gear.
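To make the contrast concrete, here’s a back-of-the-envelope sketch of the BrowseRank intuition: score pages by how often real users visit them and how long they linger. The session logs below are invented for illustration, and the actual paper models browsing as a continuous-time Markov process rather than this simple tally:

```python
from collections import Counter, defaultdict

# Hypothetical click logs: each session is a list of (page, seconds-on-page).
# Real BrowseRank works from millions of users' browsing trails; this is
# only an illustration of the visit-frequency-plus-dwell-time idea.
sessions = [
    [("news", 40), ("article", 300), ("home", 5)],
    [("home", 10), ("article", 250)],
    [("news", 30), ("home", 8), ("article", 200)],
]

visits = Counter()
total_dwell = defaultdict(float)

for session in sessions:
    for page, seconds in session:
        visits[page] += 1
        total_dwell[page] += seconds

# A page users reach often *and* linger on scores high; a page they
# bounce from scores low, no matter how many links point at it.
for page, seconds in sorted(total_dwell.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {visits[page]} visits, {seconds:.0f}s total dwell time")
```

Notice the input: real browsing behavior rather than a link graph, which is why a link farm full of bogus incoming links buys you nothing here.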

As with all search innovation the users should be the big winners. Search remains good but not great, and competition in this space will only serve to make everybody better… right?

Google Ranking Needs a Spanking


Over at the Google blog today Amit Singhal has post 1 of 2 of a promised introduction to Google ranking.  As usual I’m disappointed in the way Google maintains what to me is a pretense of transparency while using some very ruthless and mysterious tactics to downrank sites they claim don’t meet quality guidelines.  Google (correctly) sees themselves as warring with spammers for control of the web but (incorrectly) thinks transparency is the wrong approach in this fight.

There were some rumblings last year about contacting webmasters directly about site problems, but my understanding is that this would represent only a tiny fraction of total sites under penalty.  Of course, due to so little transparency in this area, we can’t know the real numbers.

I hope Amit’s second post is a LOT more specific, because I think he’s already practicing the kind of oblique speak that is becoming commonplace when many from Google talk about ranking:

Amit:
No discussion of Google’s ranking would be complete without asking the common – but misguided! 🙂 – question: “Does Google manually edit its results?” Let me just answer that with our third philosophy: no manual intervention.

That statement is false, and he should not say it.   He does try to clarify later in the post:

I should add, however, that there are clear written policies for websites recommended by Google, and we do take action on sites that are in violation of our policies or for a small number of other reasons (e.g. legal requirements, child porn, viruses/malware, etc).

Action?  Yes, of course he means the *manual intervention* he said above does not happen.  Google has a right to pull sites out of the rankings, though it is annoying how much they talk about NOT manually intervening when they do it.  Because there is no transparency, nobody outside of Google knows how often they manually intervene.  Amit makes it sound like it’s only for horrors like child porn or malware, but note that the use of inappropriate “SEO” tactics such as “hidden text” can get you removed and even banned from the Google index.  Unfortunately for small sites – e.g. “Aunt Sally’s House of Knitting” – Aunt Sally may have no idea her webmaster is using these tactics.  How often does this happen?  My guess is that hundreds of thousands of legitimate sites are ranked very improperly due to technical penalties, but with no transparency (and probably no measure of this at Google) nobody knows.

The big Google problem is that the policies for algorithmic downranking are not *clear enough*.  Many SEO companies prey on this lack of transparency, ironically often using Google’s mystique to lure unsuspecting businesses into expensive “optimization” schemes that don’t work or can get them seriously penalized.

Part of Google’s search algorithm philosophy is that they don’t share details because spammers would exploit them before honest people could.  Although a weak case can be made for this idea, a better one is that in non-transparent systems dishonest folks will do *better* because they invest more energy into finding the loopholes.  Inbound linking, for example – a very hot SEO topic last year at SES San Jose – has complex rules nobody outside of Google understands.  Linking between sites in an information network can be advantageous or it can be penalized, depending on whether Google (rather than the community or webmaster) sees the practice as manipulative of the algorithm or as user-friendly and thus allowable.

Amit – a clear policy is one where the webmaster will know, rather than guess, what they are doing to annoy the Google algorithm or the manual intervention folks.

There is a pretty good source of information about how to approach site architecture for optimal ranking, and it is to read Matt Cutts’ SEO-related posts here.

Although Matt won’t give out much about the algorithmic penalties that create much of the Google confusion and frustration for established websites, if you follow Google’s guidelines and Matt’s posts on SEO you are unlikely to have serious problems with ranking.  Of course, unless you work to optimize a new website, you will have the *standard problems* with ranking, since your competition is probably doing basic SEO on their site.  I’d argue (along with many SEO folks) that the best way to enter things this late in the game and hope for good ranks is with a topical *blog* to support your website.  Start with several posts about your general area of business, using a lot of the terminology people would use to find your website, and add posts regularly.

I’ll be covering the SES San Jose Search Conference and expect to hear a lot more debate about the issues of transparency, blogging, and SEO.

Facebook tells me I’m overweight – this is *good* targeted advertising?


Logging into Facebook I was assaulted, er, presented with an advertisement featuring a picture of an incredibly fit fellow’s chiseled abdomen with the caption “48 YR OLD Overweight?”…

I suppose I should be thankful this was not a picture of a shirtless Mark Zuckerberg, but ..

I’m 48, so I can’t believe this was a coincidence – obviously Facebook is using my personal information to target ads to me, using the information they said they’d keep confidential and that I really don’t want shared with any old Tom, Dick, or Hairy bodybuilder advertisers.

As I’ve noted before, online privacy is largely an oxymoron, and I’m really not very concerned about the privacy “violation” here.  However, something about this pisses me off – I think partly because, after all the hype (including from people like me), I hate to think this is the best we can do with targeted advertising.

Sure, I’m a *little* overweight, but I don’t need the bogus overpriced green diet junk advertised to me here by Mr. Muscleydude.  This is the classic type of junk product “seen on TV”, presented in an annoying way using information I don’t want given out to advertisers.  In my book Facebook has already pushed past the limit of how much advertising is welcome by me, and I get the strong feeling that with revenues in question we’ll see a lot more of these marginally relevant ads in the future.

SEO Pseudo Alert: Google Crawling Flash


For many years, anybody who knew anything about search engine optimization (“SEO”) would scoff at the idea of using more than a minimal amount of Flash in a website, because those Flash elements were largely invisible to search engines – most notably Google – and therefore sites that used Flash would often rank lower than others simply because Google could not recognize the Flash parts of their content.

Designers like Flash because it offers a very dynamic and attractive way to present information, even though it is image-rich and context-poor.  Search engines simply could not see into it – at least until today’s Google announcement that they have figured out a way to index Flash content.

Although this is great news for the millions of sites using Flash, which will now probably enjoy somewhat better rankings as their Flash content and navigation (link structure) is better indexed by Google, I’d caution designers to keep avoiding Flash until this process is much better understood.  I’d guess that one of the key defects of Flash sites – navigation that is opaque to the Googlebot – will continue to be problematic even under the new system.  A good designer can get much of the same “look and feel” of Flash with good use of images, art, and CSS (Cascading Style Sheets), and from an SEO perspective I think sites are still well advised to note the best observation I’ve ever heard about SEO, from Matt Cutts at Google – almost certainly the world’s most authoritative expert on Google ranking and SEO:

“Googlebot is stupid”, said Matt, so you need to help it figure out your website.

I guess Googlebot is smarter now that it recognizes Flash, but Matt’s advice is still very relevant and frankly simpler than most people think.  Here’s some SEO advice for newbies:

1) Think in terms of ranking *properly* rather than ranking higher than sites that are better than you are.  If competitors are more relevant, think of ways to make *your site* and *your product* more relevant.

2) Research keywords (or just guess if you are lazy) and make a list of those that you want to rank well for.

3) Make sure your content is rich in the keywords for which you want to rank well.  Use those keywords in each page’s Title, and give every page a unique, keyword-rich Title (there’s a small title-checking sketch after this list).  Make sure the content on the page is very relevant to the query – i.e. is this something that is going to help the reader out in their journey to enlightenment?  If not, make it so!

4) Links, links, links.  These are the mother’s milk of online success.  Do not buy them; earn them, and get them from other sites in your network, sites of friends, etc.  Establish relationships of relevance with others that will get them to link to your website.  Avoid cheap linking schemes – as always, think in terms of what creates a valuable resource for your readers.

5) Blog.  Blog more.  Google appears to be ranking blog content favorably and I predict they’ll need to do even more of this as blogs are replacing websites as the freshest and most relevant content on most topics.

Whether you are a mom and pop or a multinational, if you want to rank well online you should be blogging regularly about your topics.  When blogging, follow rules 1-4 above.

6) Lower your monetary expectations.   Making money online is much harder than offline people think.   Even most Silicon Valley insiders generally only make big money from a handful of projects.    The overwhelming majority of startups fail, often leaving the founders with nothing but the memory of hard work.

7) Raise your personal expectations.  The online world is fascinating, exploding in popularity and significance, and is where you need to be.  Get on board with a blog!
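To make rule 3 a little more concrete, here’s a tiny sketch that flags two common title mistakes: duplicate titles and titles missing every target keyword.  The page inventory and keyword list are hypothetical, and this is my own toy helper, not any Google-endorsed tool:

```python
# Hypothetical page inventory: URL -> <title> text.
pages = {
    "/":        "Acme Knitting Supplies | Yarn, Needles, Patterns",
    "/yarn":    "Wool and Cotton Yarn | Acme Knitting Supplies",
    "/needles": "Acme Knitting Supplies | Yarn, Needles, Patterns",  # duplicate!
    "/contact": "Contact Us",                                        # no keyword!
}
keywords = ["knitting", "yarn", "needles", "patterns"]

seen = {}
for url, title in pages.items():
    lowered = title.lower()
    # Rule 3 says every page gets a *unique* title...
    if lowered in seen:
        print(f"{url}: duplicate title (same as {seen[lowered]})")
    seen.setdefault(lowered, url)
    # ...and a *keyword-rich* one.
    if not any(kw in lowered for kw in keywords):
        print(f"{url}: title contains none of your target keywords")
```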

Links and SEO


From a search ranking perspective, links are one of a website’s top concerns – probably the most important concern, as linking often trumps content in determining where a site will place for search queries.

As always, a great source for SEO information is Matt Cutts’ blog over at Google, where a careful read of his SEO posts will bring you a lot of enlightenment about Google do’s and don’ts. His post of a few days ago was particularly interesting as it deals with Google’s crackdown on paid links that try to pass PageRank. This is one of the most contentious topics in SEO and an area where I wish Google would be more transparent, since there are so many linking approaches that are not paid but may be questionable in the eyes of Google. The fact that they depend so much on reporting of paid links is also a problem, as it allows aggressive SEOs to “game the system” by selectively reporting competitors while creating complex and undetectable linking for their own sites.

However, my biggest concern about linking is not something Google can fix: even in the world of what Google views as legitimate, authority-passing links, strategic linking to “friend and associate” websites has largely replaced the early approach to linking, where people worked simply to link to a great resource for the reader.  As blogging has exploded in prominence and linking importance, this problem has become critical, and we now see that early, well-established blogs will outrank far better resources that have few incoming links because they are new.  Ideally, the older resources would be better stewards and link out to the good new resources, but generally the stakes have become too high: links are now correctly seen as more valuable than advertising, and bloggers have become too reluctant to link to other resources unless there is some reciprocal benefit.

Google on SEO


Search Engine Optimization is at the same time a simple concept (help the search engines find and rank your pages) and a very complex one (proper use of redirection when changing domain names, Google downranking, duplicate content, and hundreds more topics that are covered online in many places and also at conferences like the SES Conference Series, WebmasterWorld PubCon, or the SMX Conferences).

Arguably the best source for basic SEO information is Matt Cutts’ blog, and he always has great summaries of the conferences at which he gives talks.  Here’s a great post from Matt today after Danny Sullivan’s SMX Seattle Conference.  Google has added some information to their famous (and infamous) Webmaster Guidelines, which should be read by every webmaster as they are the best *basic* information about how to structure a site to be ranked properly.  You’ll also want to read Matt’s SEO posts, which offer a lot more specifics and technical advice.

Although several years ago you would *also* have been well advised to read up on some of the tricks of the trade, such as various schemes for keyword optimization, I would argue that for most webmasters tricks are now more likely to be counterproductive than productive.  This is a really rich topic because there remain many techniques that fall into a sort of gray area of optimization where ranks are affected, but crossing the line Google draws between acceptable and unacceptable techniques can lead to severe penalties.  Since Google does not draw a clear, objective line, we have the ongoing gray area of optimization.

Many SEO techniques relate to *linking* strategies and *keyword optimization*.  It is an area where I believe Google has in many ways fueled the rise of the very content they hate, by making the rules too vague and (more importantly) by allowing AdSense advertising on pages that don’t meet reasonable web quality standards.  Early in the game I was often frustrated when I would improve a bad page only to have it drop in the ranks due to algorithmic quirks.  I soon decided to leave crappy but high-ranked pages alone, fearing they’d be downranked if I changed them.  This in turn caused problems as Google tightened up quality standards.  Google is great about transparency in several areas, but algorithmic search penalties are not one of them.

I should also say there are some exceptionally good SEO folks out there who always have amazing advice when I bump into them at conferences.  David Naylor, Aaron Wall, and Todd Malicoat all have remarkable insight into the complexities of Google ranking, as do Vanessa Fox, who used to work for Google, and Danny Sullivan, who runs the SMX series of SEO conferences.  My general advice about SEO is to do it yourself or in-house, but there are a handful of people like this who know the game so well that the normal rules about avoiding SEO folks do not apply.