Oregon Coast Bird Watching

This post falls squarely in the “SEO Experiments” category. We’ve had an informative but “plain jane” Oregon Coast website up for some time, based on Oregon Coast magazine, which is published by Northwest Travel Magazines.

The site has historically ranked poorly for “Oregon Coast” and related terms, probably in part because we never did much to optimize it for search engines, and (I think) partly because, quite ironically, Google now struggles to properly rank websites that have extensive internal cross-linking. Ironic, because extensive linking was a cornerstone of early web quality, but it fell out of ranking fashion as Google sought to kill off auto-generated websites that used the technique to boost their PageRank, and thereby their Google rank, for optimized query terms. Heavy cross-linking became a spam signal because it is so easy to create large database-driven websites, but for many sites it is also a good *quality signal*: the site may be very info-rich, covering basically every mile of Oregon Coast Highway 101 in good, objective detail. Google recognizes they’ve created a lot of collateral damage this way, but frankly they have not done much to fix the problem, basically feeling that there is enough “good content” that ranks well. This is wrong and unfortunate, and in travel it has led to a lot of mediocre results where better search would surface detailed blog and website references – pages created, for example, by people who live in the place being described and have extensive insider detail.

One part of the optimization has been to rename the site OregonCoastTravel.net and 301 redirect the old pages at 101MilebyMile.com to the new name, hoping to rank better for “Oregon Coast” and “Oregon Coast Travel” as we should.
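The mechanics of that domain move are simple to sketch. The following Python snippet (a sketch only – the real redirect would be a one-line server rewrite rule, and the example path is hypothetical since the site’s actual URL structure isn’t shown here) illustrates the logic of a domain-wide 301:

```python
# Sketch of a domain-wide 301 redirect: every path on the old domain
# maps to the same path on the new one, so inbound links and Google's
# index can consolidate onto a single canonical host.

OLD_HOST = "101milebymile.com"
NEW_HOST = "oregoncoasttravel.net"

def redirect(host: str, path: str):
    """Return a (status, Location) pair, or None if no redirect applies."""
    if host.lower() == OLD_HOST:
        # 301 = "moved permanently", the signal search engines treat as
        # "transfer this page's ranking credit to the new URL".
        return 301, f"https://{NEW_HOST}{path}"
    return None

print(redirect("101MilebyMile.com", "/birding.html"))
```

The key design point is that it is a permanent (301) rather than temporary (302) redirect, since only the permanent variety tells Google to re-credit the old pages to the new domain.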

I’m linking here to the Oregon Coast birding page because it is a straggler: it has been 301 redirected to OregonCoastTravel.net but remains listed by Google at the old site. It is also an excellent resource page for the topic of Oregon Coast birding. I want to see how fast this page will now be correctly reindexed.

Black Hat / White Hat SEO Session at SES San Jose

Black Hat SEO Session at SES San Jose
Originally uploaded by JoeDuck

The SEO session here at SES San Jose is packed as everybody struggles to get competitive advantage, though I’m concerned that even some fairly advanced SEO folks simply don’t understand how seriously they may damage their clients or sites with a number of techniques that are no longer tolerated by Google.

Jill’s making an excellent point about “incompetent SEOs” who, in her words… suck. I think she’s right that there are a lot of very bad folks doing SEO – I’d suggest 90% or more.

Boser: the sites that rank are the ones that put in the effort. He says that you could generally replace the top sites with any others without bothering users.

Boser: buying links remains a key tactic for competitive markets.

Naylor: Common mistakes come from within the organization, where link buying goes over the thresholds. Good black hats “know the thresholds” at which the search engine will identify link buying problems.

Disagreement (hey, cool): Jill Whalen suggesting good SEO is simply common sense, Todd Friesen suggesting it’s not. Jill says it’s not about tricking the engines.

Boser: Must compete now, so using paid links to get the clients going is both acceptable and needed.

Widget marketing – ?

SEO Champion question: He’s angry, but it’s not clear why – some conflict with Greg Boser over SEO stuff.

Boser: Links are about blending in to your niche. Hotels have been dominated by aggressive search spam for years – a tough industry. It’s hard in those spaces to follow the rules and compete.

Bruce Clay: Doing things that are way out of bounds is much riskier for established sites. You can’t afford to burn your own house down, so consider these risks.

BMW: Burned themselves buying links for a short time. Greg: less consequence for the big players. WordPress also got a hand slap for link abuses. [Yes, but if we use a “user quality” metric we’d expect big players to face fewer consequences rather than severe punishment.]

Matt Cutts from Google (audience): We take action on a lot of big sites. Some panel argument with Matt here about how fairly Google applies the rules. Boser: Forbes is spamming – why no action?

Naylor: A legal site in the UK. Restoration involved identifying the paid links that a company representative had purchased. Google found them within 4 months and banned the site.

Audience question: What about the user – isn’t their interest the best definition of white vs black hat? Todd may not understand what she means here; he’s noting that outright deception is out. But I think her question is more nuanced, and the answer is generally yes. Naylor’s correctly noting that users may not care much whether they get site A or site B, and that is often true.

Made for AdSense sites: Boser – it’s crap, and includes a lot of blogging content. [This topic is so complicated – I’m always amazed how everybody thinks they can tell crappy from good content. I think you need to ask communities what they want, and trust their judgement. I think Google is moving in that direction and it’s a good thing. What if somebody has a fabulous site, better than the competition, that is made for AdSense? Where is the line?] Answer – community judgement.

SES Session Description:

Searcher Track
Black Hat, White Hat: Playing Dirty with SEO

Some say that “black hat” search marketers will do anything to gain a top ranking and others argue that even “white hat” marketers who embrace ethical search engine optimization practices are ultimately trying to game the search ranking system. Are white hats being naive? Are black hats failing to see the long-term picture? This session will include an exploration of the latest black and white issues, with lots of time for dialog and discussion.

Matthew Bailey, President, SiteLogic

Greg Boser, President, WebGuerrilla LLC

Knol Knows Ranking

Google’s new knol project features articles on any topic by anybody who cares to call themselves an expert.   The concept is really intriguing as it relates to bringing higher authority information online that will be vetted both by the author and by the community at large.

Search marketing guru Aaron Wall is one of the sharpest fellows in the SEO business and he’s been testing knol and has concerns about the results both in terms of outranking original material and copyright.

Aaron on knol

The SMX search folks – I think it was Danny Sullivan – have also been testing knol and are suggesting knol material is ranking very well. Google says they are not giving their own site preferential treatment, but I’m guessing what they mean is that they are applying the same rules to knol as to other sites. If, for example, those rules that are *applied equally to all sites* happen to include a high value for incoming links from Google sites or other knol pages, the knol content effectively has a big advantage, technically without any special treatment.

In terms of search strategy, I think the rule here is to write a knol page for topics of interest to you, or topics for which you want to improve your rank. I expect knol to be a huge topic at the upcoming search conference, SES San Jose.

Later:  Time to check out this knol thing by creating some content myself.    Here are listings I created for:

Beijing, China |     Blogs and Blogging

Extremism in the defense of the algorithm is no vice?

WordPress surfing led me to another interesting sob story from a penalized webmaster, and my reply got so long it deserved to become a post.

Marshall Sponder wrote:

Take Know More Media’s case – you have 100+ blogs and 2+ years of content – that’s easy, 30,000 to 50,000 blog posts, and Google, with just one or two paid links that pass PageRank, is going to throw the entire blog network out of its index over that?

Yep, it appears that’s it – that’s the reason.  But is it fair?  No.
Strictly from a user’s point of view, I think it is very hard to justify technical penalties on good content. Few users know or care what “hidden text” is, so if a mom and pop webmaster uses this tactic and Google deletes the otherwise informative, relevant website, it is hard to argue that users are served well. Even if a black hat SEO created a site filled with illegal tricks but also full of highly relevant, quality content, I think Google’s case against including that site is weak. As a user I want *quality content* and I don’t care about the site’s technical construction. Where Google is simply banning sites for using spammy tactics, I’d agree with Marshall that to be faithful to user-centrism they really have to take it a step further and look at the content they are excluding. Even if the content involves paid linking and other violations, if it is unique, quality content, Google cannot exclude it without violating their stated “prime directive” of providing the best for the users.

However, Google has to manage about one trillion URLs, so obviously they need shortcuts in ranking. One of those shortcuts is a page from AZ Senator Barry Goldwater’s playbook from when – many years ago – he tried to justify an escalation of the Vietnam war, perhaps to the nuclear level. Google’s spin on the famous Goldwater phrase would be: “Extremism in the defense of the algorithm is no vice.”

I don’t think penalties are generally *fair* or *user friendly*, but I’m willing to concede they may be necessary for Google to function as profitably as they do since it would take a lot of human intervention to help every mom and pop determine what’s wrong with their sites.

However, I feel Google continues to fail in their obligation to communicate more effectively with penalized sites, although I think they are s-l-o-w-l-y catching on to the fact that most webmasters of penalized sites remain unclear as to why the site has been penalized or downranked. Removal offers you a shot at “reinclusion” and (very rarely) possible feedback from webmaster tools staff. Downranking is algorithmic, and Google will not generally offer any advice to help downranked sites. In this case you generally want to re-read the webmaster guidelines and experiment with different approaches in an effort to boost rankings.

My view is that as many thin-content database sites have flowed online, Google is holding online material to a higher standard of quality, especially at new websites. This helps explain why you can find well-ranked pages that are inferior to pages at a new website.

There is a solution to all of this, in my opinion, which is for Google to include a lot more community input and feedback in the process than they currently appear to do. I’d guess the recent discussions to acquire Digg may have been in part to gain more community feedback tools and data. Historically Google has been brilliant at using algorithms to determine ranking and advertising, but has fallen short of brilliance in their ruthlessness toward website practices they don’t like, leaving a lot of collateral damage – especially for sites involved in “paid linking” and variations on that complex theme.

At SES San Jose 2009 I hope to ask Matt Cutts more about this in person. Matt is Google’s top spam cop and always very open to conversations about ranking and search. In fact the best event of the conference is the Google Party, where engineers are on hand to discuss search-related issues – including complex ranking technicalities that are sometimes brought to Google’s attention as part of the search conference circuit.

Google Ranking Needs a Spanking

Over at the Google blog today, Amit Singhal has posted part 1 of 2 of a promised introduction to Google ranking. As usual I’m disappointed in the way Google maintains what to me is a pretense of transparency while using some very ruthless and mysterious tactics to downrank sites they claim don’t meet quality guidelines. Google (correctly) sees themselves as warring with spammers for control of the web but (incorrectly) thinks transparency is the wrong approach in this fight.

There were some rumblings last year about contacting webmasters directly about site problems, but my understanding is that this would represent only a tiny fraction of total sites under penalty. Of course, with so little transparency in this area, we can’t know the real numbers.

I hope Amit’s second post is a LOT more specific, because I think he’s already practicing the kind of oblique speak that is becoming commonplace when many from Google talk about ranking:

No discussion of Google’s ranking would be complete without asking the common – but misguided! 🙂 – question: “Does Google manually edit its results?” Let me just answer that with our third philosophy: no manual intervention.

That statement is false, and he should not make it. He does try to clarify later in the post:

I should add, however, that there are clear written policies for websites recommended by Google, and we do take action on sites that are in violation of our policies or for a small number of other reasons (e.g. legal requirements, child porn, viruses/malware, etc).

Action? Yes, of course he means the *manual intervention* he said above does not happen. Google has a right to pull sites out of the rankings, though it is annoying how much they talk about NOT manually intervening when they do it. Because of the lack of transparency, nobody outside of Google knows how often they manually intervene. Amit makes it sound like it’s only for horrors like child porn or malware, but note that the use of inappropriate “SEO” tactics such as “hidden text” can get you removed and even banned from the Google index. Unfortunately for small sites – e.g. “Aunt Sally’s House of Knitting” – Aunt Sally may have no idea her webmaster is using these tactics. How often does this happen? My guess is that hundreds of thousands of legitimate sites are ranked very improperly due to technical penalties, but with no transparency (and probably no measure of this at Google), nobody knows.

The big Google problem is that the policies for algorithmic downranking are not *clear enough*.  Many SEO companies prey on this lack of transparency, ironically often using Google’s mystique to lure unsuspecting businesses into expensive “optimization” schemes that don’t work or can get them seriously penalized.

Part of Google’s search algorithm philosophy is that they don’t share details because spammers would exploit them before honest people could. Although a weak case can be made for this idea, a better one is that in non-transparent systems dishonest folks do *better*, because they invest more energy in finding the loopholes. For example, inbound linking – a very hot SEO topic last year at SES San Jose – has complex rules nobody understands outside of Google. Linking between sites in an information network can be advantageous or it can be penalized, depending on whether Google (rather than the community or webmaster) sees the practice as manipulative of the algorithm or as user-friendly and thus allowable.

Amit – a clear policy is one where the webmaster knows, rather than guesses, what they are doing to annoy the Google algorithm or the manual intervention folks.

There is a pretty good source for information about how to approach site architecture for optimal ranking and it is to read Matt Cutts’ SEO related posts here.

Although Matt won’t give out much about the algorithmic penalties that create much of the Google confusion and frustration for established websites, if you follow Google’s guidelines and Matt’s posts on SEO you are unlikely to have serious problems with ranking. Of course, unless you work to optimize a new website you will have the *standard problems* with ranking, since your competition is probably doing basic SEO on their sites. I’d argue (along with many SEO folks) that the best way to enter this late in the game and hope for good ranks is with a topical *blog* to support your website. Start with several posts about your general area of business, using a lot of the terminology people would use to find your website, and add posts regularly.

I’ll be covering the SES San Jose Search Conference and expect to hear a lot more debate about the issue of transparency, blogging, and SEO.

SEO Pseudo Alert: Google Crawling Flash

For many years, anybody who knew anything about search engine optimization (“SEO”) would scoff at the idea of using more than a minimal amount of Flash in a website, because those Flash elements were largely invisible to search engines – most notably Google – and therefore sites that used Flash would often rank lower than others, simply because Google could not recognize the Flash parts of their content.

Designers like Flash because it offers a very dynamic and attractive way to present information, even though it is image rich and context poor. And that invisibility held true – at least until today’s Google announcement that they have figured out a way to index Flash content.

Although this is great news for the millions of sites using Flash, which will now probably enjoy somewhat better rankings as their Flash content and navigation (link structure) is better indexed by Google, I’d caution designers to keep avoiding Flash until this process is much better understood. I’d guess that one of the key defects of Flash sites – navigation that is opaque to the Googlebot – will continue to be problematic even under the new system. A good designer can get much of the same “look and feel” as Flash with good use of images, art, and CSS (Cascading Style Sheets). And from an SEO perspective, I think sites are still well advised to note the best observation I’ve ever heard about SEO, from Matt Cutts at Google – almost certainly the world’s most authoritative expert on Google ranking and SEO:

“Googlebot is stupid”, said Matt, so you need to help it figure out your website.

I guess Googlebot is smarter now that it recognizes Flash, but Matt’s advice is still very relevant, and frankly simpler than most people think. Here’s some SEO advice for newbies:

1) Think in terms of ranking *properly* rather than ranking higher than sites that are better than yours. If competitors are more relevant, think of ways to make *your site* and *your product* more relevant.

2) Research keywords (or just guess if you are lazy) and make a list of those that you want to rank well for.

3) Make sure your content is rich in the keywords for which you want to rank well. Make sure your page titles use those keywords, and use unique, keyword-rich titles for each of your pages. Make sure the content on the page is very relevant to the query – i.e., is this something that is going to help the reader on their journey to enlightenment? If not, make it so!

4) Links, links, links. These are the mother’s milk of online success. Do not buy them; earn them, and get them from other sites in your network, sites of friends, etc. Establish relationships of relevance with others that will get them to link to your website. Avoid cheap linking schemes – as always, think in terms of what creates a valuable resource for your readers.

5) Blog.  Blog more.  Google appears to be ranking blog content favorably and I predict they’ll need to do even more of this as blogs are replacing websites as the freshest and most relevant content on most topics.

Whether you are a mom and pop or a multinational, if you want to rank well online you should be blogging regularly about your topics.   When blogging, follow the rules 1-4 above.

6) Lower your monetary expectations. Making money online is much harder than offline people think. Even most Silicon Valley insiders generally make big money from only a handful of projects. The overwhelming majority of startups fail, often leaving the founders with nothing but the memory of hard work.

7) Raise your personal expectations.  The online world is fascinating, exploding in popularity and significance, and is where you need to be.  Get on board with a blog!
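The advice in step 3 – unique, keyword-rich titles on every page – can be sketched in a few lines of Python. This is a hypothetical helper, not any particular CMS’s API; the topic and keyword values are made up for illustration:

```python
def page_title(page_topic: str, site_keywords: list, site_name: str) -> str:
    """Build a unique, keyword-rich title for one page.

    Leads with the page-specific topic (the most specific words first),
    folds in a site-wide keyword, and ends with the site name, so every
    page's title differs while staying on-theme for the site.
    """
    keyword = site_keywords[0] if site_keywords else ""
    parts = [page_topic, keyword, site_name]
    # Skip empty pieces so pages without a keyword still get a clean title.
    return " | ".join(p for p in parts if p)

# Each page gets a distinct title built from its own topic:
print(page_title("Birding Hot Spots", ["Oregon Coast"], "OregonCoastTravel.net"))
```

The point of the pattern is simply that no two pages share a title, and the words a searcher would type appear early in it.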

Links and SEO

From a search ranking perspective, links are one of a website’s top concerns – probably the most important concern, as linking often trumps content in terms of where a site will place for search queries.

As always, a great source of SEO information is Matt Cutts’ blog over at Google, where a careful read of his SEO posts will bring you a lot of enlightenment about Google do’s and don’ts. His post of a few days ago was particularly interesting, as it deals with Google’s crackdown on paid links that try to pass PageRank. This is one of the most contentious topics in SEO, and an area where I wish Google would be more transparent, since there are so many linking approaches that are not paid but may be questionable in Google’s eyes. The fact that they depend so much on reporting of paid links is also a problem, as it allows aggressive SEOs to “game the system” by selectively reporting competitors while creating complex and undetectable linking for their own sites.

However, my biggest concern about linking is not something Google can fix: even in the world of what Google views as legitimate, authority-passing links, strategic linking to “friend and associate” websites has largely replaced the early approach, where people simply linked to a great resource for the reader. As blogging has exploded in prominence and linking importance this problem has become critical, and we now see early, well-established blogs outranking far better resources that have few incoming links because they are new. Ideally, the older resources would be better stewards and link out to the good new resources, but generally the stakes have become too high: links are now correctly seen as more valuable than advertising, and bloggers have become too reluctant to link to other resources unless there is some reciprocal benefit.

Google on SEO

Search engine optimization is at once a simple concept (help the search engines find and rank your pages) and a very complex one (proper use of redirection when changing domain names, Google downranking, duplicate content, and hundreds more topics that are covered online in many places and also at conferences like the SES Conference Series, WebmasterWorld PubCon, or the SMX Conferences).

Arguably the best source of basic SEO information is Matt Cutts’ blog, and he always has great summaries of the conferences at which he gives talks. Here’s a great post from Matt today, after Danny Sullivan’s SMX Seattle Conference. Google has added some information to their famous (and infamous) Webmaster Guidelines, which should be read by every webmaster, as they are the best *basic* information about how to structure a site to be ranked properly. You’ll also want to read Matt’s SEO posts, which offer a lot more specifics and technical advice.

Although several years ago you would *also* have been well advised to read up on some tricks of the trade, such as various schemes for keyword optimization, I would argue that for most webmasters tricks are now more likely to be counterproductive than productive. This is a really rich topic, because there remain many techniques that fall into a sort of gray area of optimization, where ranks are affected but crossing the line Google draws between acceptable and unacceptable techniques can lead to severe penalties. Since Google does not draw a clear, objective line, we have the ongoing gray area of optimization.

Many SEO techniques relate to *linking* strategies and *keyword optimization*. This is an area where I believe Google has in many ways fueled the rise of the very content they hate, by making the rules too vague and (more importantly) by allowing AdSense advertising on pages that don’t meet reasonable web quality standards. Early in the game I was often frustrated when I would improve a bad page only to have it drop in the ranks due to algorithmic quirks. I soon decided to leave crappy but high-ranked pages alone, fearing they’d be downranked if I changed them. This in turn caused problems as Google tightened up quality standards. Google is great about transparency in several areas, but algorithmic search penalties are not one of them.

I should also say there are some exceptionally good SEO folks out there who always have amazing advice when I bump into them at conferences. David Naylor, Aaron Wall, and Todd Malicoat all have remarkable insight into the complexities of Google ranking, as do Vanessa Fox, who used to work for Google, and Danny Sullivan, who runs the SMX series of SEO conferences. My general advice about SEO is to do it yourself or in-house, but there are a handful of people like this who know the game so well that the normal rule about avoiding SEO folks does not apply.

The Donny Deutsch Experiment

Hey, my Donny Deutsch post, part of our SEO Experiment series here at Joe Duck, is now at #12 worldwide as we move into CES. What? A few hours after this post I dropped to 34 – not sure wazzup. OK, now back to 12 a few minutes later – may just have been a server shuffle or my mistake… My goal is to get into the top three sites for the query “Donny Deutsch”, although Google’s quirkiness could make this tricky to do before next week. I think I’ll rise over time thanks to the incoming links and the inordinate amount of Donny Deutsch attention here at the blog, but normally you’d try to rise to the top over many months, not a few weeks. However, “Donny Deutsch” is not a highly competitive term, so I’ll have a shot here. Donny Deutsch is a fairly heavily searched name, due to Donny’s excellent TV show “The Big Idea With Donny Deutsch”, his bombastic style, and his ability to pony up $200,000,000 without going into debt.

What?  You are looking for the Donny Deutsch Big Idea CES website?   Here it is!