Oregon Coast Bird Watching


This post falls squarely in the “SEO Experiments” category. We’ve had an informative but “plain jane” Oregon Coast website up for some time, based on Oregon Coast magazine, which is published by Northwest Travel Magazines.

The site has historically ranked poorly for “Oregon Coast” and related terms, probably in part because we had never done much to optimize it for search engines, and (I think) partly because, quite ironically, Google now struggles to properly rank websites that have extensive internal cross linking. Ironic because extensive linking was a cornerstone of early web quality, but it fell out of ranking fashion as Google sought to kill off auto-generated websites that used the technique to boost their PageRank and thereby their Google rank for targeted query terms.

Heavy internal linking became a spam signal because it is so easy to create large database-driven websites, but for many sites it is also a good *quality signal*, because the site may be very info rich – ours, for example, covers basically every mile of Oregon Coast Highway 101 in good, objective detail. Google recognizes it has created a lot of collateral damage this way, but frankly they have not done much to fix the problem, basically feeling that there is enough “good content” that ranks well. This is wrong and unfortunate, and in travel it has led to a lot of mediocre results where better search would surface detailed blog and website references – pages written, for example, by people who live in the place being described and have extensive insider detail.

One part of the optimization has been to rename the site OregonCoastTravel.net and 301 redirect the old pages at 101MilebyMile.com to the new name, hoping to rank better for “Oregon Coast” and “Oregon Coast Travel” as we should.
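Since the whole experiment hinges on those 301s actually being in place, here is a minimal sketch (Python, standard library only, with the path left generic) of how you might confirm that the old 101MilebyMile.com pages answer with a permanent 301 pointing at OregonCoastTravel.net rather than a temporary 302 or no redirect at all:

```python
# Minimal redirect check: fetch an old URL without following redirects
# and print the status code plus the Location header it points to.
# The host and path below are only examples; substitute any old page.
import http.client

def check_redirect(host, path="/"):
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", path)          # http.client never follows redirects
    resp = conn.getresponse()
    print(f"{host}{path} -> {resp.status} {resp.reason}, "
          f"Location: {resp.getheader('Location')}")
    conn.close()

check_redirect("www.101milebymile.com", "/")
```

A 301 tells search engines the move is permanent, which is what you want when trying to consolidate the old pages’ ranking signals onto the new domain.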

I’m linking here to the Oregon Coast birding page because it is a straggler that has been 301 redirected to OregonCoastTravel.net but remains listed by Google at the old site. Also, it is an excellent resource page on the topic of Oregon Coast birding. I want to see how fast this page will now be correctly reindexed.

SES San Jose – Orion Search Panel


[Photo: SES San Jose – Orion Search Panel, originally uploaded by JoeDuck]

Live (well, 10-minute delay?) from the afternoon keynote here at SES San Jose. We’ve got Matt Cutts, Robert Scoble, Danny Sullivan, Tim Westergren, Kirsten Mangers, and Rich LaFurgy here to talk about search. I’ll try to add notes as the talk goes on…

OK, it’s over and it was disappointing. All the speakers are exceptional experts, but I think this casual approach did not work because rarely did we get any of the meaty search information both Matt Cutts and Danny Sullivan generally deliver. If I were making recommendations to SES, I would have each of these folks do separate sessions in their areas of expertise and get into more detail. Matt, for example, is arguably the world’s top search expert, and Robert is one of the very top experts on blogging and social communities. No need to water their stuff down so much.

SES San Jose – Lee Siegel Keynote


Lee Siegel is about to speak here at SES San Jose. He’s the author of “Against the Machine,” a senior editor at The New Republic, and a noted critic of the new media, primarily because he feels anonymity is a threat to intelligent, enlightened conversation.

Although I’m sympathetic to Lee’s points about how abusive the online world can be, and how foolish it is to treat as sacred the hate speech and junk banter that passes for conversation, he’s missing two key features of the new conversational media that effectively sweep away much of the significance of his legitimate concerns.

First, the high tolerance for abusive and threatening language has become something of a new standard, especially for younger commenters. I don’t like it either, but for many writers this does not reflect the type of threat it would under other circumstances. It is not appropriate to apply old interpretations of this language to the modern usage.

Second, focusing on the defects of blogging and new media distracts us from the profound and positive changes in communication – changes that represent the early stages of truly democratic and massively participatory conversations.

I don’t think Siegel is so much *wrong* as he is making fairly insignificant points about the new media. I’d certainly agree that there is a danger whenever people are stifled. For me the outrageous online treatment of Kathy Sierra, a noted blogger, is the exception that proves the rule. These cases are very few, and in a broad sense are eclipsed by the thousands of new voices coming online *every day*.

So, is there value in paying attention to these problems? Sure. Should this drive our understanding and appreciation of the most profound transformation in human communication history?

Nope.

Cuil Search – what am I missing about all that they are missing?


TechCrunch and others are waxing almost poetic about the new Cuil search engine, designed by ex-Googlers to compete with the mother ship. But after a few scattershot queries I’m just not feeling the power of Cuil. It still fails to find itself for what seems like an obvious query, “Cuil Search Engine,” and for the query “computers” I’d expect a bit more than this from Cuil’s claimed inventory of 121 billion web pages:

We didn’t find any results for “computers”

Some reasons might be…

  • a typo. Please check your spelling.
  • your search includes a term that is very rare. Try to find a more common substitute.
  • too many search terms. Please try fewer terms.

Finally, try to think of different words to describe your search.

Obviously Cuil is missing a LOT of stuff, so what am I missing here?

Update: I’m still waiting to be impressed, but I have learned that Cuil’s architecture is such that if some of the servers go down you can get empty results, as I did last night. Now, “computers” does turn up relevant results.

Microsoft BrowseRank to compete with Google PageRank


CNET profiles a new paper showcasing a Microsoft effort to enhance search by looking at *user behavior* as well as the old standby signals all the major search engines use, such as inbound links, page content, page titles, and several others.

Google’s initial brilliance was recognizing that the link relationships on the web gave you great insight into the best websites. Google correctly noted that sites with many links to them, especially for a particular keyword, were more likely to match a user’s interest for that keyword. Although many factors have been included in Google’s ranking for years, PageRank was arguably the most important breakthrough. Initially the system tried to be an online analogy to academic citation: Google’s Larry Page reasoned that websites with more incoming links would tend to be better, and that those incoming links should themselves be weighted according to the importance of the site from which they came.
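To make that citation-weighting idea concrete, here is a minimal PageRank sketch in Python over a tiny, made-up link graph (the real algorithm also has to handle pages with no outgoing links and run at web scale, which this toy version ignores):

```python
# Toy PageRank: a page's score is the damped sum of the scores of the pages
# linking to it, with each linking page's vote split across its outlinks.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Votes flowing into p, weighted by each linker's own rank
            # and divided by that linker's number of outgoing links.
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

# Hypothetical four-page site, used only for illustration.
toy_graph = {
    "home":        ["about", "coast-guide"],
    "about":       ["home"],
    "coast-guide": ["home", "birding"],
    "birding":     ["coast-guide"],
}
print(pagerank(toy_graph))
```

Pages with more (and better-ranked) incoming links end up with higher scores, which is exactly the academic-citation analogy described above.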

The system started to show severe signs of wear as search marketeers, as well as mom and pop businesses, began to “game” the PageRank system, creating spurious incoming links from bogus sites and buying links from high-PageRank websites.

Enter Microsoft’s “BrowseRank,” which will arguably be harder to game because it monitors the behavior of millions of users, looking for relationships between sites and pages, time spent on a page, and more. It’s a good idea of course, but arguably it is Google that has *by far* the best data set for this type of approach. So even if Microsoft’s system starts to deliver results superior to Google’s, one can expect Google to kick their own efforts into gear.
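For contrast, here is a deliberately loose sketch of the *user behavior* idea – not the actual math in Microsoft’s paper, which models browsing as a random walk over user sessions – that scores pages from invented session logs by blending how often users reach a page with how long they stay there:

```python
# Illustrative only: blend visit counts with average dwell time per page.
# The 0.5/0.5 weighting and the sample sessions are arbitrary assumptions.
from collections import defaultdict

def behavior_scores(sessions):
    visits = defaultdict(int)      # times each page was reached
    dwell = defaultdict(float)     # total seconds spent on each page
    for session in sessions:
        for page, seconds in session:
            visits[page] += 1
            dwell[page] += seconds
    return {page: 0.5 * visits[page] + 0.5 * (dwell[page] / visits[page])
            for page in visits}

sample_sessions = [
    [("news.example.com", 45.0), ("blog.example.com", 120.0)],
    [("blog.example.com", 300.0), ("shop.example.com", 10.0)],
]
print(behavior_scores(sample_sessions))
```

The point is simply that signals like dwell time and repeat visits come from watching real users rather than from the link graph, which is why this kind of data is harder to fabricate with bogus links.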

As with all search innovation, the users should be the big winners. Search remains good but not great, and competition in this space will only serve to make everybody better… right?

Google Ranking Needs a Spanking


Over at the Google blog today Amit Singhal has posted part 1 of 2 of a promised introduction to Google ranking. As usual I’m disappointed in the way Google maintains what to me is a pretense of transparency while using some very ruthless and mysterious tactics to downrank sites they claim don’t meet quality guidelines. Google (correctly) sees themselves as warring with spammers for control of the web but (incorrectly) thinks transparency is the wrong approach in this fight.

There were some rumblings last year about contacting webmasters directly about site problems, but my understanding is that this would cover only a tiny fraction of the total sites under penalty. Of course, with so little transparency in this area we can’t know the real numbers.

I hope Amit’s second post is a LOT more specific, because I think he’s already practicing the kind of oblique speak that is becoming commonplace when many from Google talk about ranking:

Amit:
No discussion of Google’s ranking would be complete without asking the common – but misguided! 🙂 – question: “Does Google manually edit its results?” Let me just answer that with our third philosophy: no manual intervention.

That statement is false, and he should not say it.   He does try to clarify later in the post:

I should add, however, that there are clear written policies for websites recommended by Google, and we do take action on sites that are in violation of our policies or for a small number of other reasons (e.g. legal requirements, child porn, viruses/malware, etc).

Action? Yes, of course he means the *manual intervention* he said above does not happen. Google has a right to pull sites out of the rankings, though it is annoying how much they talk about NOT manually intervening when they do it. Because there is no transparency, nobody outside of Google knows how often they manually intervene. Amit makes it sound like it’s only for horrors like child porn or malware, but note that the use of inappropriate “SEO” tactics such as hidden text can get you removed and even banned from the Google index. Unfortunately for small sites – say, “Aunt Sally’s House of Knitting” – Aunt Sally may have no idea her webmaster is using these tactics. How often does this happen? My guess is that hundreds of thousands of legitimate sites are ranked very improperly due to technical penalties, but with no transparency (and probably no measure of this at Google) nobody knows.

The big Google problem is that the policies for algorithmic downranking are not *clear enough*.  Many SEO companies prey on this lack of transparency, ironically often using Google’s mystique to lure unsuspecting businesses into expensive “optimization” schemes that don’t work or can get them seriously penalized.

Part of Google’s search algorithm philosophy is that they don’t share details because spammers would exploit them before honest people could benefit. Although a weak case can be made for this idea, a better one is that in non-transparent systems dishonest folks do *better*, because they invest more energy in finding the loopholes. Inbound linking, for example – a very hot SEO topic last year at SES San Jose – has complex rules that nobody outside of Google understands. Linking between sites in an information network can be advantageous or it can be penalized, depending on whether Google (rather than the community or webmaster) sees the practice as manipulative of the algorithm or as user-friendly and thus allowable.

Amit – a clear policy is one where the webmaster will know, rather than guess, what they are doing to annoy the Google algorithm or the manual intervention folks.

A pretty good source of information about how to approach site architecture for optimal ranking is Matt Cutts’ SEO-related posts here.

Although Matt won’t give out much about the algorithmic penalties that create so much of the Google confusion and frustration for established websites, if you follow Google’s guidelines and Matt’s posts on SEO you are unlikely to have serious problems with ranking. Of course, unless you work to optimize a new website you will have the *standard problems* with ranking, since your competition is probably doing basic SEO on their site. I’d argue (along with many SEO folks) that the best way to enter things this late in the game and hope for good ranks is with a topical *blog* to support your website. Start with several posts about your general area of business, using a lot of the terminology people would use to find your website, and add posts regularly.

I’ll be covering the SES San Jose Search Conference and expect to hear a lot more debate about the issue of transparency, blogging, and SEO.

SEO Pseudo Alert: Google Crawling Flash


For many years anybody who knew anything about search engine optimization (“SEO”) would scoff at the idea of using more than a minimal amount of Flash in a website, because those Flash elements were largely invisible to search engines – most notably Google – and therefore sites that used Flash would often rank lower than others simply because Google could not recognize the Flash parts of their content.

Designers like Flash because it offers a very dynamic and attractive way to present information, even though it is image rich and context poor. That content was effectively invisible to Google – at least until today’s announcement that they have figured out a way to index Flash.

Although this is great news for the millions of sites using Flash, which will now probably enjoy somewhat better rankings as their Flash content and navigation (link structure) is better indexed by Google, I’d caution designers to keep avoiding Flash until this process is much better understood. I’d guess that one of the key defects of Flash sites – having navigation that is opaque to the Googlebot – will continue to be problematic even under the new system. A good designer can get much of the same “look and feel” as Flash with good use of images, art, and CSS (Cascading Style Sheets). From an SEO perspective, I think sites are still well advised to heed the best observation I’ve ever heard about SEO, from Matt Cutts at Google – almost certainly the world’s most authoritative expert on Google ranking and SEO:

“Googlebot is stupid”, said Matt, so you need to help it figure out your website.

I guess Googlebot is smarter now that it recognizes Flash, but Matt’s advice is still very relevant, and acting on it is frankly simpler than most people think. Here’s some SEO advice for newbies:

1) Think in terms of ranking *properly* rather than ranking higher than sites that are better than you are. If competitors are more relevant, think of ways to make *your site* and *your product* more relevant.

2) Research keywords (or just guess if you are lazy) and make a list of those that you want to rank well for.

3) Make sure your content is rich in the keywords you want to rank well for. Make sure your page titles use those keywords, and use a unique, keyword-rich title for each of your pages (see the small title-check sketch after this list). Make sure the content of the page is very relevant to the query – i.e., is this something that is going to help the reader along in their journey to enlightenment? If not, make it so!

4) Links, links, links. These are the mother’s milk of online success. Do not buy them; earn them, and get them from other sites in your network, sites of friends, etc. Establish relationships of relevance with others that will get them to link to your website. Avoid cheap linking schemes – as always, think in terms of what creates a valuable resource for your readers.

5) Blog. Blog more. Google appears to be ranking blog content favorably, and I predict they’ll need to do even more of this as blogs replace websites as the freshest and most relevant content on most topics.

Whether you are a mom and pop or a multinational, if you want to rank well online you should be blogging regularly about your topics.   When blogging, follow the rules 1-4 above.

6) Lower your monetary expectations. Making money online is much harder than people coming from the offline world expect. Even most Silicon Valley insiders generally make big money from only a handful of projects. The overwhelming majority of startups fail, often leaving the founders with nothing but the memory of hard work.

7) Raise your personal expectations.  The online world is fascinating, exploding in popularity and significance, and is where you need to be.  Get on board with a blog!
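As a follow-up to item 3 above, here is a small sketch (Python, regex-based and deliberately simplistic) that checks whether each page’s title actually contains one of your target keywords; the file names and keyword list are just examples:

```python
# Quick title audit: flag pages whose <title> contains none of the
# keywords you want to rank for. A real crawler and HTML parser would
# be more robust; this regex version is only a sketch.
import re

def title_of(html):
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else ""

def check_titles(pages, keywords):
    for path, html in pages.items():
        title = title_of(html)
        hits = [kw for kw in keywords if kw.lower() in title.lower()]
        print(f"{path}: '{title}' -> {'OK' if hits else 'MISSING keywords'}")

example_pages = {
    "index.html": "<html><title>Oregon Coast Travel Guide</title>...</html>",
    "birds.html": "<html><title>Untitled Page</title>...</html>",
}
check_titles(example_pages, ["Oregon Coast", "birding"])
```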

Links and SEO


From a search ranking perspective, links are one of a website’s top concerns – probably the most important concern, as linking often trumps content in determining where a site will place for search queries.

As always, a great source of SEO information is Matt Cutts’ blog over at Google, where a careful read of his SEO posts will bring you a lot of enlightenment about Google do’s and don’ts. His post of a few days ago was particularly interesting, as it deals with Google’s crackdown on paid links that try to pass PageRank. This is one of the most contentious topics in SEO and an area where I wish Google would be more transparent, since there are so many linking approaches that are not paid but may be questionable in Google’s eyes. The fact that they depend so much on reporting of paid links is also a problem, as it allows aggressive SEOs to “game the system” by selectively reporting competitors while creating complex, undetectable linking for their own sites.

However, my biggest concern about linking is not something Google can fix: even in the world of what Google views as legitimate, authority-passing links, strategic linking to “friend and associate” websites has largely replaced the early approach, where people worked simply to link to a great resource for the reader. As blogging has exploded in prominence and in linking importance this problem has become critical, and we now see that early, well-established blogs will outrank far better resources that have few incoming links simply because they are new. Ideally, the older resources would be better stewards and link out to the good new resources, but generally the stakes have become too high – links are now correctly seen as more valuable than advertising, and bloggers have become too reluctant to link to other resources unless there is some reciprocal benefit.

Google search transparency? You call that transparency?


Google does a lot of wonderful things, including many that people do not give this amazing company nearly enough credit for doing. These include mail, calendar, and document applications as well as great free search.

However, Google’s transparency goes out the window when it comes to open discussion of the incredible amount of collateral damage Google inflicts daily on websites – including many whose mom and pop owners never know their business has been displaced by clever SEO tactics from spammers as well as legitimate marketeers who understand the system well.

Udi Manber at Google suggests that they are working for better transparency in the rankings process but I’m sure not holding my breath.

Strategically, I believe Google continues to make a mistake here that is ultimately their great Achilles’ heel, though Microsoft and Yahoo have been so busy fumbling their online balls that they don’t seem to get that yet.

The idea is that transparency means sharing ranking secrets, and that sharing leads to abuse of the rules. Sure, there would be some of that, but it would be better to do a lot more to involve the online community in defining and policing spammy material, and also to be more responsive to webmasters who have questions about why their sites suddenly disappear from the rankings or – far more common and mysterious – are simply downranked to the degree that they no longer get Google traffic. This last penalty offers one of the few instances where Google actually comes very close to lying to webmasters, implying that if “your site appears in the index” you have no penalty, when in fact the downranking is severe and leads to almost no Google traffic. If you are an advanced SEO person you’ll have a sense of when the downrank penalty has hit, and in the best indication of how the lack of transparency backfires on Google, it is the top SEO marketers and spam experts who will immediately determine that they have penalties.

Mom and pop businesses are often hung out to dry with these penalties or – more often – simply ranked lower than they should be because they have failed to perform basic SEO on their websites, having no idea what SEO even means. Also common are sites that hire or associate with questionable SEOs (which constitute about 90% of all SEOs), not knowing that they have violated Google’s improved-but-still-too-ambiguous webmaster guidelines.

In fairness to Google, they have a huge scaling challenge with everything they do. Dealing with millions of sites and billions of queries can’t be handled with more than a tiny fraction of the effort going into manual solutions. However, this is what the socializing power of the internet is for: Digg, Wikipedia, and many other sites effectively police content quality without massive labor costs.

So, Udi, I’m thrilled you and Google are bringing more transparency to the process, but forgive my skepticism that Google will give more than lip service to a much broader, open discussion and correction of the many ways the ranking process has failed to deliver something that is really important: fairness.


Update:
My comment on this topic, left over at the most excellent Mr. Matt Cutts’ blog:

Matt, I really thought Udi’s post was probably too generic to be of practical help to most sites with problems. From the inside it probably appears that Google is bending over backwards to make absolutely sure almost no “innocent” sites get caught up in the SEO and spam crossfire, but in practice most sites now attempt SEO in some form, and many sites (and even companies) wind up damaged or destroyed without even knowing what hit them. The issue is the degree to which Google should share “what hit them.” Policy is to share nothing about algorithmic damage, and I think policy is still to define “being in the index” as “no penalty,” which totally confuses anybody outside of SEO and even many of us who understand SEO quite well.

It’s the classic collateral damage argument – Google thinks this is necessary to protect the Algorithm, but I think long term this is a mistake and Google should expand the system of communication and community so there is at least a better explanation of the severe downranking penalties that leave sites in the index but out of view.

Towards a solution? Next time you do quality team hires have the new people play webmaster for a month before you share any info with them – have them work some sites, try to communicate with support, etc. This might help bring the outside frustrations…inside.