In the Beginning …


A hypothesis I’ve been enjoying lately is that the universe has evolved from a single bit (or an “on or off” rule) that, in the beginning, was applied randomly to itself. Everything has expanded from that and exists as information rather than particles and waves – or, put another way, our existence is best modelled as a set of information relationships rather than as a collection of particles and waves.

I’m not sure how original this idea is (I think I read once that Feynman had speculated about the universe being driven by information bits rather than physical things), but I’m really liking the concept as it seems to be consistent with most of the current ideas about how the world works, such as string theory and evolution, and perhaps even with some forms of religious belief that don’t define things too narrowly.

So I’m saying that the universe is best thought of as a non-physical entity analogous to a computer program (but I do NOT mean it is a “simulation” run by a giant mind or computer – I’m saying at the core of everything there is … nothing but a single rule). Everything we experience is an expansion from a single rule that says an information bit can be either on or off (or 1 / 0 or whatever). The physical reality / evolution / everything we all experience is as “real” as in any other modelling, but it is an imperfect vision of what makes it all tick, and when we eventually find the ultimate source it will be a single info bit.

That single “off or on” rule, when applied in random fashion, will lead to increasingly complex rule relationships. Given enough time, many different “information associations” will emerge, including what we now call “matter” and “energy”. In turn these matter and energy relationships evolve into … you, me, and the universe we currently experience.
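
Just to make that intuition a bit more concrete, here is a toy sketch (my own construction in Python, not anything from the physics literature) of a single on/off rule applied randomly to its own output. The count of distinct bit patterns is a crude stand-in for the “increasingly complex relationships” I’m talking about.

```python
import random

def evolve(steps=2000, seed=1):
    """Start from a single 'on' bit and repeatedly apply a random
    on/off (XOR-style) rule to bits already in the pool."""
    random.seed(seed)
    bits = [1]  # the single starting bit
    for _ in range(steps):
        a, b = random.choice(bits), random.choice(bits)
        # the rule is applied "randomly to itself": combine two existing
        # bits, then randomly flip the result on or off
        bits.append(a ^ b ^ random.randint(0, 1))
    return bits

def distinct_patterns(bits, width=8):
    """Count distinct fixed-width bit patterns as a crude complexity measure."""
    return len({tuple(bits[i:i + width]) for i in range(len(bits) - width + 1)})

if __name__ == "__main__":
    history = evolve()
    print("bits generated:", len(history))
    print("distinct 8-bit patterns:", distinct_patterns(history))
```

It’s only a cartoon, of course, but it shows how a rule with no content beyond “on or off”, applied at random, quickly generates structure that was never specified anywhere in advance.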

What I like about this notion is that it helps me come to grips with some of the most baffling aspects of modern physics. The first is string cosmology, which suggests there could be an infinite number of alternate universes existing side by side with ours. This seems a lot less fanciful if our existence is information driven. In the same way that you can run a chess program and this blog side by side without either having awareness of the other, our universe could be “running” without any awareness of the neighboring universe next door.

The second reconciliation is with the electrons of an atom, which are not well defined as either particles or energy. They “exist” in our models as a probability relationship with our observation and with the atom. This is mind bending if you try to think about it in physical terms, but if *at the very core* the electron is a rule and not an object or wave, then it seems much more accessible as a concept.

At first I’d thought this might do something to reconcile the religion vs science debates, but I don’t think religious folks will be too happy with an origin this simple and … “godless”. Still, I’m cool with the idea that we can say God is the prime mover in the equation – the single on/off rule that, applied randomly to itself, started everything.

Google search transparency? You call that transparency?


Google does a lot of wonderful things, including many that people do not give this amazing company nearly enough credit for doing. These include mail, calendar, and document applications as well as great free search.

However, Google transparency goes out the window when it comes to open discussion of the incredible amount of collateral damage Google inflicts daily on websites – including many whose owners never know how their mom and pop business has been displaced by clever SEO tactics from spammers as well as from legitimate marketers who understand the system well.

Udi Manber at Google suggests that they are working toward better transparency in the ranking process, but I’m sure not holding my breath.

Strategically, I believe Google continues to make a mistake here that could ultimately be their great Achilles’ heel, though Microsoft and Yahoo have been so busy fumbling their online balls that they don’t seem to get that yet.

Google’s rationale is that transparency means sharing ranking secrets, and that sharing leads to abuse of those rules. Sure, there would be some of that, but it would be better to do a lot more to involve the online community in the definition and policing of spammy material, and also to be more responsive to webmasters who have questions about why their sites suddenly disappear from the rankings or – far more common and mysterious – are simply downranked to the degree they no longer get Google traffic.

This last penalty offers one of the few instances where Google actually comes very close to lying to webmasters, implying that when “your site appears in the index” you have no penalty, when in fact the downrank penalty is severe, leading to almost no Google traffic. If you are an advanced SEO person you’ll have a sense of the downrank penalty, but in the best indication of how the lack of transparency backfires at Google, it is the top SEO marketers and spam experts who will immediately determine that they have penalties.

Mom and pop businesses are often hung out to dry with these penalties or – more often – simply ranked lower than they should be because they have failed to perform basic SEO on their websites, often because they have no idea what SEO even means. Also common are sites that hire or associate with questionable SEOs (which constitute about 90% of all SEOs), not knowing that they have violated Google’s improved-but-still-too-ambiguous webmaster guidelines.

In fairness to Google, they do have a huge scaling challenge with everything they do. Dealing with millions of sites and billions of queries means only a tiny fraction of the effort can go into manual solutions. However, this is what the socializing power of the internet is for. Digg, Wikipedia, and many other sites effectively police content quality without massive labor costs.

So Udi, I’m thrilled you and Google are bringing more transparency to the process, but forgive my skepticism that Google will give more than lip service to a much broader, open discussion and correction of the many ways the ranking process has failed to deliver something that is really important: fairness.

 

Update:
My comment about this topic, left over at the most excellent Mr. Matt Cutts’ blog:

Matt, I really thought Udi’s post was probably too generic to be of practical help to most sites with problems. From the inside it probably appears that Google is bending over backwards to make absolutely sure almost no “innocent” sites get caught up in the SEO and spam crossfire, but in practice most sites now attempt SEO in some form, and many sites (and even companies) wind up damaged or destroyed without even knowing what hit them. The issue is the degree to which Google should share “what hit them”. Policy is to share nothing about algorithmic damage, and I think policy is still to define “being in the index” as “no penalty”, which totally confuses anybody outside of SEO and even many of us who understand SEO quite well.

It’s the classic collateral damage argument – Google thinks this is necessary to protect the Algorithm, but I think long term this is a mistake, and Google should expand the system of communication and community so there is at least a better explanation of the severe downranking penalties that leave sites in the index but out of view.

Towards a solution? Next time you do quality team hires, have the new people play webmaster for a month before you share any info with them – have them work some sites, try to communicate with support, etc. This might help bring the outside frustrations … inside.

MicroHooBook: A Case Study in Online Lexicographical Evolution


After rumors of a Microsoft Yahoo Facebook deal surfaced, I thought I’d cleverly coined the phrase “MicroHooBook” to describe the merger, and blogged a post with MicroHooBook in the title. So understandably, when I read Matt Ingram’s MicroHooBook post today, I first thought “Hey, Matt stole my cleverly coined word without even a link back!” But after a bit of Googling and timestamp research I learned he wrote his post a full hour or more before mine! Yikes – he probably thinks I was the one who nabbed the term from him. Good net citizen that I am, I immediately linked to Matt’s post and left a note in his comment section.

But wait…there’s more…..

It appears the first use of MicroHooBook happened here at The 463 by Sean Garrett (and I thought “Joe Duck” was a cryptic name for a mostly tech blog). I wasn’t familiar with this blog or with Sean, but he must be quite a sharp guy to have thought of MicroHooBook before Matt, and then I, thought of it – all probably independently, in another example of how the internet is making literary lexicographical originality even harder than it used to be.

The good news about MicroHooBook? As a term that was not used much, if at all, previously, it’s going to be a great little SEO case study for me. This post, which uses the term often and links somewhat opportunistically to my own MicroHooBook post rather than to what some would see as the more deserving Matt or Sean posts, should soon appear at the top of the rankings for the term – perhaps correctly, because I sure am spending more time writing about this topic than the original MicroHooBookers. MicroYaHookers?

Hey, I like “MicroYaHooker” better than MicroHooBook. I’ll consider that my original contribution to the online lexicography … at least until I find somebody who already wrote it.

Update:  Google indicated “MicroYaHooker” is so original it’s not even a GoogleWhackBlatt yet…

 

 

Ground Zero from Fast Company’s offices in NYC




[Photo: Ground Zero from Fast Company’s offices in NYC, originally uploaded by Robert Scoble]

Robert Scoble has a very intriguing and tragic picture taken from Fast Company’s NY offices looking out at Ground Zero.

The attack on the World Trade Center that killed almost 3,000 innocent people will be viewed as one of history’s most significant events, both as an unprecedented attack on America and as the catalyst that set the stage for the most expensive security and military buildup in history. Did all this spending prevent more attacks? I think probably yes. Was the return on this spending as great as alternatives such as improved infrastructure? Almost certainly not.

MicroHooBook rumors are very probably false. A test of the non-Emergency Blogcasting System?


I thought I’d coined “MicroHooBook”, but Matt had done that a full hour before.

Just a moment, just a moment … it looks like The 463 had it before Matt. Originality sure isn’t what it used to be…

Microsoft is certainly working with Yahoo now to try to buy a piece of the company rather than the whole – Microsoft announced that over the weekend. Most think they want to buy the search component of Yahoo, and that Yahoo may sell because if they don’t, Carl Icahn will force a proxy fight that he will probably win, having already bought or lined up about 30% of the votes/shares in his favor.

But John Furrier “broke” the rumor that as soon as they had Yahoo search, MS would snap up Facebook for $15-20 billion. I think this rumor is speculation and nothing more, and I’m even thinking this was something of a test of the non-emergency blogcasting system, which generally delivers misleading information even faster than the truth.

John Furrier and Robert Scoble are both clever guys, which is why I’m a bit suspicious they have cooked up the MicroHooBook rumor to test TechMeme and how the blogosphere reacts to unfounded rumors.

As usual, the blogosphere loves unfounded and unverified rumors, and this is the key tech blog story for Monday, May 19.

I think Sarah Lacy has this right, and she’s got more of an inside track to Facebook than most reporters.

Blog Revolution Note XXIV


At SoundBiteBlog I stumbled upon (or rather Twitter-comment-followed my way to) an excellent post about how much the poisonous, ranting writing styles of many blogs help them succeed. The author wonders if nice blogs can finish first …

The short answer is “sure”. A good example is Matt Cutts at Google, who rarely has a bad word to say about anybody on his blog yet runs one of the most-read technical resources on the internet for Google search issues. Fred Wilson’s A VC is also a blog with heavy readership and a friendly tone. Marc Andreessen at blog.pmarca.com is another, and there are many, many more.

However, I think the key blogging success issue is ranking, and there are many ranking problems in blogging paradise. Blogs that rank well will be read more often and in turn will confer more rank via linking, so the *linking style* of most of the old-timer blogs has really inhibited the broader conversation. The best posts about any given topic are rarely by A-list blogs anymore, but those posts are rarely seen because the ranking structure favors older, more-linked blogs over those with less Google authority.

The old authority models work much better for websites – where high ranks for a general category make sense – than for blogging, where authors tend to cover a lot of topics. TechCrunch will appear with a higher rank than almost any other blog whenever a technology topic is covered, even if their coverage is weak, wrong, or misguided. A thoughtful and well researched post about a critical topic is unlikely to surface if it is written by an “outsider” and escapes the RSS feed of somebody prominent, or sometimes even if linking to that post is seen by the “A lister” as giving a potential competitor too much free juice.

Note how “up and coming” tech bloggers like Mathew Ingram link generously, while most A-list blog writers – who are now often hired writers, paid to be seen as a key breaking source of news – are far less likely to cite other blogs. Ironically, I think success has really diminished some formerly great blogs. John Battelle is one of the most thoughtful writers on the web, but now he’s way too busy with Federated Media to keep Searchblog as lively as it once was.

Google and other aggregators (like TechMeme) in part use metrics similar to Google PageRank to define TechCrunch as more reliable because it has more incoming links, more history on the topic, and more commenting activity. This is not a *bad* way to rank sites, but it tends to miss many high quality, reflective articles from sources who do not actively work the system.
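
To illustrate the mechanics, here is a minimal PageRank-style sketch (my own illustration, not Google’s or TechMeme’s actual code) showing how raw link counts can swamp quality: the hypothetical “newcomer” blog below could have the best post on a topic and still rank far below a heavily linked incumbent.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Returns an approximate authority score for every page."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling page: its rank mass is simply dropped in this sketch
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

if __name__ == "__main__":
    # Hypothetical link graph: ten small blogs link to "techcrunch",
    # and only one of them also links to "newcomer", whose post may well be better.
    graph = {"techcrunch": [], "newcomer": []}
    graph.update({f"blog{i}": ["techcrunch"] for i in range(10)})
    graph["blog0"] = ["techcrunch", "newcomer"]
    for page, score in sorted(pagerank(graph).items(), key=lambda x: -x[1]):
        print(f"{page:12s} {score:.3f}")
```

The incumbent ends up with many times the newcomer’s score no matter how good the newcomer’s individual posts are, which is roughly the problem I’m describing.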

Solutions?  I still think a blog revolution is needed more than ever to re-align quality writing and new bloggers with the current problematic ranking systems. 

In terms of the ranking algorithms, I’m not sure how to fix things, though I think Gabe should use more manual intervention to surface good stuff rather than just have TechCrunch dominate TechMeme even when their coverage is spotty and weird. I’m increasingly skeptical that TechMeme is surfacing the best articles on a topic – rather, it seems to give too much authority to a handful of prominent but superficial stories. As others link to and discuss those stories, we get only the echo of a smart conversation.

I don’t spend enough time searching Technorati to know if they are missing the mark or not, but I like the fact that they are very inclusive. However, like Google and, I think, TechMeme, Technorati has trouble surfacing content that is highly relevant and high quality but not “authoritative”.

For their part, Google needs to do more to bring blog content into the web search results. Last year at SES, Matt Cutts explained to me that they are doing more of this than ever, and I’m sympathetic to the fact that bringing fresh content into the SERPs will lead to spamming problems, but I’m finding that I often get more relevant results from a blog search at Google than from a regular search. This is more often the case for breaking news or recent events, but it has even happened for research topics, where the blog search has led me to expertise I don’t find in the web listings.

Bullfrog Stew in Beijing



[Photo: Bullfrog Stew 158, originally uploaded by JoeDuck]

Bullfrog Stew was one of the more interesting dishes we sampled in Beijing. It was tasty – a spicy stew with the bullfrog meat very soft and tender. The tiny bones, however, made it kind of hard to really savor.

Pictured is Kevin Wu, my friend and Medford Oregon Dentist.

Guardian UK Climate Changers


In January the Guardian UK listed fifty people who can help save the planet. I was very encouraged to see Bjorn Lomborg on this list, as he’s one of the few well informed and rational voices in the global warming debate. Lomborg’s simple, obvious, and common-sense argument is that we are failing to prioritize our time and treasure as we deal with global challenges like Global Warming, health, and poverty. He’d like to see us devote more resources to the most pressing problems and fewer to the least pressing, suggesting that although Global Warming will cause problems, it is very unlikely that catastrophe is looming.

Perhaps ironically, the next person on the list is climatologist Gavin Schmidt of the RealClimate.org blog – my favorite source for spirited debate about climate change. Gavin is one of several well connected scientists who participate regularly at that blog, along with a group of moderately informed yet rabid commenters who are very quick to attack as “denialists” any blog participants who suggest a departure from the prevailing party line on climate change. A friend noted to me recently that climate change has become the new religion of the inquisition, where heretics are verbally burned at the stake – usually by those who are not particularly well informed – for suggesting even obvious problems such as the many defects in current global climate computer models.

As somebody trained in science, my biggest concern remains the reluctance (refusal?) of climate scientists to define their work in ways that allow much if any falsifiability – the key mainstay of all modern science. You’ll be very hard pressed to find many climate modellers say “if we find [insert any measurable phenomenon here], then our assumptions about warming are misguided”. Unlike most conventional science, where falsifiability is king and politics is left at the door, the climate community has a political component that is coloring the perception of the scientists. I rarely hear scientists challenge the hysterical assertions that climate change will lead to catastrophic conditions soon. Since the science does not suggest we have catastrophe looming, why this failure to comment more thoroughly and responsibly on the issue? I think most of this comes from the assumption that reducing pollution is so important that it’s OK to mislead the public into thinking warming catastrophes are looming when in fact they are not. Watch “An Inconvenient Truth”, a movie largely supported as factual by the climate community, and then read the critiques of the film’s examples.

Many in the climate field bristle at the notion that they have a vested interest in “hype” thanks to over $5,000,000,000 in annual grants for climate research, but clearly feeding your kids plays a role in most human opinions, and scientific opinion is no exception to this.

The list *should* include Steve McIntyre, creator of the blog www.ClimateAudit.org, created in some ways to counter the dramatic level of omission of relevant information and participation that characterizes RealClimate.org. McIntyre is a mathematician and amateur scientist who is making quite a name for himself by replicating tree ring studies and challenging some questionable practices in the climate change community.

From the Guardian:
Bjørn Lomborg
Statistician

Bjørn Lomborg, 42, has become an essential check and balance to runaway environmental excitement. In 2004, the Dane made his name as a green contrarian with his bestselling book The Skeptical Environmentalist, and outraged scientists and green groups around the world by arguing that many claims about global warming, overpopulation, energy resources, deforestation, species loss and water shortages are not supported by analysis. He was accused of scientific dishonesty, but cleared his name. He doesn’t dispute the science of climate change, but questions the priority it is given. He may look increasingly out of step, but Lomborg is one of the few academics prepared to challenge the consensus with credible data.

Gavin Schmidt
Climatologist

Gavin Schmidt, 38 and British, is a climate modeller at the Nasa Goddard Institute for Space Studies in New York. He founded RealClimate.org with colleagues in 2004. Offering “climate science from climate scientists”, the site has quickly become a must-read for interested amateurs, and a perfect foil to both the climate sceptic misinformation that saturates sections of the web and the overexcitement of the claims of some environmentalists. Unapologetically combative, technical and high-brow, the site and its contributors – essentially blogging in their spare time – nail the myth that scientists struggle to communicate their work. Whenever a major flaw is pointed out in the global consensus on climate change, or new evidence is discovered to blame it on the sun, it is always worth checking RealClimate. The site has a policy of not getting dragged into the political or economic aspects of science, but it’s fairly easy to guess which side it’s on.

Update to make my case: RealClimate’s response to the new Hurricane study that suggests the link between Hurricanes and Global Warming has been exaggerated shows how – to my way of thinking – they have little if any interest in falsifiability. RC seems to frequently highlight even anecdotal evidence supporting their view but critically rejects even well researched, peer reviewed studies that suggest things they don’t want to hear. Is rejecting the alternative hypothesis because it does not suit your beliefs science … or is it religion?