Is Lou Dobbs’ Head Going to Explode?

Update:  Dobbs is leaving CNN:

CNN’s Lou Dobbs’ personal crusade to rant about the plethora of problems with our bizarre immigration policy came to a fun “head” tonight as he berated, and then looked ready to jump out of his chair to strangle, the very composed Paul Waldman of Media Matters, a liberal/left media watchdog group.

The group studied stories on “Lou Dobbs Tonight” and suggests the obvious: Dobbs routinely crosses the line of reasoned journalism in his personal crusade to stem the tide of illegal immigration.

Waldman: “When was the last time you did a positive story about immigrants?”  Dobbs: “I don’t know.”

I’m more than tired of blowhards like Lou Dobbs, Bill O’Reilly, and Keith Olbermann, all of whom routinely discard good standards of quality journalism in favor of either bombastic nonsense or simplifications of complex issues.  These guys are not journalists – they are *entertainers*.  That is OK, but stop the pretense!  TV “news” is mostly garbage now, and we should all be very, very ashamed.

Online Abuse and Harassment: Where are the Rules?

I’m reposting from my WebGuild post about the Ariel Waldman case, where she is accusing Twitter of failing to enforce their Terms of Service over what Ariel says was a case of very bad harassment and abuse on Twitter:

Are there appropriate standards of conduct for social network communication, or does anything go in the wild west of social networks, Twitter, and blogging?

Ariel Waldman was the target of an online “stalker” who posted abusive comments about her via Twitter. She’s understandably upset about the harassment and posted a long note about getting no satisfaction from Twitter despite responses including a call with the Twitter CEO, who seemed to feel the case fell outside of Twitter’s responsibility.

I’m trying to get Twitter’s response to Ariel because I have a feeling their actions may hinge on a couple of twists that complicate what at first appears to be a clear-cut case of putting free speech – which should be protected at great cost – above threat speech, which is a plague on the online world and should be harshly policed by the online and offline community, including law enforcement.

The first issue is that Ariel blogs about some very “emotionally charged” topics with sexually charged language (though I saw no sign of what I would call abusive language in a quick scan of her blogs). However, Twitter may be thinking that censoring comments about her or her topics while keeping Ariel’s own material online would not be in keeping with some sort of fairness standard (I agree this would be a weak argument, based on Ariel’s description of the abuse).

The more relevant twist is that Ariel is the community manager of Pownce, a social microblogging site that is very much in direct competition with Twitter. Unless Ariel is certain that Pownce would handle this situation very differently from how Twitter is handling it, she really needs to explain why she is calling out Twitter so powerfully rather than making more general statements about how very lax online abuse standards are threatening the online social fabric.

This problem emerged very powerfully last year when Kathy Sierra, a prominent and excellent blogger, quit blogging entirely after several death threats against her. Although most of the community expressed outrage, an alarming number of prominent bloggers suggested that free speech issues trumped the death threats, and came irresponsibly close to supporting what they seemed to see as the right of harassers to threaten violence against others.

So it is important to make clear here that my personal view (which is not necessarily that of WebGuild) is that Twitter is wrong, as are any social networks that allow harassment of community members. Whatever tiny advantages we might gain in free speech from an “anything goes” policy are washed away as debate is stifled under the threat of virtual violence turning into real violence.

Update: Twitter Replies to Ariel

In their reply at GetSatisfaction, a customer resolution website, Twitter suggests that this case might be viewed differently by people if the comment stream was available. Presumably both Ariel and Twitter have a copy, so it should be published in the interests of fairness to everybody concerned.

Update 2:  Ariel’s Mom Checks in at her blog:

Mom Says:
May 22nd, 2008 at 10:31 pm

Yes, this is Ariel’s real mother. Those of you who are easily manipulated by media driven celebrity conspiracy theories or actually believe there is no such thing as integrity any longer will ignore this post. Too bad for you.

I am not here to comment on twitter, TOS, freedom of speech, the “sexiness” of ShakeWellBeforeUse or if Ariel is a c—. If I said she wasn’t, you wouldn’t believe me anyway.

I CAN attest to one thing. It IS a fact Ariel’s stalker has been after her for over 3 years beginning in her home town—before she had a high profile on the web. I have seen the physical evidence and know it to be threatening. Ariel did nothing to initiate this situation, the person in question is mentally unbalanced and deeply insecure. The person found out where she lived and made it known to her. Ariel has done everything within her power (talking to the person and friends of the person, police, legal advice, adjustment of lifestyle) to defuse the situation all to no avail. I had thought when she moved to the city, these attacks would end, but they have not. There is more than mere name calling going on. There is a history of vindictive harrassment. Whatever else you think about how she is handling it is your opinion, but she did NOT make this up.

Since I have known Ariel all her life I can tell you one thing. She plays by the rules. She does not manipulate people or situations for her own gain. And she is too smart to screw up her own reputation as a consultant in social media to try and play competing services against each other. All speculation on that account is ridiculous.

And Mom to Ariel: you could have told me you were going to blog this rather than let me randomly find out about it on my own.

In the Beginning …

A hypothesis I’ve been enjoying lately is that the universe has evolved from a single bit (or an “on or off” rule) that, in the beginning, was applied randomly to itself.  Everything has expanded from that and exists as information rather than particles and waves – or, put another way, our existence is best modelled as a set of information relationships rather than a collection of particles and waves.

I’m not sure how original this idea is (I think I read once that Feynman had speculated about the universe being driven by information bits rather than physical things), but I’m really liking the concept, as it seems consistent with most current ideas about how the world works, such as string theory and evolution, and perhaps even some forms of religious belief that don’t define things too narrowly.

So I’m saying that the universe is best thought of as a non-physical entity analogous to a computer program (but I do NOT mean it is a “simulation” from a giant mind or computer – I’m saying at the core of everything there is … nothing but a single rule).  Everything we experience is an expansion from a single rule that says an information bit can be either on or off (or 1/0, or whatever).  The physical reality / evolution / everything we all experience is as “real” as in any other modelling, but it is an imperfect vision of what makes it all tick, and when we eventually find the ultimate source it will be a single info bit.

That single “off or on” rule, when applied in random fashion, will lead to increasingly complex rule relationships.  Given enough time, many different “information associations” will emerge, including what we now call “matter” and “energy”.  In turn these matter and energy relationships evolve into … you, me, and the universe we currently experience.
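Purely as a toy illustration (not the actual model the hypothesis proposes), an elementary cellular automaton captures the flavor of this: one on/off update rule, a single “on” bit to start, and surprisingly complex structure emerging on its own. The rule number and grid size below are arbitrary choices of mine; rule 30 is a well-known example of a simple rule producing chaotic patterns.

```python
import random

def step(cells, rule):
    """Apply one elementary cellular-automaton rule to a row of bits.
    Each new bit depends only on a cell and its two neighbors (wrapping)."""
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=31, steps=15, rule=None, seed=0):
    """Start from a single 'on' bit and repeatedly apply one on/off rule.
    If no rule is given, pick one of the 256 possible rules at random."""
    random.seed(seed)
    if rule is None:
        rule = random.randrange(256)
    cells = [0] * width
    cells[width // 2] = 1  # "in the beginning": a single bit
    rows = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        rows.append(cells)
    return rule, rows

rule, rows = run(rule=30)
for row in rows:
    print("".join("#" if b else "." for b in row))
```

Running this prints a triangle of increasingly irregular structure from that single starting bit; swapping in a random rule shows that some rules die out while others generate persistent complexity.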

What I like about this notion is that it helps me come to grips with some of the most baffling aspects of modern physics.  The first is string cosmology, which suggests there could be an infinite number of alternate universes existing side by side with ours.  This seems a lot less fanciful if our existence is information driven.  In the same way you can run a chess program and this blog side by side without them having awareness of each other, we could be “running” our universe without awareness of our next-door neighboring universe.

The second reconciliation is with the electrons of an atom, which are not well defined as either particles or energy.  They “exist” in our models as a probability relationship with our observation and with the atom.   This is mind bending if you try to think about it in physical terms, but if *at the very core* the electron is a rule and not an object or wave then it seems to be much more accessible as a concept.

At first I’d thought this might do something to reconcile the religion vs. science debates, but I don’t think religious folks will be too happy with an origin this simple and … “godless” – though I’m cool with the idea that we can say God is the prime mover in the equation: the single on/off rule that, applied randomly to itself, started everything.

Google search transparency? You call that transparency?

Google does a lot of wonderful things, including many that people do not give this amazing company nearly enough credit for doing. These include mail, calendar, and document applications as well as great free search.

However, Google’s transparency goes out the window when it comes to open discussion of the incredible amount of collateral damage Google inflicts daily on websites – including many whose owners never know how their mom-and-pop business has been displaced by clever SEO tactics from spammers as well as legitimate marketers who understand the system well.

Udi Manber at Google suggests that they are working toward better transparency in the ranking process, but I’m sure not holding my breath.

Strategically, I believe Google continues to make a mistake here that ultimately is their great Achilles heel, though Microsoft and Yahoo have been so busy fumbling their online balls that they don’t seem to get that yet.

The idea is that transparency leads to sharing ranking secrets, and that leads to abuse of those rules. Sure, there would be some of that, but it would be better to do a lot more to involve the online community in the definition and policing of spammy material, and also to be more responsive to webmasters who have questions about why their sites suddenly disappear from the rankings or – far more common and mysterious – are simply downranked to the degree they no longer get Google traffic. This last penalty offers one of the few instances where Google actually comes very close to lying to webmasters, implying that when “your site appears in the index” you have no penalty, when in fact the downrank penalty is severe, leading to almost no Google traffic. If you are an advanced SEO person you’ll have a sense of the downrank penalty, but in the best indication of how the lack of transparency backfires on Google, it is the top SEO marketers and spam experts who will immediately determine that they have penalties.

Mom-and-pop businesses are often hung out to dry with these penalties or – more often – simply ranked lower than they should be because they have failed to perform basic SEO on their websites, having no idea what SEO even means. Also common are websites that hire or associate with questionable SEOs (which constitute about 90% of all SEOs), not knowing that they have violated Google’s improved-but-still-too-ambiguous webmaster guidelines.

In fairness to Google, they have a huge scaling challenge with everything they do.  Dealing with millions of sites and billions of queries means manual solutions can only ever cover a tiny fraction of the problem.  However, this is what the socializing power of the internet is for: Digg, Wikipedia, and many other sites effectively police content quality without massive labor costs.

So, Udi, I’m thrilled you and Google are bringing more transparency to the process, but forgive my skepticism that Google will give more than lip service to a much broader, open discussion and correction of the many ways the ranking process has failed to deliver something that is really important: fairness.


My comment on this topic, left over at the most excellent Mr. Matt Cutts’ blog:

Matt, I really thought Udi’s post was probably too generic to be of practical help to most sites with problems. From the inside it probably appears that Google is bending over backwards to make absolutely sure almost no “innocent” sites get caught up in the SEO and spam crossfire, but in practice most sites now attempt SEO in some form and many sites (and even companies) wind up damaged or destroyed without even knowing what hit them. The issue is the degree to which Google should share “what hit them”. Policy is to share nothing about algorithmic damage, and I think policy is still to define “being in the index” as “no penalty”, which totally confuses anybody outside of SEO and even many of us who understand SEO quite well.

It’s the classic collateral damage argument – Google thinks this is necessary to protect the Algorithm, but I think long term this is a mistake and Google should expand the system of communication and community so there is at least a better explanation of the severe downranking penalties that leave sites in the index but out of view.

Towards a solution? Next time you do quality team hires have the new people play webmaster for a month before you share any info with them – have them work some sites, try to communicate with support, etc. This might help bring the outside frustrations…inside.

MicroHooBook: A Case Study in Online Lexicographical Evolution

After rumors of a Microsoft Yahoo Facebook deal surfaced, I thought I’d cleverly coined the phrase “MicroHooBook” to describe the merger, and blogged a post with MicroHooBook in the title.  So understandably, today when I read Matt Ingram’s MicroHooBook post I first thought “Hey, Matt stole my cleverly coined word without even a link back!”.  But after a bit of Googling and timestamp research I learned he wrote his post a full hour or more before mine!  Yikes – he probably thinks I was the one who nabbed the term from him.  Good net citizen that I am, I immediately linked to Matt’s post and left a note in his comment section.

But wait…there’s more…..

It appears the first use of MicroHooBook happened here at The 463 by Sean Garrett (and I thought “Joe Duck” was a cryptic name for a mostly tech blog).  I wasn’t familiar with this blog or Sean, but he must be quite a sharp guy to have thought of MicroHooBook before Matt and I did, all probably independently – another example of how the internet is making literary lexicographical originality even harder than it used to be.

The good news about MicroHooBook?  As a term that was not used much, if at all, previously, it’s going to be a great little SEO case study for me.  This post, which uses the term often and links somewhat opportunistically to my own MicroHooBook post rather than to what some would see as the more deserving Matt or Sean posts, should soon appear at the top of the rankings for the term – perhaps correctly, because I sure am spending more time writing about this topic than the original MicroHooBookers.  MicroYaHookers?

Hey, I like “MicroYaHooker” better than MicroHooBook.  I’ll consider that my original contribution to the online lexicography … at least until I find somebody who already wrote it.

Update:  Google indicated “MicroYaHooker” is so original it’s not even a GoogleWhackBlatt yet…



Ground Zero from Fast Company’s offices in NYC


Robert Scoble has a very intriguing and tragic picture from Fast Company’s NY offices looking at Ground Zero.

The attack on the World Trade Center that killed almost 3,000 innocent Americans will be viewed as one of history’s most significant events, both as an unprecedented attack on America and as the catalyst that set the stage for the most expensive security and military buildup in history. Did all this spending prevent more attacks? I think probably yes. Was the return on this spending as great as alternatives such as improved infrastructure? Almost certainly not.

MicroHooBook rumors are very probably false. A test of the non-Emergency Blogcasting System?

I thought I’d coined “MicroHooBook” but Matt had done that a full hour before.

Just a moment, just a moment…. looks like The 463 had it before Matt.   Originality sure isn’t what it used to be…

Microsoft is certainly working with Yahoo now to try to buy a piece of the company rather than the whole – Microsoft announced that over the weekend.  Most think they want to buy the search component of Yahoo, and that Yahoo may sell because if they don’t, Carl Icahn will force a proxy fight that he will probably win, having already bought or lined up about 30% of the votes/shares in his favor.

But John Furrier “broke” the rumor that as soon as they had Yahoo search, MS would snap up Facebook for $15-20 billion.  I think this rumor is speculation and nothing more, and I’m even thinking this was something of a test of the non-emergency blogcasting system, which generally delivers misleading information even faster than the truth.

John Furrier and Robert Scoble are both clever guys, which is why I’m a bit suspicious they have cooked up the MicroHooBook rumor to test TechMeme and how the blogosphere reacts to unfounded rumors.

As usual, the blogosphere loves unfounded and unverified rumors, and this is the key tech blog story for Monday, May 19.

I think Sarah Lacy has this right, and she’s got more of an inside track to Facebook than most reporters.

Blog Revolution Note XXIV

At SoundBiteBlog I stumbled onto (or rather twitter-comment-followed my way to) an excellent post about how much the poisonous, ranting writing styles of many blogs help them succeed.  The author wonders if nice blogs can finish first …

The short answer is “sure”.  A good example is Matt Cutts at Google, who rarely has a bad word to say about anybody on his blog yet runs one of the most-read technical resources on the internet for Google search issues.  Fred Wilson’s A VC is also a blog with heavy readership and a friendly tone.  Marc Andreessen’s is another, and there are many, many more.

However, I think the key blogging success issue is ranking, and there are many ranking problems in blogging paradise.  Blogs that rank well will be read more often and in turn will confer more rank via linking, so the *linking style* of most of the old-timer blogs has really inhibited the broader conversation.  The best posts about any given topic are rarely by A-list blogs anymore, but these posts are rarely seen because the ranking structure favors older, more-linked blogs over those with less Google authority.

The old authority models work much better for websites – where high ranks for a general category make sense – than for blogging, where authors tend to cover a lot of topics.  TechCrunch will appear with a higher rank than almost any other blog if a technology topic is covered, even if their coverage is weak, wrong, or misguided.  A thoughtful and well-researched post about a critical topic is unlikely to surface if it is written by an “outsider” and escapes the RSS feed of somebody prominent, or sometimes even if linking to that post is seen by the “A-lister” as giving a potential competitor too much free juice.  Note how “up and coming” tech bloggers like Mathew Ingram link generously, while most A-list blog writers – who are now often hired writers, paid to be seen as a key breaking source of news – are far less likely to cite other blogs.  Ironically, I think success has really diminished some formerly great blogs.  John Battelle is one of the most thoughtful writers on the web, but now he’s way too busy with Federated Media to keep Searchblog as lively as it once was.

Google and other aggregators (like TechMeme) in part use metrics similar to Google’s PageRank to define TechCrunch as more reliable, because it has more incoming links, more history on the topic, and more commenting activity.  This is not a *bad* way to rank sites, but it tends to miss many high-quality, reflective articles from sources who do not actively work the system.
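To see why link-based authority metrics favor established blogs, here is a minimal power-iteration PageRank sketch. The algorithm itself is the published one; the link graph and all the blog names in it are invented for illustration. Many small blogs link to an “A-lister” pair that links only within itself, so the pair ends up with most of the rank regardless of per-post quality:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power-iteration PageRank over {page: [outbound links]}."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Each page keeps a small "teleport" share, then receives
        # a share of rank from every page that links to it.
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Hypothetical link graph: small blogs link out to the A-list cluster,
# which mostly links within itself.
graph = {
    "alister":    ["alister2"],
    "alister2":   ["alister"],
    "newblog":    ["alister"],
    "smallblog1": ["alister", "newblog"],
    "smallblog2": ["alister"],
    "smallblog3": ["alister"],
}
ranks = pagerank(graph)
```

Even if "newblog" publishes the best post on a topic, its computed authority stays far below the A-list cluster, which is exactly the dynamic the paragraph above describes.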

Solutions?  I still think a blog revolution is needed more than ever to re-align quality writing and new bloggers with the current problematic ranking systems. 

In terms of the ranking algorithms I’m not sure how to fix things, though I think Gabe should use more manual intervention to surface good stuff rather than just have TechCrunch dominate TechMeme even when their coverage is spotty and weird.  I’m increasingly skeptical that TechMeme is surfacing the best articles on a topic – rather, it seems to give too much authority to a handful of prominent but superficial stories.  As others link to and discuss those stories, we have only the echo of a smart conversation.

I don’t spend enough time searching Technorati to know if they are missing the mark or not, but I like the fact that they are very inclusive.  However, like Google and, I think, TechMeme, Technorati has trouble surfacing content that is highly relevant and high quality but not “authoritative”.

For their part, Google needs to do more to bring blog content into the web search results.  Last year at SES, Matt Cutts explained to me that they are doing more of this than ever, and I’m sympathetic to the fact that fresh content in the SERPs will lead to spamming problems, but I find that I often get more relevant results from a blog search at Google than from a regular search.  This is more the case for breaking news or recent events, but it has even happened for research topics where the blog search has led me to expertise I don’t find in the web listings.