Over at Marketing Pilgrim, an excellent resource from Andy Beal and friends, I’ve been giving Jordan a hard time about citing a radio industry study that (surprise!) shows that radio is awesomely effective. I see a ton of this in the travel sector, and these bogus studies that “prove” economic or advertising effectiveness are really starting to piss me off, because they abuse the correct notion that research is a great way to measure effectiveness. Ironically, you don’t even need any “cheating” in these industry-sponsored studies to get bad results, for the reasons I discuss with Jordan below:
Joe Duck Says:
Supported with funding provided from Radio industry companies
Studies by agencies like this generally *will not* publish anything but favorable things about radio. All such industry-sponsored research is therefore suspect.
Jordan McCollum Says:
Well, yes and no. They might not publish their studies that don’t have favorable results, but (I hope) they’re not screwing with their methodologies to produce results skewed in their favor in the experiments that they do publish.
And while increasing unaided recall by 450% is a pretty nice stat, it’s the only concrete, conclusive, across-the-board improvement found in the study. The other positives were significant for some brands studied and not others, with aggregate totals showing almost no change. That’s why I said that it can influence them, but that it might not be as effective for “well-established brands.”
Thank you for the comment, though. You’re right–you gotta follow the money.
Joe Duck Says:
Jordan, it’s a very slippery slope to use industry studies due to the selectivity, though it is really common. Interestingly, studies like this don’t have to screw with anything at all to create problems for people who want unvarnished truth. Assume, for example, that they did 3 excellent, methodologically sound studies on this topic and 2 of them indicated “zero increase in unaided recall.” The logical research conclusion would be to be skeptical of the recall claim, but if we only see the positive study we’ll draw the wrong conclusions. It’s rarely this cut and dried, and you rarely see industry studies with good sets of assumptions, so I’m suggesting that studies like this are better left on the shelf if you are building a quality marketing strategy. One should stick to research done by people or groups who will still be around regardless of the outcome of the research.
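To put a number on the selection effect I’m describing, here’s a quick hypothetical sketch (all the figures are made up for illustration, not taken from the radio study): suppose the true lift in unaided recall is zero, every study measures it with noise, and only studies that happen to show a positive result get published.

```python
import random

random.seed(42)

# Hypothetical illustration of publication bias: the assumed true effect
# is zero, each study measures it with random noise, and only studies
# showing a positive lift ever get published.
TRUE_LIFT = 0.0   # assumed true improvement in unaided recall (none)
NOISE = 50.0      # assumed study-to-study measurement noise (pct points)

all_studies = [TRUE_LIFT + random.gauss(0, NOISE) for _ in range(1000)]
published = [s for s in all_studies if s > 0]  # only positives see daylight

avg_all = sum(all_studies) / len(all_studies)
avg_published = sum(published) / len(published)
print(f"average of all studies:       {avg_all:+.1f}%")
print(f"average of published studies: {avg_published:+.1f}%")
```

Nobody fudged a single methodology here, yet the published average shows a large positive “effect” while the true effect is zero, which is exactly why a shelf full of only-favorable industry studies can’t be trusted.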