MPM Toolkit, by David Raab<br /><br /><span style="font-weight: bold;">New Study, Same Results: Minority of Companies Do Effective Marketing Performance Measurement</span> (October 11, 2009)<br /><br />The latest addition to my collection of surveys about marketing measurement is <a href="http://www.marketingperformanceadvantage.com/">The Marketing Performance Advantage</a>, a joint effort from strategic marketing consultants <a href="http://www.cmgpartners.com/">CMG Partners</a> and market researcher <a href="http://www.cmbinfo.com/">Chadwick Martin Bailey</a>. Based on 400 online interviews with CFOs, CEOs and marketing employees of companies with 100+ employees, this is one of the larger and more sophisticated studies on the topic.<br /><br /><span style="font-weight: bold;">One-Quarter of Companies Measure Marketing Performance Effectively</span><br /><br />The main finding is that about one-quarter of marketers feel they do an adequate job of measurement. This matches other studies on the topic. 
The survey asked several questions along those lines:<br /><br />- 20% say their company "excels" at measurement<br />- 22% "excel" at using measurement-based insights to drive improvement<br />- 24% see a positive impact from measurement, and<br />- 27% have fully integrated measurement into marketing planning<br /><br />The study also resembled other research in showing that many more marketers list measurement as a top priority (44%) than actually do it.<br /><br /><span style="font-weight: bold;">Marketing VPs Are More Satisfied with Measurement Than Anyone Else</span><br /><br />One intriguing detail was that senior marketers seem eerily "overconfident" (the authors' word) compared with those above and below them in the organization.<br /><br />- 13% of marketing vice presidents consider marketing performance measurement a "huge challenge", compared with 34% to 38% of CEOs, marketing directors and marketing managers, and 61% of CFOs.<br /><br />- 38% of marketing vice presidents felt that measurement has a "huge impact" on their business, compared with 15% to 29% of CEOs, marketing directors and marketing managers, and 7% of CFOs.<br /><br /><em>How well is your organization performing with respect to measuring the performance of marketing initiatives? 
How well are you using insights to improve the performance of marketing initiatives?</em><br /><table border="1" cellpadding="0" cellspacing="0"><tbody><tr><td valign="top" width="150">This is a huge challenge...</td><td valign="top" width="107">CEO</td><td valign="top" width="61">CFO</td><td valign="top" width="106">VP Marketing</td><td valign="top" width="106">Director Marketing</td><td valign="top" width="106">Manager Marketing</td></tr><tr><td valign="top" width="150">Measuring MP </td><td valign="top" width="107">36% </td><td valign="top" width="61">61% </td><td valign="top" width="106">13%</td><td valign="top" width="106">38%</td><td valign="top" width="106">34%</td></tr><tr><td valign="top" width="150">Improving MP </td><td valign="top" width="107">28% </td><td valign="top" width="61">61% </td><td valign="top" width="106">9%</td><td valign="top" width="106">38%</td><td valign="top" width="106">38%</td></tr></tbody></table><br /><em>To what extent, if at all, has measuring the performance of your marketing initiatives improved your business?</em><br /><table width="654" border="1" cellpadding="0" cellspacing="0"><tbody><tr><td valign="top" width="154">Impact of MP on your business</td><td valign="top" width="107">CEO</td><td valign="top" width="65">CFO</td><td valign="top" width="109">VP Marketing</td><td valign="top" width="109">Director Marketing</td><td valign="top" width="109">Manager Marketing</td></tr><tr><td valign="top" width="154">No impact </td><td valign="top" width="107">29%</td><td valign="top" width="65">52%</td><td valign="top" width="109">0% </td><td valign="top" width="109">21%</td><td valign="top" width="109">21%</td></tr><tr><td valign="top" width="154">Neutral </td><td valign="top" width="107">42%</td><td valign="top" width="65">41%</td><td valign="top" width="109">62%</td><td valign="top" width="109">64%</td><td valign="top" width="109">50%</td></tr><tr><td valign="top" width="154">Huge Impact </td><td valign="top" width="107">29%</td><td 
valign="top" width="65">7%</td><td valign="top" width="109">38%</td><td valign="top" width="109">15%</td><td valign="top" width="109">29%</td></tr></tbody></table><br />Although the authors don't make the connection, these results help to explain why more money isn't invested in marketing measurement: the marketing vice presidents who control the purse strings are the least convinced they have a major problem.<br /><br /><span style="font-weight: bold;">Barriers to Measurement: Data, Technology and Process</span><br /><br />The survey also asked about major barriers to marketing measurement. These include all the usual suspects. If anything, the intriguing result is how small an issue executive support is compared with the others:<br /><br /><span style="font-style: italic;">Barriers to improvement (% answering 1-4 on a scale of 1-10):</span><br /><br />- 40% collecting the right data<br />- 40% technology/systems<br />- 39% clear & effective processes<br />- 36% use of customer analytics<br />- 36% organizational alignment<br />- 26% skill sets<br />- 20% senior level buy-in<br /><br /><span style="font-weight: bold;">Effective Companies Have a Clear Process to Apply Measurements, Invest in Measurement Capabilities and Hold Marketing Accountable for Results</span><br /><br />Another set of questions covered adoption of best practices, and compared answers from companies reporting positive impact from marketing measurement with answers from the others. The biggest differences were in having clear processes to ensure that measurement-based insights are applied to decisions; the next tier included targeted measurement investments and holding marketing accountable for measured results. 
Senior level buy-in, strategic alignment and usage outside of marketing were less prominent.<br /><br /><span style="font-style: italic;">Best practice adoption (index of use by companies reporting positive impact, where 100 = average of all companies)</span><br /><br />- 251 clear process to ensure measurements are applied to decisions<br />- 215 targeted investments in measurement technology/systems, skills and data<br />- 206 marketing held accountable on performance metrics<br />- 161 alignment of marketing activities to strategic business objectives<br />- 159 senior level buy-in<br />- 145 usage beyond marketing<br /><br /><span style="font-weight: bold;">eMarketer Report Details Next Steps for Online Brand Measurement</span> (September 29, 2009)<br /><br /><a href="http://www.emarketer.com/">eMarketer</a> recently released a deeply researched report on <a href="http://www.emarketer.com/brandmeasurement/">Online Brand Measurement</a>. Since it touched on several topics I’ve been pondering recently (see <a href="http://customerexperiencematrix.blogspot.com/2009/09/web-analytics-is-dead-so-is-customer.html">Web Analytics Is Dead…</a> on my <a href="http://customerexperiencematrix.blogspot.com/">Customer Experience Matrix</a> blog), I read it with particular care.<br /><br />This is a long report (58 pages), so I won’t review it in detail. But here are the points that struck me as critical:<br /><br /><span style="font-weight: bold;">- Web measurement has largely focused on counting views and clicks, not measuring long-term brand impact.</span> Counting is much easier, but it doesn’t capture the full value of any Web advertisement. 
One result has been that marketers overspend on search ads, which are great at generating immediate response, and underspend on Web display ads, which influence long-term behavior even if they don’t generate as many click-throughs.<br /><br /><span style="font-weight: bold;">- Media buyers want Web publishers to provide the equivalent of Gross Rating Points (GRPs), so they can effectively compare Web ad buys with purchases in other media.</span> That’s okay as far as it goes, but it’s still just about counting, not about measuring the quality or impact of the impressions. As the paper points out, even engagement measures, such as time on site or mentions in social media, don’t necessarily equate to positive brand impact.<br /><br /><span style="font-weight: bold;">- Just about everyone agrees that the right way to measure brand impact is to tailor measurements to the goal of a particular marketing program.</span> This may sound like a conflict with the desire for a standard GRP-like measure, but it really reflects the distinction between counting the audience and measuring impact. GRPs work fine for buying media but not for assessing results. Traditional media face precisely the same dichotomy, which is why marketing measurement is still a puzzle for them as well. And just as most offline brand measures are ultimately based on surveys and panels, I'd expect most online brand measures will be too.<br /><br /><span style="font-weight: bold;">- Meaningful impact measurement will integrate several data types, including online behaviors, visitor demographics, offline marketing activities and actual purchase behavior.</span> These will come from a combination of direct online sources (i.e., traditional Web analytics), panel-based research and surveys (for audience and attitudinal information), and offline databases (for demographics and purchases). 
Ideally these would be meshed within marketing mix models and response attribution models that would estimate the incremental impact of each marketing program and allow optimization. But such sophisticated models won’t appear tomorrow.<br /><br />To me, this final point is the most important because it points to a “grand unification theory” of marketing measurement that combines the existing distinct disciplines and sources. The paper cites numerous current efforts, including:<br /><br />- multimedia databases being created (separately) by panel-based measurement firms including <a href="http://www.comscore.com">comScore</a>, <a href="http://www.Nielsen.com">Nielsen</a>, <a href="http://www.Quantcast.com">Quantcast</a> and <a href="http://www.compete.com">TNS Media Compete</a>;<br /><br />- <a href="http://www.datranmediaaperture.com/">Datran Media’s Aperture</a>, which combines email and postal addresses with <a href="http://www.Acxiom.com">Acxiom</a> household data, <a href="http://www.ixicorp.com/">IXI</a> financial data, <a href="http://www.mindsetmarketing.com/">MindSet Marketing</a> healthcare data and <a href="http://www.nextaction.net/">NextAction</a> retail data;<br /><br />- a joint effort between <a href="http://www.Omniture.com">Omniture</a> and <a href="http://www.kantar.com/">WPP’s Kantar Group</a> that combines data from email, search, display ads and traditional media;<br /><br />- another Nielsen project combining TV ad effectiveness information from <a href="http://www.iagr.net/">Nielsen IAG</a> with panel purchase data from Nielsen Homescan.<br /><br />These all reinforce the claim I made in <a href="http://customerexperiencematrix.blogspot.com/2009/09/web-analytics-is-dead-so-is-customer.html">last week’s blog post</a> that individual data will increasingly be combined with panel- and survey-based information to provide community-level insights that are actually more valuable than individual data alone.<br /><br /><span style="font-weight: bold;">ANA Agency/Client Forum: Agency Performance Isn't Based on Results</span> (September 25, 2009)<br /><br />I spent most of yesterday at the <a href="http://www.ana.net/">Association of National Advertisers</a> (ANA)’s <em>Agency/Client Forum</em>. The agenda covered a range of timely topics including digital advertising and social media. But I think it’s fair to say that most of the energy was focused on the pocketbook issue of agency compensation.<br /><br />In particular, the question <em>du jour</em> was value-based compensation or its cousin, pay-for-performance. I would have thought those were pretty much the same thing, but as Coca Cola’s Director of Worldwide Media & Communication Operations Sarah Armstrong set out in a detailed description of Coke’s own process, value-based compensation works largely by estimating a reasonable cost in advance, while pay-for-performance is based on after-the-fact assessments. (Coke’s process incorporates both – a base fee intended to cover agency costs, plus up to a 30% bonus based on performance.)<br /><br />However, as several speakers made clear during the day, most pay-for-performance measures are based on agency behaviors such as innovation, strategic thought and execution, rather than business results such as sales, market share or even communications activities such as media cost savings. Specifically, an ANA survey that will be formally released in mid-October found that 56% of agency performance measures were based on qualitative metrics, vs just 19% on business results and 25% on communications metrics.<br /><br />My initial reaction to this survey was pretty dismissive – either you pay for results or you don’t. 
But the fundamental rationale, mentioned by several speakers through the day, is that business results are affected by many factors beyond the agency’s control, so it really wouldn’t be fair to penalize or reward them purely on that basis. The analyst in me says it’s still worth trying to isolate the agency's contribution to results, but this is definitely a valid point. So including the subjective measures does make more sense than I initially thought.<br /><br />A related question that ran through several presentations was whether agencies are commodities. One presenter flashed a survey that showed 80% of respondents thought they are (I didn’t capture the details of that survey, but I think it was an informal online poll, so it’s probably not very meaningful). But both the clients and agencies among conference speakers felt strongly that they are not.<br /><br />What was interesting, though, were the sorts of distinguishing features that speakers cited – strategic insights, brand stewardship, creative genius, etc. Those are based largely on the skills and chemistry of the individuals working on an account. As the cliche says, those assets “go down the elevators every night” – that is, they belong to individuals rather than to the agency itself. So it’s possible that the agencies themselves are pretty much commodities (i.e., have about the same processes and technology) even if their people are different.<br /><br />And even when it comes to people, I find it hard to believe that any one agency can really have people who are on average much better than any other agency. There are, in fact, plenty of smart and creative people in the world. Yes, there are occasional true geniuses, and clients lucky enough to find them working on their accounts may indeed gain a strategic advantage. Perhaps some of those geniuses are even so clever that they can build an entire culture around themselves to leverage their skills. 
But I'd say that level of genius is very much the exception.<br /><br />In general, then, I suspect that once a quality agency comes up to speed, it would produce roughly similar results to another quality agency. This doesn't mean that you could immediately switch from one to another. But over the long term, agencies probably are something close to a commodity.<br /><br />This relates back to the performance measurement questions. The value of an agency really does lie in its strategic, creative, and execution contributions, plus its ability to work closely with the client. In theory, most agencies should be able to do these equally well. But in fact, there will be variations based on the individual team members as well as (to a lesser degree, I think) differences in agency processes and culture. So it makes sense for performance evaluations to focus on those factors, even though they’re subjective. Marketers must measure those factors to identify areas needing improvement, either by improving the performance of their current agency partners or by switching to new ones.<br /><br /><span style="font-weight: bold;">Mzinga Survey Shows Most Companies Don't Measure Social Media ROI</span> (September 1, 2009)<br /><br /><span style="font-style: italic;">Toute la blogosphère</span> is in love with social media, which of course means that some contrarians have to argue that it’s over-hyped. So it was interesting to see a survey (available <a href="http://www.mzinga.com/en/Community_Technology/Resources/Industry_Research/">here</a>; registration required) show that social technologies are indeed widely adopted: 86% of 555 respondents said they are currently using them for business purposes, and 61% said it was an ongoing component of their business. 
<br /><br />Caveat: the survey was sponsored by social technology vendor <a href="http://www.mzinga.com">Mzinga</a> in conjunction with the <a href="http://www.babson.edu/bee">Babson Executive Education</a> program, so they had a stake in the outcome. But I didn't see any obvious problems with it, and even allowing for some bias, the results still suggest wide social technology usage among a broad spectrum of businesses.<br /><br />Probably the most interesting result from a marketing measurement perspective was that just <span style="font-weight: bold;">16% of respondents reported measuring ROI</span> on their social media programs. No surprise, alas, but worrisome because programs that can’t prove ROI are subject to cancellation when money is tight. <br /><br />Somewhat supporting this line of reasoning, the survey showed that just 40% of respondents had budget dedicated to social media and 57% had employees assigned to it. Perhaps many of those employees work for free, but a more likely explanation is that their costs are not part of project budgets because they're part of a vaguely fixed "overhead". This makes it easier to sustain a social media effort without formal economic justification. But it can’t be a permanent situation – managers will eventually realize that time spent on social media has a real cost. So justification of some sort will ultimately be needed.<br /><br />Of course, that justification won’t necessarily be ROI. We all know that many traditional marketing investments are not justified on the basis of ROI, and marketing is by far the most common social media application (57%, vs 39% for internal collaboration, 31% other, 29% customer service & support, 25% sales, 21% human resources, 16% strategy and 14% product development). Marketing in social media could easily go unmeasured as well. <br /><br />Indeed, just 8% of respondents said their social technology system could showcase ROI, vs. 41% who said it couldn’t. 
An impressively large 44% didn’t know, which I interpret to mean that they didn't care enough to find out. So I think it’s safe to say that ROI measurement hasn’t been a major priority.<br /><br />The other intriguing figure in this survey was that <span style="font-weight: bold;">55% of respondents said there was no feature/function that they'd like added to their social media platform</span>. <span style="font-style: italic;">REALLY?</span> They can't be trying very hard: I mean, I can think of features I’d like added to a <span style="font-style: italic;">light bulb</span>.* <br /><br />If people are satisfied with their tools in such a rapidly evolving space, they probably aren’t using them for much. Or, to put it more charitably, maybe they recognize that they’re not taking advantage of what’s already available and feel they should master that before looking for anything more. Either way, this suggests that most deployments are quite immature. <br /><br />One final factoid: 61% are integrating social media within their Web site or other sites, vs. 40% running standalone community sites and 39% deploying as social widgets in third party sites such as Facebook. I’m surprised that community sites and widgets are so popular. Maybe these are signs of experimentation. Anyway, it’s food for thought.<br /><br />My general take, then: the survey shows wide testing of social technologies, but little deep engagement. Without a firm economic or other justification, there’s a good chance that the efforts won’t be sustained. So it’s up to social technology gurus, and vendors like Mzinga, to start demonstrating not just what social technology can do, but what makes it worth an investment. <br /><br />__________________________________________<br />* How about an indicator that shows how long until it burns out? 
Preferably with a wireless Internet connection that alerts me when failure is imminent.<br /><br /><span style="font-weight: bold;">CMO Council Study: Customer Loyalty Is Fleeting</span> (July 6, 2009)<br /><br />The <a href="http://www.cmocouncil.org">CMO Council</a> and Catalina Marketing’s <a href="http://www.pointermedianetwork.com">Pointer Media Network</a> recently released a major study on consumer loyalty in packaged goods brands. The study, <span style="font-style: italic;">Losing Loyalty: The Consumer Defection Dilemma™</span>, draws on Catalina’s vast loyalty card transaction database to analyze the individual buying patterns of more than 32 million consumers in 2007 and 2008 across 685 leading CPG brands. <br /><br />The bottom line is that “loyal” consumers are not as reliable as most of us might have guessed. “For the average brand in this study, 52% of highly loyal consumers in 2007 either reduced loyalty or completely defected from the brand in 2008.” You can <a href="http://www.cmocouncil.org/resources/form_losing_loyalty.asp">download the 12-page report</a> for details.<br /><br />Not surprisingly, the report proposes to use individualized targeting services like Pointer Media Network to reduce churn by making carefully selected offers to at-risk consumers. Although the recommendation is obviously self-serving, I do think it’s correct.<br /><br />But it seems to me that the implications are more fundamental. In the eternal debate about brand value, finding that loyalty evaporates more quickly than expected makes it even harder to justify marketing programs that don’t bring about an immediate, measurable return. <br /><br />I’ve seen arguments (sorry, I can’t recall where) that the traditional buying model of awareness – interest – trial – purchase doesn’t correspond to reality. 
The survey results seem consistent with that position, in that they present consumer behavior as much less predictable than expected. This further reinforces the idea that investments with short-term results are more reliable than the long-term investments traditionally associated with brand building. <br /><br />Pardon the cliche, but what we’re talking about here is a paradigm shift. If consumers don’t follow a predictable buying pattern, then brand value models based on such a pattern are not justifiable. Marketers need a fundamentally new framework to predict how their activities will affect consumer behavior. This framework may owe more to chaos theory than to a linear process flow. I don’t know what the new model will look like, but recognizing that one is necessary is the first step towards creating it. If anybody out there has some candidates to offer, I’d love to hear about them.<br /><br /><span style="font-weight: bold;">Two More Surveys Confirm that Most Marketers Don't Track ROI</span> (May 30, 2009)<p>The <a href="http://www.salesleadmgmtassn.com/">Sales Lead Management Association</a> and <a href="http://www.velosgroup.com/">Velos Group</a> published their annual lead management practices survey last week. (Read it <a href="http://www.salesleadmgmtassn.com/login.php?continue=/2008_Lead_Management_Practices.php">here</a>; free registration required.) The survey had a relatively small sample (just over 140 responses) and was weighted towards smaller companies (80% had fewer than 25 sales reps). But it still provides some insight into how companies actually do business.<br /><br />The key finding from a marketing measurement viewpoint was that <strong>62.5% of respondents do not track ROI on marketing programs</strong>. 
This is not especially surprising; in fact, it’s better than the 76% who reported not using ROI in another, larger study released last week by <a href="http://www.lenskoldgroup.com/">Lenskold Group</a>. (Click <a href="http://www.lenskold.com/content/2009mroistudy.html">here</a> for the Lenskold study.) But it’s still bad news.<br /><br />Perhaps even more distressing is that just 19.3% of the respondents listed their inability to track ROI as a major sales lead management concern. Subtracting those from the 62.5%, this means that more than 40% were not particularly concerned about their failure to track ROI. It MIGHT also mean that many of those 40% actually could track ROI if they wanted to, although it’s more likely that most don’t have the capability but don’t consider that a problem.<br /><br />The other findings from the survey also generally confirm the dismal state of the art, at least among smaller firms.<br /><br />- More than half the respondents (55.5%) said they do not qualify their marketing inquiries before sending them to sales. This implies a huge waste of time by salespeople who then do the qualifications themselves, or, more likely, cherry-pick the leads that look superficially promising and ignore the rest. Unless a company has managed to staff its sales team with clairvoyants, this is a guarantee that it will discard some good leads and spend more than it should on some bad ones.<br /><br />- One-third (33.8%) don’t use sales automation or customer relationship management systems. Again, this is a fundamental efficiency-killer. The survey also found that companies using these systems were not terribly satisfied with the results: 54% rated their satisfaction at 5 or less on a scale of 1 to 10. Maybe the problem is with the software itself, but I suspect the issue is lack of training and other supporting investments.<br /><br />- Half had no formal sales forecasting process (27.2%) or used Excel only (23.8%). 
Again, this shows very immature sales management at these companies.</p><p>I must say I find these results quite sad, given how long these tools have been available and how well their benefits are established. </p><p>But perhaps it’s best to adopt the more positive attitude of the survey authors and see this as an opportunity. As they put it, respondents “have a lot of room for improvement in their sales and marketing best practices. By spending time and resources in this critical business area, companies will be able to increase sales, allocate marketing resources more efficiently and will be able to forecast their sales more accurately. All of which will help them survive these difficult economic times.” </p><span style="font-weight: bold;">Whopper Freak-Out Wins Ad Effectiveness Award</span> (May 27, 2009)<br /><br />I received a mailing with the agenda for the Association of National Advertisers' <a href="http://www.ana.net/events/conferencemtg/MAC-JUN09">Marketing Accountability and Effectiveness Conference</a> in New York on June 2. This looks like a good event, covering all the usual-but-useful bases: proving the value of marketing (Enterprise Rent-a-Car), earning a place at the "C-suite" table (panel led by Ernst & Young), advanced analytics (VG Corporation) and media optimization (Citizens Bank).<br /><br />But my favorite is an "EFFIE" Award for Burger King, for its "Whopper Freak-out" campaign, which "explored deprivation to see what would happen if America's most beloved burger was removed from the menu forever without any announcement." Since I avoid both television and Burger King, this was news to me, but I gather a bunch of TV commercials were involved. 
Interesting.<br /><br /><span style="font-weight: bold;">Synonyms for "To Make Use Of"</span> (April 1, 2009)<br /><br />This isn't really a blog post, but I couldn't find another easy way to put the image below on the Web. It's from <a href="http://www.visualthesaurus.com/">www.visualthesaurus.com</a>, which I'm happy to publicize a bit because it is indeed useful. This specific map addresses a problem I've had for years: finding a word that conveys "making use of something". Typically I want to use "exploit", but that sounds rather harsh. This map offers a number of alternatives.<br /><br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhYY3hL7EMIzBqSA4REM3_q-eLGWWtA-HQLQG6Ar4KT-sE-Wk4Hq2RTklhBVGJJRHTnMfzrF2fp7ht-B8yCxTb6kj69Y7jXvzRuXjOTOr2xI2zzCLDQUBa83VnuRCswmdt-K8crj5BvAeg/s1600-h/exploit_synonyms.gif"><img id="BLOGGER_PHOTO_ID_5319786682482155746" style="DISPLAY: block; MARGIN: 0px auto 10px; WIDTH: 338px; CURSOR: hand; HEIGHT: 312px; TEXT-ALIGN: center" alt="" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhYY3hL7EMIzBqSA4REM3_q-eLGWWtA-HQLQG6Ar4KT-sE-Wk4Hq2RTklhBVGJJRHTnMfzrF2fp7ht-B8yCxTb6kj69Y7jXvzRuXjOTOr2xI2zzCLDQUBa83VnuRCswmdt-K8crj5BvAeg/s400/exploit_synonyms.gif" border="0" /></a><br /><span style="font-weight: bold;">Marketing Measurement Book Includes Free Online Forms</span> (March 24, 2009)<br /><br />I'm not usually quite so self-promotional, but I suppose it's reasonable to announce final publication of my long-promised book <a style="font-style: italic;" href="http://raabassociatesinc.com/2009/03/16/raab-book-on-marketing-measurement-now-available/">The Marketing Measurement Toolkit</a>. 
It's a step-by-step tutorial on the process of building a marketing measurement system, from initial project definition through deployment. The idea was to move beyond the theories (important as they are) to help people with the practical details. You can order from the publisher at <a href="http://www.racombooks.com./">www.racombooks.com</a>.<br /><br />My favorite feature of the book (especially since they didn't put my picture on the cover) is a collection of forms and scorecards that help people to organize their project and assess risk factors. I've put these online <a href="http://raabassociatesinc.com/mpm-toolkit/">here</a> where anyone can download them. Obviously they make more sense in the context of the book, but even without that I think they'll provide useful checklists at different project stages. <br /><br />Here, for example, is an extract from the <span style="font-weight: bold;">Analytics Readiness Scorecard</span> in chapter 8. The extract covers only Response Measurement, while the full scorecard includes similar sections on Segmentation Models, Predictive Models, Marketing Mix Models, Simulation Models and Optimization Models. The idea is to figure out which types of analytics your company can build with its current resources, or, looking at it slightly differently, which resources it must add to do the analytics you want. Users enter a 1-5 score for the existing and needed columns, and the system then calculates a gap. 
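That gap calculation is simply a per-row subtraction (the spreadsheet formulas embedded in the table compute existing minus needed). As a minimal sketch of the arithmetic in Python — the row names come from the scorecard extract, but the 1-5 scores here are made up for illustration, not taken from the book:

```python
# Hypothetical (existing, needed) scores for a few scorecard rows.
rows = {
    "source captured directly":   (2, 4),
    "contact history available":  (3, 3),
    "response survey available":  (1, 2),
    "pre/post analysis possible": (4, 3),
}

# gap = existing - needed, mirroring the scorecard's embedded formula;
# a negative gap flags a resource that must be added.
gaps = {name: existing - needed for name, (existing, needed) in rows.items()}

for name, gap in gaps.items():
    print(f"{name}: {gap:+d}")

# The most negative gap marks the capability most in need of investment.
worst = min(gaps, key=gaps.get)
print("biggest shortfall:", worst)
```

With these invented scores, "source captured directly" comes out as the biggest shortfall; in practice the readers' own existing/needed entries drive the result.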
This isn't intended to provide much more than conventional wisdom, but a big, well-organized pile of conventional wisdom can be very useful.<br /><br /><table str="" style="border-collapse: collapse; width: 640pt;" width="854" border="1" cellpadding="0" cellspacing="0"><col style="width: 173pt;" width="230"> <col style="width: 56pt;" span="3" width="75"> <col style="width: 299pt;" width="399"> <tbody><tr style="height: 15pt; font-weight: bold;" height="20"> <td colspan="5" class="xl32" style="border-right: 1pt solid black; height: 15pt; width: 640pt;" width="854" height="20">Analytics Readiness Scorecard</td> </tr> <tr style="height: 13.5pt;" height="18"> <td class="xl22" style="height: 13.5pt; width: 173pt; font-weight: bold;" width="230" height="18">Response Measurement</td> <td class="xl23" style="width: 56pt; text-align: center;" width="75">existing</td> <td class="xl23" style="width: 56pt; text-align: center;" width="75">needed</td> <td class="xl23" style="width: 56pt; text-align: center;" width="75">gap</td> <td class="xl24" style="width: 299pt;" width="399">comment</td> </tr> <tr style="height: 13.5pt;" height="18"> <td class="xl25" style="height: 13.5pt; width: 173pt;" width="230" height="18">source captured directly</td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl27" style="width: 56pt; text-align: center;" num="" fmla="=B3-C3" width="75">0</td> <td class="xl26" style="width: 299pt;" width="399"><br /></td> </tr> <tr style="height: 13.5pt;" height="18"> <td class="xl25" style="height: 13.5pt; width: 173pt;" width="230" height="18">contact history available</td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl27" style="width: 56pt; text-align: center;" num="" fmla="=B4-C4" width="75">0</td> <td 
class="xl26" style="width: 299pt;" width="399"><br /></td> </tr> <tr style="height: 13.5pt;" height="18"> <td class="xl25" style="height: 13.5pt; width: 173pt;" width="230" height="18">response survey available</td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl27" style="width: 56pt; text-align: center;" num="" fmla="=B5-C5" width="75">0</td> <td class="xl26" style="width: 299pt;" width="399"><br /></td> </tr> <tr style="height: 13.5pt;" height="18"> <td class="xl25" style="height: 13.5pt; width: 173pt;" width="230" height="18">pre/post analysis possible</td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl27" style="width: 56pt; text-align: center;" num="" fmla="=B6-C6" width="75">0</td> <td class="xl26" style="width: 299pt;" width="399"><br /></td> </tr> <tr style="height: 13.5pt;" height="18"> <td class="xl25" style="height: 13.5pt; width: 173pt;" width="230" height="18">test/control possible</td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl27" style="width: 56pt; text-align: center;" num="" fmla="=B7-C7" width="75">0</td> <td class="xl26" style="width: 299pt;" width="399"><br /></td> </tr> <tr style="height: 13.5pt;" height="18"> <td class="xl25" style="height: 13.5pt; width: 173pt;" width="230" height="18">multi-variate test possible</td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl26" style="width: 56pt; text-align: center;" width="75"><br /></td> <td class="xl27" style="width: 56pt; text-align: center;" num="" fmla="=B8-C8" width="75">0</td> <td class="xl26" style="width: 299pt;" width="399"><br /></td> </tr> <tr style="height: 
13.5pt;" height="18"> <td class="xl28" style="height: 13.5pt; width: 173pt;" width="230" height="18">total</td> <td class="xl26" style="width: 56pt; text-align: center;" num="" fmla="=SUM(B3:B8)" width="75">0</td> <td class="xl26" style="width: 56pt; text-align: center;" num="" fmla="=SUM(C3:C8)" width="75">0</td> <td class="xl26" style="width: 56pt; text-align: center;" num="" fmla="=SUM(D3:D8)" width="75">0</td> <td class="xl29" style="width: 299pt;" width="399"><br /></td> </tr> </tbody></table><br />So, by all means, check out the forms and, if you're so inclined, purchase the book. Any comments are more than welcome.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0tag:blogger.com,1999:blog-1380589722433800422.post-11549303372520160312009-02-28T06:20:00.000-08:002009-02-28T06:33:51.676-08:00Rate This Neutral: Scout Labs Social Media Monitoring is Definitely Cool, Possibly AccurateSure I like flashing lights and buzzers: what technologist doesn’t? And if a product has all that plus a low price, it’s darn near irresistible. So I was quite excited when I saw <a href="http://www.scoutlabs.com/">Scout Labs</a>, a very nicely packaged social media monitoring tool that combines automated search, sentiment identification, importance ranking, trend reporting, alerts, bookmarking, and collaboration for under $300 per month. What’s not to like?<br /><br />A couple of things, it turns out. But let’s look at the good stuff first. Scout Labs combines three of the <a href="http://mpmtoolkit.blogspot.com/2009/02/tools-for-social-media-measurement.html">five social media measures</a> I proposed last week (tracking mentions, identifying mentioners, measuring influence, understanding sentiment and measuring impact).
Specifically, it searches blogs, news feeds, video and photo sites, <a href="http://www.twitter.com/">Twitter</a> and some social network sites (although not yet the big ones); provides influence measures; and classifies blog posts by sentiment. It doesn’t attempt to identify mentioners (i.e., track multiple posts by the same individual), or to measure the impact of an item on its audience. But three out of five is pretty good.<br /><br />More important, the things that Scout Labs does, it does well. The search feature lets users specify multiple terms and whether each term is required, relevant or excluded. Once a search is defined, the system will automatically scan the top 12 million blogs for qualified entries, rate their sentiments as positive, negative or neutral, and show them in a list. Each item on the list shows the blog headline and phrases with the search terms highlighted. A side box shows common words in all the entries, ranked by frequency. This by itself gives a quick view of what’s being said about the search target. <br /><br />Users can drill into the listed items to see the full entry, details about where it came from, how many external links attach to the item and its source, and the sentiment rating. They can manually revise the rating, bookmark the item with keywords, attach a note for discussion, and email a link with a system-generated summary and the user’s own comments to anyone the user chooses. The system currently uses the link counts as an influence measure, and can rank the items by influence or date. Scout Labs is working to upgrade its influence metric by integrating Web traffic data and a measure of the source’s relevance to the search topic.<br /><br />But there’s more. The system can prepare graphs showing trends in volume, sentiment, and share of total blog mentions. Graphs can compare statistics for up to four different searches. 
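A search definition of this kind, with required, relevant and excluded terms, can be sketched as a simple filter. This is a hypothetical illustration of the concept, not Scout Labs' implementation; the post texts, the `matches` helper, and the treatment of "relevant" as at-least-one-must-appear are all my assumptions.

```python
# Hypothetical sketch of a required/relevant/excluded keyword filter.
def matches(text, required=(), relevant=(), excluded=()):
    """Keep an item only if every required term appears, no excluded term
    appears, and (when relevant terms are given) at least one appears."""
    t = text.lower()
    if any(term in t for term in excluded):
        return False
    if not all(term in t for term in required):
        return False
    return not relevant or any(term in t for term in relevant)

posts = [
    "Acme Widgets releases new analytics dashboard",
    "Acme Widgets hiring: dashboard experience required",
    "Generic widgets on sale this weekend",
]
# Invented search: mentions of "acme", excluding help-wanted noise.
hits = [p for p in posts if matches(p, required=("acme",), excluded=("hiring",))]
```

A real monitoring tool would of course run such a filter continuously against feeds rather than a static list.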
Users can specify the date ranges to report on, currently going back up to three months and soon extending to six months.<br /><br />Things are a little less exciting once you move beyond the blogosphere. The system will list search results for photo sites, video sites and Twitter, but doesn’t offer sentiment tracking or graphs. Scout Labs is working on adding sentiment tracking to Twitter comments. I guess it's not fair to ask them to measure sentiment for photos or videos. <br /><br />As to pricing, the smallest Scout Labs plan allows five saved searches for $99 per month, although the company expects most businesses will take plans for 25 or more searches, which start at $249 per month. There are no limits on the number of users or search hits in any plan and the system continuously updates the results of the saved searches. <br /><br />So far so good. There's a free 30-day trial, so I set up two test searches in Scout Labs, each for a demand generation software vendor I track closely. The system found many of the posts I expected, and it was definitely fun and convenient to dig into them. If I worked at one of those firms, I would gladly pay $249 per month for this. <br /><br />But then I ran the same searches in <a href="http://www.icerocket.com/">IceRocket</a>, a free tool that also does searches of blogs and other sources. IceRocket found nearly twice as many hits during the same time period, and they looked legitimate. Ouch. But Scout Labs acknowledges that its 12 million blogs don’t cover the entire blogosphere (over 100 million blogs, last I heard), and it does let you add feeds if one you want is missing. Plus IceRocket doesn’t support saved searches or do any of the other cool stuff. So I’m a little worried about coverage but still willing to pay Scout Labs’ fee. <br /><br />Next I took a closer look at the sentiment ratings in the Scout Labs results. I didn’t expect them to be perfect, but was seriously disappointed.
On one search, 32 of 39 items were labeled as neutral. Some of those were actually pretty positive, but, as Scout Labs explains in a <a href="http://www.scoutlabs.com/2009/02/26/how-does-sentiment-work-and-how-accurate-is-it-anyway/">recent blog post</a>, they try to be conservative by labeling items as neutral unless the tone is clear. Fair enough. But the seven positive items were all pretty much neutral too. For example, several were help wanted postings that simply specified experience with the products in question. There were no items classified as negative, although one or two of the posts arguably could have been.<br /><br />In the blog post I just mentioned, Scout Labs offers a detailed discussion of its sentiment rating technique. The gist is that they don’t just count “happy” and “sad” words, but semantically analyze each entry to understand which words relate to the search topic. Sounds good in theory. They also say their automated ratings agree with college-educated humans about 75% of the time. In comparison, they say, college-educated humans agree with each other about 85% of the time. (Clearly they are not talking about married couples.) <br /><br />But if the vast majority of items are neutral, that’s less useful than it sounds. Remember the basic statistics: if 80% of the items are neutral, then a system that blindly ranks everything as neutral will be correct 80% of the time. The ratings that really count are the positives and negatives, and I wonder how often a human would agree with those ratings in Scout Labs. I’d want to look at that much more closely before deciding whether to rely on Scout Labs' results. <br /><br />I'd still pay for Scout Labs for the convenience of the searches, statistics and collaboration tools. As I say, it's a very nice interface. I might even find on closer examination that the sentiment ratings are useful even if they’re only somewhat accurate: after all, they might still get a trend right and call up useful samples. 
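The earlier point about mostly-neutral data is worth making concrete (a toy illustration with invented labels, not Scout Labs' data): a classifier that labels everything neutral scores exactly the neutral share of the sample.

```python
# Toy illustration: majority-class baseline accuracy for sentiment labels.
# With mostly-neutral data, "always predict neutral" looks deceptively good.
true_labels = ["neutral"] * 32 + ["positive"] * 7  # mirrors the 32-of-39 split above
predictions = ["neutral"] * len(true_labels)       # a classifier that never commits

correct = sum(t == p for t, p in zip(true_labels, predictions))
accuracy = correct / len(true_labels)
print(f"baseline accuracy: {accuracy:.0%}")  # about 82% while saying nothing useful
```

That is why accuracy on the positive and negative items, not overall agreement, is the number to ask a vendor for.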
But much as I like the bells and whistles, I’m not as enthusiastic about Scout Labs as when I started.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com2tag:blogger.com,1999:blog-1380589722433800422.post-33586491956539294802009-02-23T09:04:00.000-08:002009-02-24T07:24:12.026-08:00Vizu Measures the Brand Impact of Online Ads with Just One QuestionI wrote last week about a general framework for <a href="http://mpmtoolkit.blogspot.com/2009/02/tools-for-social-media-measurement.html">measuring the marketing impact of social media</a>. This proposed a general hierarchy of:<br /><br />1. tracking mentions<br />2. identifying mentioners<br />3. measuring influence<br />4. understanding sentiment<br />5. measuring impact<br /><br />As with all marketing measurement, the hardest task is the last one: measuring impact. This requires connecting the messages that people receive with their actual subsequent behavior, and hopefully establishing a causal relationship between the two. The fundamental problem is the separation between those two events: unless the message and purchase are part of the same interaction, you need some way to link the two events to the same person. A second problem is the difficulty of isolating the impact of a single event from all the other events that could influence someone’s behavior.<br /><br />These problems are especially acute for brand advertising, which pretty much by definition is not connected with an immediate purchase. Brand advertisers have long dealt with this by imagining buyers moving through a sequence of stages before they make the actual purchase. A typical set of stages is awareness, interest, knowledge, trial (the first actual purchase) and regular use (repeat purchases).<br /><br />Even though these stages exist only inside the customer’s head, they can be measured through surveys. So can more detailed attitudes towards a product such as feelings about value or specific attributes. 
For both types of measurement, marketers can define at least a loose connection between the survey results and eventual product purchases. Although the resulting predictions are far from precise, they offer a way to measure subtle factors, such as the impact of different advertising messages, that techniques based on actual purchases cannot.<br /><br />The Internet is uniquely well suited for this type of survey-based analysis, since people can be asked the questions immediately after seeing an advertisement. One vendor that does this is <a href="http://www.factortg.com/">Factor TG</a>, which I wrote about last year (click <a href="http://mpmtoolkit.blogspot.com/2008/05/factor-tg-answers-all-measurement.html">here</a> for the post). Another, which I mentioned last week, is <a href="http://www.vizu.com/">Vizu</a>.<br /><br />What makes Vizu different from other online brand advertising surveys is that each Vizu survey asks just one question. The question itself changes with each survey, and is based on the specific goal for the particular campaign. Thus, one survey might ask about awareness, while another might ask about purchase intentions. Vizu asks its question to a small sample of people who saw an advertisement and also to a control group of people who were shown something else. It assumes that the difference in answers between the two groups is the result of seeing the advertisement itself.<br /><br />Although asking a single question may seem like a fairly trivial approach, it actually has some profound implications. The most important one is that it greatly increases the response rate: Vizu founder Dan Beltramo told me participation can be upwards of 3 percent, compared with tenths or hundredths of a percent for longer traditional surveys.<br /><br />This in turn means statistically significant survey results become available much sooner, giving marketers quick answers and letting them watch trends over relatively short time periods.
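The exposed-versus-control arithmetic behind such surveys can be sketched as follows. This is a generic illustration of the approach, not Vizu's actual formula; the counts, the example question, and the use of a two-proportion z-test are all my assumptions.

```python
import math

# Generic exposed-vs-control lift check (invented numbers, not Vizu's method).
# Each group answers one yes/no question, e.g. "Are you aware of Brand X?"
exposed_yes, exposed_n = 180, 1000   # respondents who saw the ad
control_yes, control_n = 140, 1000   # respondents who saw something else

p_exposed = exposed_yes / exposed_n
p_control = control_yes / control_n
lift = p_exposed - p_control         # the difference attributed to the ad

# Two-proportion z-test: is the lift more than sampling noise?
pooled = (exposed_yes + control_yes) / (exposed_n + control_n)
se = math.sqrt(pooled * (1 - pooled) * (1 / exposed_n + 1 / control_n))
z = lift / se
print(f"lift: {lift:.1%}, z = {z:.2f}")  # |z| > 1.96 is significant at the 95% level
```

The smaller the true lift, the larger the samples needed before z clears the significance threshold, which is why a high response rate translates into faster answers.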
It also provides significant results for much smaller ad campaigns or for panels within larger campaigns. This lets marketers compare results from different Web sites and for different versions of an ad, allowing them to fine-tune their media selections and messages in ways that traditional surveys cannot.<br /><br />Another benefit of simplicity is lower costs. Vizu can charge just $5,000 to $10,000 per campaign, allowing marketers to use it on a regular basis rather than only for special projects. Vizu also has little impact on the performance of the Web sites running the surveys, reducing cost from the site owner's perspective.<br /><br />The disadvantage of asking just one question is that you get just one answer. This prevents detailed analysis of results by audience segments, or exploration of how an ad affects multiple brand attributes. Vizu actually does provide a little information about the impact of frequency, drawn from cookies that track how often a given person has been exposed to a particular advertisement. Vizu also tracks where the person saw the ad, allowing some inferences about respondents based on the demographics of the host sites. Mostly, however, Vizu argues that a single answer is a good thing in itself because it keeps everyone involved focused on the ad campaign’s primary objective.<br /><br />According to Beltramo, Vizu’s main customers are online ad networks and site publishers, who use the Vizu results as a way to show their accountability to ad agencies and brand advertisers. Some agencies and advertisers also contract with the firm directly.<br /><br />What, you may be asking, has all this to do with social media measurement?
Vizu’s approach applies not just to display advertising but also to social media projects such as downloadable widgets and micro sites.<br /><br />Even though Vizu can’t fully bridge the measurement gap between exposure and actual purchases, it does offer more insights than simply counting downloads, clickthroughs or traffic. In a world where so little measurement is available, every improvement is welcome.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com2tag:blogger.com,1999:blog-1380589722433800422.post-41600014998693259002009-02-19T10:41:00.000-08:002009-02-19T13:01:34.138-08:00Tools for Social Media MeasurementI was whining last week on my other blog about the <a href="http://customerexperiencematrix.blogspot.com/2009/02/blog-posts-ill-never-write-apologies-to.html">lack of integrated solutions for social media analytics</a>. No sooner had I written that, of course, than up popped several interesting solutions to prove me wrong. I plan to write soon about a couple of specific products, but will use this post to set a framework for evaluation.<br /><br />I suppose I should start with a definition of “social media”. By this I simply mean any communication method that allows users to interact directly with each other, as opposed to a broadcast medium where only a few people can send messages. I’m not intending to be especially restrictive here – I’d include blogs, public forums, <a href="http://www.facebook.com/">Facebook</a>, <a href="http://www.myspace.com/">Myspace</a>, <a href="http://www.youtube.com/">YouTube</a>, <a href="http://www.flickr.com/">Flickr</a>, <a href="http://www.twitter.com/">Twitter</a>, <a href="http://www.linkedin.com/">LinkedIn</a>, <a href="http://www.plaxo.com/">Plaxo</a> and many others.
These all provide a huge stream of public chatter that marketers can tap into both to monitor what is being said about their products and to proactively spread their preferred messages.<br /><br />From a measurement perspective, I see several distinct functions. Today, these are largely served by separate point solutions. Integrated systems are beginning to emerge that combine at least a few. The ultimate integrated system would serve them all. The functions are:<br /><br />- tracking mentions. This is the simplest goal: uncovering and reporting on social media events that relate to your product, brand or company. The fundamental tool here is the keyword search. Many systems do this, and some even combine different social network sources to provide a consolidated report. <a href="http://www.google.com/alerts?hl=en">Google Alerts</a> is probably the best known, although it doesn’t do much with social media aside from blogs. <a href="http://www.boardtracker.com/">BoardTracker</a> and <a href="http://www.linqia.com/">Linqia</a> are more focused on social communities.<br /><br />- identifying mentioners. Most social media comments are signed with a user ID of some sort, but the identity of the person behind that ID is often not clear. I haven’t actually seen tools that address this, but they probably exist. What’s needed is to look at whatever public profile is available, use that to find out other information about the person, and then in turn see if you can find that person in other social media. As a not-too-scary example, I recently saw a Twitter post that mentioned a vendor I follow. Checking out the poster's profile to see if she was worth “following”, I saw that she was from a small town where I used to live. Curious, I then found her in LinkedIn and discovered the company she worked for. Yes, this sounds uncomfortably like stalking, but it’s old news that the Internet is really good for that.
What’s interesting here is the potential to connect an individual’s background with her social media profile. The steps that I took could easily be automated; indeed, products like <a href="http://www.zoominfo.com/">ZoomInfo</a> do something similar, although so far as I can tell they don't include social media other than blogs.<br /><br />- measuring influence. Influence has two overlapping dimensions: the influence of an individual mentioner, and the influence of a particular event. The mentioner’s influence is related to the profile I just mentioned, but also to blog readership, “friends” and network members in various social platforms, authority as measured by links and recommendations, etc. Again, these statistics are available in a scattered fashion for individual social media, and it wouldn’t be hard to build a system to pull them together once you had linked the user IDs. Surely someone is out there doing this but I haven’t tripped over them. Maybe if I spent more time at the gym?<br /><br />Measuring the influence of a particular event is actually easier. It is a matter of links, views, downloads, recommendations, ratings, etc. The statistics are often published along with the item itself. One possible tool is <a href="http://www.trackur.com/">TrackUr</a>, a low-cost product (from $18 to $197 per month) that scores Web sites based on “the number of backlinks pointing to a web site, the number of blog discussions, an estimate of traffic, and even the number of times the web site has discussed the phrase in the past.” Another that I suspect costs much more is <a href="http://www.radian6.com/">Radian6</a>, which “tracks comments, viewership, user engagement and other metrics, 24/7, so that you can clearly see the reach and affect[sic] each post has on the community.” It also can “uncover the influencers online by topic, based on user-determined formula weightings.”<br /><br />- understanding sentiment.
This is the domain of semantic analysis (that’s a pun, kind of), which is a long-established field with many players. One specialist applying its technology to real-time Web content is <a href="http://www.hapax.com/">Hapax</a>. Solutions integrated more closely with social media search include <a href="http://www.crimsonhexagon.com/">Crimson Hexagon</a> and newly-launched <a href="http://www.scoutlabs.com/">Scout Labs</a>. Scout Labs is also a low-cost option, with plans starting at $99 per month and a free 30-day trial currently on offer.<br /><br />- measuring impact. Ah, the bottom line: what did people exposed to the social media event actually <em>do</em>? Even the Web hasn’t yet reached the stage of universal behavior tracking that would really let you answer this, and I personally hope it never does. But one product that gets close is <a href="http://www.tealium.com/">Tealium Social Media</a>, which builds a list of Web URLs (both social media and regular online media) related to your product, checks which of those your Web site visitors had seen previously, and pops the results into Google Analytics so you can treat the Web events like any other visitor source. (See my earlier <a href="http://mpmtoolkit.blogspot.com/2009/01/tealium-measures-response-to-social.html">blog post on Tealium</a> for details.) At the other end of the process, <a href="http://www.vizu.com/">Vizu</a> lets marketers embed a question in Web ads that asks about brand attitudes, and compares the answers against those of people who didn’t see the ad, thereby measuring the net impact of the ad itself. The vendor has embedded its questions in social media applications from vendors including <a href="http://www.lotame.com/">Lotame</a> (ads in social networks), <a href="http://www.adnectar.com/">AdNectar</a> (social ‘gifting’) and <a href="http://www.buddymedia.com/">Buddy Media</a> (custom social applications).
See their <a href="http://answers.vizu.com/solutions/pr/press-release/20090210.htm">press release</a> for details.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com2tag:blogger.com,1999:blog-1380589722433800422.post-74121965174835907512009-02-06T09:23:00.000-08:002009-02-06T09:39:15.018-08:00When All Marketing is Internet Marketing, All Agencies are Internet AgenciesA little press notice this week reported a <a href="http://www.ogilvy.com/press/showpress.php?ID=6963">January 20 announcement</a> from <a href="http://www.ogilvy.com/">Ogilvy North America</a> of “strategic alliances” with marketing automation software vendor <a href="http://www.unica.com/">Unica</a> and marketing database integrator <a href="http://www.pluris.com/">Pluris</a>. On its face, this seemed to suggest a change in strategy for all three firms, moving towards a database marketing agency approach that combines technology, marketing strategy, data and analytics. But close reading of the press release shows this is just an agreement to make referrals. When I asked one of the players involved, they confirmed that’s all there is.<br /><br />Nevertheless, the announcement prompted a little flurry of speculation in the Twittersphere / blogosphere (we need a new term -- blabosphere?) about changes in the role of traditional advertising agencies. Even though the database marketing agency model has held a relatively small niche for decades (pioneers like <a href="http://www.epsilon.com/">Epsilon</a> were founded in the late 1960s), the thought seems to be that it will soon become the dominant model. <br /><br />I’m skeptical. In some ways, the basic technologies for customer management have actually become more accessible to non-specialist companies. In particular, the hardest part, building a customer database, has largely been taken over by customer relationship management systems.
Once that’s in place, it’s not much more work to add a serious marketing automation system. In fact, all you do is buy software like Unica’s—which is why a firm like Ogilvy doesn’t need to build its own, or to have a particularly intimate relationship with Unica itself. Yes, Ogilvy and other agencies need database marketing competencies. But all they really need to do is manage a firm like <a href="http://www.acxiom.com/">Acxiom</a> doing the actual work. This takes expertise but much less capital and human investment than doing it yourself.<br /><br />So, if database marketing has become easier, there is even less need than in the past for an integrated database marketing agency. Database marketing has remained a small part of the industry because its scope is too limited, particularly in dealing with non-customers (who mostly are not in your database). (Yes, the credit card industry is an exception.)<br /><br />But the Internet is changing the equation substantially. Advertising agencies marginalized database marketing because customer management is not their core business. But advertising agencies exist to buy ads, and Internet advertising is now too important for them to ignore. Plus, Internet advertising is much closer to agencies’ traditional core business of regular advertising, so it’s much easier for them to conceive of it as a logical extension of their offerings. Even though many specialist agencies sprang up to handle early Internet advertising, the traditional agencies are now reasserting their control.<br /><br />Now here’s the key point: managing Internet ads is not the same as managing traditional advertising. Ad agencies will develop new skills and methods for the Internet, and those skills and methods will eventually spread throughout the agency as a whole. Doing a good job at creating, buying and evaluating Internet advertising requires vastly more data and analysis than doing a good job at traditional mass media.
It will take a while for the agencies to develop these skills and procedures, but these are smart people with ample resources who know their survival is at stake. They will keep working at it until they get it right.<br /><br />Once that happens, those skills and methods won’t stop at the door of the Internet department. Agencies will recognize that the same skills and methods can be applied to other parts of their business, and frankly I expect that they’ll find themselves frustrated to be reminded how poorly traditional marketing has been measured. Equipped with new tools and enlightened by a vision of truly modern marketing management, agency leaders will bring the rest of their business up to Internet marketing standards of measurement and accountability. It’s like any technology: once you’ve seen color TV, you won’t go back to black and white.<br /><br />We’re already seeing hints of this in public relations, where the traditional near-total lack of performance measurement is rapidly being replaced by detailed analyses of the impact of individual placements. In fact, the public relations people are even pioneering quantification of social network impact, perhaps the trickiest of all Internet marketing measurement challenges.<br /><br />So, yes, I do see a great change in the role of advertising agencies. I even expect they will come to resemble today’s database marketing agencies, integrating strategy, technology, analytics and data. But it won’t happen because the ad agencies adopt a database marketing mindset. It will happen because they want to keep on making ads.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com1tag:blogger.com,1999:blog-1380589722433800422.post-66355716296611088262009-02-01T11:40:00.000-08:002009-02-01T11:55:15.383-08:00Razorfish Study Measures Direct Response to Social MediaI’ve been spending more time than I should recently on <a href="http://www.twitter.com/">Twitter</a> (follow me at @draab).
It provides a fascinating peek into the communal stream-of-consciousness, which would be pretty horrifying (“Britney…Brad…Jen…Obama…groceries…Britney…Britney…Britney”) if you couldn’t choose the people and search terms you follow. This filtering (which I do via a great product called <a href="http://www.tweetdeck.com/">Tweetdeck</a>) turns Twitter into a very efficient source of information I wouldn’t see otherwise.<br /><br />Naturally, my interest in Twitter also extends to how you measure its business value, and by extension the value of social media in general. Since the people I follow on Twitter are both marketers and Twitter users, they discuss this fairly often. One recent post (technically a “tweet” but the term seems so childish) pointed to a study <a href="http://www.razorfish.com/download/img/content/Social%20Media%20Measurement%20Widgets%20and%20Applications.pdf">Social Media Measurement: Widgets and Applications</a> by interactive marketing agency <a href="http://www.razorfish.com/">Razorfish</a>. <br /><br />The study turns out to be a very brief and straightforward presentation of two projects, both involving creation of downloadable widgets. One was promoted largely through conventional media and the other through widget distribution service <a href="http://www.gigya.com/">Gigya</a>. For each project, we’re told the costs, number of visitors and/or downloads, how much time and money they spent, and the return on investment. The not-very-surprising findings were that people who spent more time also spent more money and, more broadly, that “social media may be used effectively as a way of engaging users and potential customers.” A less predictable and potentially more significant finding from the first project was that people who were referred by a friend downloaded more often and spent much more money than people who were attracted by the media. The numbers were: downloads, 23% vs. 8%; spend any money, 9% vs. 1%; and amount spent, $23.00 vs $3.14. 
But the study points out that the numbers were very small—only 216 individuals arrived at the landing page as a result of a friend’s email, vs. 41,599 from media sources. These figures are drawn only from the first project because the second project couldn’t be measured this way.<br /><br />From a marketing measurement standpoint, none of this seems to break any new ground. Visitors are tracked by their source URLs and subsequent behavior is tracked through cookies. The ROI is calculated on straight revenue (it really should be profit) and seems to include only immediate purchases. This is particularly problematic for the second project, which promoted a $399 product with very limited supply that sold out in one minute. (The study doesn’t say, but based on <a href="http://www.adoperationsonline.com/2008/09/10/avenue-arazorfish-gigya-honored-as-2008-mixx-award-finalist-for-levis%C2%AE-23501-widget-advertising-campaign/">this award citation</a> it seems to be a special edition Nike Air Jordan shoe.) Clearly the point of such Air Jordan promotions isn’t immediate revenue, but brand building at its hard-to-measure best. The real challenge of evaluating social media is measuring this type of indirect impact. This study makes no claim to do that, but I’ll keep my eyes out for others that do.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0tag:blogger.com,1999:blog-1380589722433800422.post-66891297593724781112009-01-21T10:44:00.000-08:002009-01-21T15:18:29.466-08:00Tealium Measures Response to Social MediaThe Internet promises marketers an exquisite measurability: you can tell precisely where each Web site visitor came from, and what people from each source do after they arrive. But non-advertising media such as blogs, online news articles, YouTube, Facebook and Twitter are a blind spot because many references to a company don’t contain a clickable link. 
(<a href="http://www.tealium.com/">Tealium</a>, whose solution I’ll discuss shortly, put the figure at 80% in one study.) Without a link, users either must type the destination URL into their browser or find the site through a search engine. Either way, the visit is not associated with the original source. Therefore, marketers who want to know, say, how many site visits were prompted by a particular YouTube video have no direct way to find out.<br /><br />Tealium, a developer of specialized Web analytics tools founded last year by veterans of WebSideStory/Visual Sciences, offers Tealium Social Media as a solution. It first builds a list of Internet references to a product, based on automated searches of sources such as Google News and Blogsearch, YouTube, Bloglines, Twitter, etc., plus any other RSS source you might have available. The system then checks whether visitors to a company Web site have previously visited one of these references by checking the cache of the visitor’s browser. If a match is found, the visit is attributed to that source.<br /><br />I’m going to stop right here and say that this struck me as raising a significant privacy issue. I hadn’t really given the matter any thought but had assumed my browser history was private. But a Google search on "read browser history" shows that a method to check whether someone has visited a specified URL is widely known. This is what Tealium does and it isn’t as invasive as simply reading everything. More important, Tealium doesn't track individuals: rather, it reports how many people come from a given source. This is little different from conventional Web analytics, so I guess there is no particular privacy objection to the product. And, yes, you can always clear your browser cache or shorten the retention period. Quick show of hands: how many of you have actually done that? I thought so. 
End of sermon.<br /><br />Tealium’s approach won’t be 100% accurate, since some people really do clean out their browser caches. A few people will also access a site from a different computer or browser than the one where they saw the reference. Nor will Tealium capture referrals, such as an email I sent you with a product’s name after reading an article about it. But most of these problems apply to other Web analytics techniques, and on the whole the data should be accurate enough to be useful. It will certainly give a good measure of the relative power of different sources.<br /><br />The system must also choose how to assign credit if the visitor’s cache contains more than one of the reference items. Tealium handles this by ranking the items on popularity and recency, and assigning the match to the highest ranked item. This seems reasonable.<br /><br />Of course, Tealium can only measure Web-based activities. This almost goes without saying, but it's worth reminding ourselves every so often that there are still plenty of non-Web interactions taking place.<br /><br />Tealium originally intended to present its social media results in a stand-alone interface. But the vendor decided a couple of months ago to instead feed them into existing Web analytics products, and Google Analytics in particular. This reduced the work Tealium had to perform (no reporting or data storage), hence lowering development and operating costs. From the client viewpoint, it integrates the social media results with other Web analytics, allowing direct comparisons between paid and unpaid media. In addition, downstream measures such as conversions or purchases automatically become available for the Tealium-derived sources. This was a very wise move.<br /><br />What Tealium won’t provide is measures of sentiment, such as whether a particular social media reference was praise or criticism, of comments on particular subjects, or of changes in customer attitudes. Nor does it claim to. 
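The credit-assignment rule described above—rank the cached references on popularity and recency, then credit the highest-ranked one—can be sketched roughly as follows. Tealium hasn’t published its actual weighting, so the equal-weight combination and the field names here are my assumptions:

```python
from datetime import datetime

def credit_source(matched_refs):
    """Pick one reference to credit for a site visit, given all the
    references found in the visitor's browser history.

    Each ref is a dict with 'url', 'popularity' (higher is better)
    and 'seen' (a datetime). The weighting is a guess: rank separately
    on popularity and on recency, then credit the best combined rank.
    """
    by_popularity = sorted(matched_refs, key=lambda r: -r["popularity"])
    by_recency = sorted(matched_refs, key=lambda r: r["seen"], reverse=True)

    def combined_rank(ref):
        return by_popularity.index(ref) + by_recency.index(ref)

    return min(matched_refs, key=combined_rank)

refs = [
    {"url": "blog-review", "popularity": 900, "seen": datetime(2009, 1, 15)},
    {"url": "youtube-clip", "popularity": 120, "seen": datetime(2009, 1, 2)},
    {"url": "tweet", "popularity": 40, "seen": datetime(2009, 1, 20)},
]
print(credit_source(refs)["url"])  # → blog-review: most popular, second most recent
```

However the two factors are actually weighted, the important design point survives: exactly one source gets the credit, which keeps the per-source counts additive.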
There are of course many other systems in this field; see last week’s post on <a href="http://mpmtoolkit.blogspot.com/2009/01/interesting-conference-on-real-time.html">reputation monitoring systems</a> for a pointer to a detailed list.<br /><br />Pricing of Social Media starts at $2,000 for implementation plus $250 per month with a one year contract. Price grows slightly as users add keywords and data feeds but is not related to actual traffic volume. The system has been in beta test with six clients until recently, and is being formally launched today.<br /><br />Social Media is Tealium’s third product. The other two are WebToCRM, which captures Web visitor data and posts it to a CRM system, and Universal Tag, which lets a single page tag feed visitor data to multiple Web analytics systems.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0tag:blogger.com,1999:blog-1380589722433800422.post-86573947231457131282009-01-15T05:58:00.000-08:002009-01-15T06:11:48.397-08:00Interesting Conference on Real Time Communications; Great List of Tools for Reputation MonitoringI spent yesterday morning at a conference on “Real-Time Communications” presented by the <a href="http://www.bdionline.com/">Business Development Institute</a> and sponsored by <a href="http://www.prnewswire.com/">PR Newswire</a>. Not surprisingly, given the sponsor, this turned out to be mostly by and for public relations professionals. This group’s main concern seemed to be reacting to public criticism, and “real time media” meant primarily blogging and Twitter. There was heavy representation from the pharmaceutical industry in particular, which, as several speakers mentioned with obvious frustration, is highly constrained by regulatory rules from making proactive comments. 
Beyond reacting to immediate crises, it seems the main media relations strategy of this group is to reach out to better educate the press about industry issues, so any reporting will be based on a reasonably accurate understanding of the situation. Apparently even this basic approach is somewhat revolutionary in the industry: keynote Ray Kerins of <a href="http://www.pfizer.com/">Pfizer</a> said that until he took over as VP Worldwide Communications two years ago, the company policy was to simply ignore the first phone call from any reporter. Interesting attitude, that.<br /><br />Kerins also provided perhaps the most intriguing factoid of the day, which was that 15,000 journalists lost their jobs in 2008. (I traced this figure to the Web site <a href="http://graphicdesignr.net/papercuts/">Paper Cuts</a> , which tracks reports of newspaper layoffs and buyouts. Apparently the total includes all newspaper employees, not just newsroom staff. But either way, it’s a big number.) Kerins’ comment was that many of the people being let go are well-trained and experienced reporters, who provide “context and analysis”. They are being replaced in many cases by bloggers and other non-professional observers who offer “speed” but are often not as knowledgeable, thorough or objective. This is a big issue, particularly for someone in a complicated industry such as pharmaceuticals.<br /><br />Another, related point came from Morgan Johnston, Corporate Communications Manager of <a href="http://www.jetblue.com/">JetBlue</a>, who described a situation where a customer complained while at the airport to 10,000 online readers about not being compensated properly when her baggage didn’t show up—only to have it appear 15 minutes later. (I’m not clear whether this was on Twitter or a conventional blog.) His point was that the damage was done, even if she posted a follow-up message saying that all was well. 
The original complaint will live on more or less forever, and people may not notice the final resolution. The particular moral here was the need to respond very quickly to such complaints so the company’s reaction becomes part of the permanent record. <br /><br />From my own perspective, I was struck by the focus on reacting to other people’s comments in real-time media, as opposed to using those media for a company’s own marketing programs. I suppose the outbound programs are run by marketing rather than public relations. <br /><br />On the specific issue of marketing measurement, no one at the conference seemed to feel they could meaningfully measure the return on investment of blogging and other projects. From the reactive PR perspective, it’s largely about being defensive and preventing damage to reputation, so it’s probably something you can’t afford not to do. The very little discussion I heard about proactive programs mentioned that it’s occasionally possible to count the direct leads or revenue, but there isn’t much of a way to measure the long-term financial value. This matches my own observations, mostly because the impact of these programs is usually too small to isolate from other factors that also affect performance. There might however be non-financial measures that are more sensitive, like Web site traffic by source.<br /><br />One very specific and highly valuable product of the conference was a casual remark by one panelist to look at a Web post by Dan Schawbel at <a href="http://mashable.com/">Mashable.com</a> for tools to measure brand reputation online. I tracked this down and found two extremely valuable posts, one describing <a href="http://mashable.com/2008/12/24/free-brand-monitoring-tools/">free brand monitoring tools</a> and another describing <a href="http://mashable.com/2008/12/29/brand-reputation-monitoring-tools/">paid reputation monitoring tools</a> (many of which are very inexpensive). 
There’s no point to my listing the products here, since you can just read the posts themselves. But this is very useful information – indeed, it made the whole morning worthwhile.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0tag:blogger.com,1999:blog-1380589722433800422.post-44103770125583464822008-12-18T18:53:00.000-08:002008-12-18T19:20:26.449-08:00Aberdeen Reports Show Varied Roles for Performance MeasurementOur friends at <a href="http://www.aberdeen.com/">Aberdeen Group</a> apply a highly standardized research process to technology issues. They take a survey that asks companies about their business performance and the business processes, organization, knowledge management, technologies and performance measures related to a technology. They then divide the companies into leaders (“best-in-class”), laggards and industry average based on their business performance, and compare replies for the different groups. The not-quite-stated implication is that the differences in performance are caused by differences in the other factors. This is not necessarily correct (the ever-popular <a href="http://en.wikipedia.org/wiki/Post_hoc_ergo_propter_hoc">post hoc ergo propter hoc</a> fallacy) and you could also wonder about the sample size (usually around 200) and how accurately people can answer such detailed questions. But so long as you don’t take the studies too seriously, they always give an interesting look at how firms at different maturity levels manage the technologies at hand.<br /><br />It so happens that three of the Aberdeen studies have been sitting on my desk for some time, so I had a chance to look at them together. 
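Aberdeen’s leader/laggard segmentation is essentially a percentile cut on a performance metric, with practices then compared across the groups. A minimal sketch of the idea—the top 20% / middle 50% / bottom 30% split is my assumption about their convention, and the data is invented:

```python
def classify(companies, metric):
    """Split companies into best-in-class / average / laggard groups
    by a performance metric (assumed 20/50/30 split)."""
    ranked = sorted(companies, key=lambda c: c[metric], reverse=True)
    n = len(ranked)
    cut1, cut2 = round(n * 0.2), round(n * 0.7)
    return {
        "best_in_class": ranked[:cut1],
        "average": ranked[cut1:cut2],
        "laggard": ranked[cut2:],
    }

# Hypothetical survey respondents with a lead-to-close rate.
companies = [{"name": f"co{i}", "lead_to_close": rate}
             for i, rate in enumerate([0.31, 0.08, 0.22, 0.15, 0.27,
                                       0.05, 0.18, 0.12, 0.25, 0.10])]
groups = classify(companies, "lead_to_close")
print([c["name"] for c in groups["best_in_class"]])  # → ['co0', 'co4']
```

The post hoc ergo propter hoc caveat applies to the sketch as much as to the reports: the split tells you what leaders do differently, not that doing those things made them leaders.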
The topics were <a href="http://www.aberdeen.com/summary/report/benchmark/5378-RA-successful-lead-generation.asp">Lead Nurturing</a>, <a href="http://www.aberdeen.com/summary/report/benchmark/5361-RA-trigger-marketing-timing.asp">Trigger Marketing</a> and <a href="http://www.aberdeen.com/summary/report/benchmark/5309-RA-multichannel-marketing-management.asp">Cross-Channel Campaign Management</a>. All are currently available for free although the sponsors may contact you in exchange. <br /><br />Since the Aberdeen reports all follow a similar format, it’s easy to compare their contents. From the perspective of marketing performance measurement, they contain two elements of interest. These are the performance measures highlighted as distinguishing best-in-class companies, and the role of measurement among recommended strategic actions. Here’s a brief look at each of these in the three reports:<br /><br /><strong>Lead Nurturing.</strong> The report highlighted number of qualified leads and lead-to-close ratio as critical performance measures, and found that 77% of best-in-class companies were tracking them. It also recommended tracking revenue associated with leads, although it found only 35% of best-in-class companies could do this. But otherwise, it didn’t see performance measurement as a central issue: the primary focus was on matching marketing messages to the prospect’s current stage in the buying cycle. Other important strategies were leveraging multiple channels, identifying prospect buying cycle and needs, and using automated lead scoring to move customers through the cycle.<br /><br /><strong>Trigger Marketing.</strong> This report did not identify particular marketing measures as critical, although it did say that having defined performance goals for trigger marketing programs is important. It reported the most common measure is change in response rates, used by 69% of all respondents. 
(The next most common measure, change in retention rates, was used by just 54%.) I take this as a sign of immaturity (among the respondents, not Aberdeen), since response rate is a primitive measure compared with profitability and return on marketing investment, which were used by 43% and 42% respectively. This is consistent with another finding: the most common strategic action is to “link trigger marketing activities to increased revenues and other business results” (32%). I interpret that as meaning people are just learning to make that linkage and are simply using response rate until they figure it out. It might be worth noting that the Aberdeen analyst highlighted digital dashboards as a next step for best-in-class companies wishing to do still better, although I didn’t see a particularly compelling case for selecting that over other possible activities. But I’m all in favor of dashboards, so I’m glad to see it.<br /><br /><strong>Cross-Channel Campaign Management.</strong> Again, the report doesn’t specify particular performance measures. It does say that it’s important to optimize future campaigns based on past performance (pretty obvious) and highlights real-time tracking of results across channels (less obvious, and I’m not sure I agree: immediate results may not correlate with long-term profitability). This report did include segmentation and analytics as strategic actions. (I consider these as part of performance measurement.) In particular, it stressed that best-in-class companies were focused on identifying their high value customers and treating them uniquely. Most of the recommendations, however, were about building the infrastructure needed to coordinate marketing messages across channels, and then executing those coordinated campaigns.<br /><br />So where does this leave us? 
I don’t draw any grand lessons from these three reports, except to note that financial measures (i.e., customer profitability and return on investment) don’t play much of a role in any of them. Even that probably just confirms that such measures are not widely available, which we already knew. But it’s good to know that people are working on performance measurement and that Aberdeen is baking it into its research.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0tag:blogger.com,1999:blog-1380589722433800422.post-75572369216238216202008-12-11T15:14:00.000-08:002008-12-11T15:29:01.757-08:00Survey: Marketing Accountability Measures Remain WeakEvery year since 2005, the <a href="http://www.ana.net/">Association of National Advertisers</a> and vendor <a href="http://www.mma.com/">MMA</a> (Marketing Management Analytics) have joined forces to produce a survey on marketing accountability. Although the details change each year, the general results have been sadly consistent: marketers, finance executives and senior management are very unhappy with their marketing measurement capabilities. <br /><br />In the 2008 study, <a href="http://www.mma.com/PressReleases/Press%20Release%2007.02.08.pdf">released in July</a> and just recapitulated in a <a href="http://www.mma.com/whitepapers.htm">new MMA white paper</a>, only 23% of the marketers were satisfied with their metrics for marketing’s impact on sales, and just 19% were satisfied with metrics showing marketing impact on ROI and brand equity. <br /><br />Furthermore, only 14% of the marketers felt their senior management had confidence in marketing’s forecasts of sales impact. And even this is probably optimistic: a separate MMA-funded study, also cited in the new white paper, found that only 10% of financial executives use marketing forecasts to help set the marketing budget.<br /><br />The obvious question is why so little progress has been made. 
Marketers consistently rank performance measurement as their top priority (for example, see the <a href="http://www.cmocouncil.org/">CMO Council</a>’s <a href="http://www.cmocouncil.org/resources/form_mo_execsummary.asp">Marketing Outlook 2008</a> survey). Nor are marketers doing this out of the goodness of their hearts: they know that being able to show the impact of their expenditures is the best way to protect and grow their budgets. So marketers have every reason to work hard at developing performance measures that finance and senior management will accept.<br /><br />And yet...when the ANA survey asked marketers to rank their accountability challenges, the top score (45%) went to “understanding the impact of changes in consumer attitudes and perceptions on sales”. This strikes me as odd, if the marketers’ ultimate goal is to understand the impact of marketing programs on sales. Measuring the impact of marketing programs and measuring the impact of customer attitudes are not the same thing. <br /><br />Nor is this a simple fluke of the wording. A separate question showed the most common accountability investment was in “brand and customer equity models” (53%). These also measure the link between attitudes and sales. <br /><br />One explanation for the disconnect would be that marketers can already measure the relationship between marketing programs and consumer attitudes, so they can complete the analysis by adding the link between attitudes and sales. This seems a bit optimistic, especially since it also assumes that marketers understand the impact on sales of marketing programs that are not aimed at consumer attitudes, such as price and trade promotions.<br /><br />A more plausible explanation would be that the link between attitudes and sales is the hardest thing to measure, so that’s where marketers put their effort. Or, maybe that relationship is the question that marketers find most intriguing because, well, that’s the sort of thing they care about. 
A cynic might suggest that marketers don’t want to measure the link between marketing programs and sales because they don’t want to know the answer. But even the cynic would acknowledge that marketers need a way to justify their budgets, so that can’t be it.<br /><br />None of these answers really satisfies me, but let’s put this question aside. I think we can safely assume that marketers really do want to measure their performance. This leaves the question of why they haven’t made much progress in doing it.<br /><br />One reason could be that they simply don’t know how. Marketing measurement is truly difficult, so that’s surely part of it. <br /><br />Another possibility is that they know how, but lack the resources. Since good marketing measurement can be quite expensive, this is probably part of the problem as well. Remember that the resources involved will ultimately come from the corporate budget, so finance departments and senior management must also agree that marketing measurement is the best thing to spend them on. And, indeed, this doesn’t seem to be their priority. The white paper states that “the number of CEOs and CFOs championing marketing accountability programs within their firms remained negligible and unchanged from 2007.”<br /><br />This is a pretty depressing conclusion, although to me it has the ring of truth. Fuss though they may, CEOs and CFOs are not willing to invest money to solve the problem. Indeed our friend the cynic might argue that they are the ones with a motivation to avoid measurement, since it gives them more flexibility to allocate funds as they prefer. <br /><br />The white paper doesn’t dwell on this. It just lists lack of senior management involvement as one of many obstacles. 
The paper authors then go on to propose a four-step process for developing an accountability program:<br /><br />- assess and benchmark existing capabilities and resources<br />- define an achievable future state, in terms of the business questions to answer and the resources required to answer them<br />- work with stakeholders to align metrics with corporate goals and key business questions<br />- establish a roadmap with a multi-year phased approach<br /><br />There’s not much to argue with here. The paper also provides a reasonable list of success factors, including:<br /><br />- realistic stakeholder expectations<br />- agreement on scope at the start of the project<br />- cross-functional team with clearly defined roles, responsibilities and communication points<br />- simple math and analytics<br />- integration of analytics for pricing, ROI, and brand analysis<br /><br />Again, it’s all sound advice. Let’s hope you can get the resources to follow it.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0tag:blogger.com,1999:blog-1380589722433800422.post-493327318702108332008-12-05T09:38:00.000-08:002008-12-05T09:59:14.466-08:00TraceWorks' Headlight Integrates Online Measurement and ExecutionI’ve been looking at an interesting product called Headlight from a Danish firm <a href="http://www.traceworks.com/">TraceWorks</a>. Headlight is an online advertising management system, which means that it helps marketers to plan, execute and measure paid and unpaid Web advertising.<br /><br />According to TraceWorks CEO Christian Dam, Headlight traces its origins to an earlier product, Statlynx, which measured the return on investment of search marketing campaigns. (This is why Headlight belongs on this blog.) The core technology of Headlight is still the ability to capture data sent by tags inserted in Web pages. These are used to track initial responses to a promotion and eventual conversion events. 
The conversion tracking is especially critical because it can capture revenue, which provides the basis for detailed return on investment calculations. (Setting this up does require help from your company's technology group; it is not something marketers can do for themselves.) <br /><br />These functions are now supplemented by others that let the system actually deliver banner ads, including both an ad serving capability and digital asset management of the ad contents. The system can also integrate with Google AdWords paid search campaigns, automatically sending tracking URLs to AdWords and using those URLs in its reports. It can also capture tracking URLs from email campaigns.<br /><br />All this Web activity tracking may make Headlight sound like a Web analytics tool, but it’s quite different. The main distinction is that Headlight lets users set up and deliver ad campaigns, which is well outside the scope of Web analytics. Nor, on the other hand, does Headlight offer the detailed visitor behavior analysis of a Web analytics system.<br /><br />The campaign management functions extend both to the planning that precedes execution and to the evaluation that follows it. The planning functions are not especially fancy but should be adequate: users can define activities (a term that Headlight uses more or less interchangeably with campaigns), give them start and end dates, and assign costs. The system can also distinguish between firm plans and drafts. TraceWorks expects to significantly expand workflow capabilities, including sub-tasks with assigned users, due dates and alerts of overdue items, in early 2009.<br /><br />Evaluation functions are more extensive. Users can define both corporate goals (e.g., total number of conversions) and individual goals (related to specific metrics and activities) for specific users, and have the system generate reports that will compare these to actual results. 
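To make the earlier point about conversion tracking concrete: once the conversion tag reports revenue, per-activity ROI is simple aggregation. A sketch with invented numbers (Headlight’s actual data model isn’t public, so names and figures here are hypothetical):

```python
from collections import defaultdict

# Each tag-reported conversion carries the activity (campaign) that
# drove it and the revenue captured on the conversion page.
conversions = [
    {"activity": "spring-banner", "revenue": 120.0},
    {"activity": "spring-banner", "revenue": 80.0},
    {"activity": "adwords-brand", "revenue": 45.0},
]
activity_cost = {"spring-banner": 150.0, "adwords-brand": 60.0}

revenue = defaultdict(float)
for c in conversions:
    revenue[c["activity"]] += c["revenue"]

for activity, cost in activity_cost.items():
    # Revenue-based ROI; substituting profit for gross revenue
    # would be the stricter version of the same calculation.
    roi = (revenue[activity] - cost) / cost
    print(f"{activity}: ROI {roi:.0%}")
```

The interesting part is not the arithmetic but the plumbing: the tag must reliably tie each conversion back to the activity that produced it, which is exactly what the tracking URLs provide.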
Separate Key Performance Indicator (KPI) reports show selected actual results over time. In addition, something the vendor calls a “WhyChart” adds marketing activity dates to the KPI charts, so users can see the correlation between different marketing efforts and results. Summary reports can also show the volume of traffic generated by different sources.<br /><br />The value of Headlight comes not only from the power of the individual features but also from the fact that they are tightly integrated. For example, the asset management portion of the system can show users the actual results for each asset in previous campaigns. This makes it much easier for marketers to pick the elements that work best and to make changes during campaigns when some items work better than others. The system can also be integrated with other products through a Web Service API that lets external systems call its functions for AdWords campaign management, conversion definition, activity setup, and reporting.<br /><br />Technology aside, I was quite impressed with the openness of TraceWorks as a company. The Web site provides substantial detail about the product, and includes a <a href="http://wiki.headlighthq.com/index.php?title=Main_Page">Wiki</a> with what looks like fairly complete documentation. The vendor also offers a 14-day free trial of the system.<br /><br />Pricing also seems quite reasonable. Headlight is offered as a hosted service, with fees ranging from $1,000 to $5,000 per month depending on Web traffic. According to Dam, the average fee is about $1,300 per month. Larger clients include ad agencies who use Headlight for their own clients.<br /><br />Incidentally, the company Web site also includes an interesting benchmarking offer, which lets you enter information about your own company's online marketing and get back a report comparing you to industry peers. (Yes, I know a marketing information gathering tool when I see one.) 
At the moment, unfortunately, the company doesn't seem to have enough data gathered to report back results. Or maybe it just didn't like my answers.<br /><br />TraceWorks released its original Statlynx product in 2003 and launched Headlight in early 2007. The system currently serves about 500 companies directly and through agencies.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com1tag:blogger.com,1999:blog-1380589722433800422.post-29897737310383199012008-11-28T12:25:00.000-08:002008-11-28T12:29:46.521-08:00Judging the Value of Marketing Data<p>Last week’s post on <a href="http://mpmtoolkit.blogspot.com/2008/11/i-recently-wanted-to-measure-relative.html">ranking demand generation vendors </a>highlighted a fundamental challenge in marketing measurement: the data you want often isn’t available. So a great deal of marketing measurement comes down to deciding which of the available data best suits your needs, and ultimately whether that data is better than nothing.<br /><br />It’s probably obvious why using bad data can be worse than doing nothing, but in case this is read by, say, a creature from Mars: we humans tend to assume others are telling the truth unless we have a specific reason to question them. This innate optimism is probably a good thing for society as a whole. But it also means we’ll use bad data to make decisions which we would approach more cautiously if we had no data at all.<br /><br />But how do you judge a piece of data? Here is a list of criteria presented in my book <em>The MPM Toolkit</em>, due in late January.<br /><br />· <em>Existence.</em> Ok, this is pretty basic, but the information does have to exist. Let’s avoid the deeper philosophical issues and just say that data exists if it is recorded somewhere, or can be derived from something that’s recorded. So the color of your customers’ eyes only exists as data if you’ve stored it on their records or can look it up somewhere else. 
If the data doesn’t exist, you may be able to capture it. Then you have to compare the cost of capturing it with its value. But that’s a topic for another day.<br /><br /><em>· Accessibility.</em> Can you actually access the data? To get back to last week’s post, we’d love to know the revenue of each demand generation vendor. This data certainly exists in their accounting systems, but they haven’t shared it with us so we can’t use it. Again, it’s often possible to gain access to information if you’re willing to pay the price, and you must once more compare the price with the value. In fact, the price/value tradeoff will apply to every factor in this list, so I won’t bother to mention it from here on out.<br /><br /><em>· Coverage.</em> What portion of the universe is covered by the data? In the case of demand generation vendors, the number of blog posts was a poor measure of market attention because the available sources clearly didn’t capture all the posts. In itself, this isn’t necessarily a fatal flaw, since a fair sample could still give a useful relative ranking. But we can’t judge whether the coverage was a fair sample because we don’t know why it was incomplete. This is a critical issue when assessing whether, or more precisely how, to use incomplete data. (In the demand generation case, the very small numbers of blog posts added another issue, which is that the statistical noise of a few random posts could distort the results. This is also something to consider, although hopefully most of your marketing data deals with larger quantities.)<br /><br /><em>· Accuracy.</em> Data may not have been accurate to begin with or it may be outdated. Data can be inaccurate because someone purposely provided false information or because the mechanism is inherently flawed. Survey replies can have both problems: people lie for various reasons and they may not actually know the correct answers. 
Even seemingly objective data can be incorrect: a simple temperature reading may be inaccurate because the thermometer was miscalibrated, someone read it wrong, or the scale was Celsius rather than Fahrenheit. Errors can also be introduced after the data is captured, such as incorrect conversions (e.g., inflation adjustments used to create “constant dollar” values) or incorrect aggregation (e.g., customer value statistics that do not associate transactions with the correct customers). In our demand generation example, statistics on search volume were highly inaccurate because the counts for some terms included results that were clearly irrelevant. As with other factors listed here, you need to determine the level of accuracy that’s required for your specific purpose and assess whether the particular source is adequate.<br /><br /><em>· Consistency.</em> Individually accurate items can be collectively incorrect. To continue with the thermometer example, readings from some stations may be in Celsius and others in Fahrenheit, or readings from a single station may have changed from Fahrenheit to Celsius over time. This particular difference would be obvious to anyone examining the data, although it could easily be overlooked in a large data set that combined information from many sources. Other inconsistencies are much more subtle, such as changes in wording of survey questions or the collection mechanism (e.g., media consumption diaries vs. automated “people meters”). As with coverage, it’s important to understand any bias introduced by these factors. In our demand generation analysis, <a href="http://www.compete.com/">Compete.com </a>used several different techniques to measure Web traffic, and it appeared that these yielded inconsistent results for sites with different traffic levels.<br /><br /><em>· Timeliness.</em> The primary issue with timeliness is how quickly data becomes available. In the past, it often took weeks or months to gather marketing information. 
Today, data in general moves much more quickly, although some information still takes months to assemble. There is a danger that quickly available data will overwhelm higher-quality data that appears later. For example, the initial response rate to a promotion is immediately available, but the value of those responses can only be measured over time. Decisions based only on gross response often turn out to be incorrect once the later performance is included in the analysis. Still, timely data can be extremely important when it can lead to adjustments that improve results, such as moving funds from one promotion to another. Online marketing in particular often allows for such reactions because changes can be made in hours or minutes, rather than the weeks and months needed for traditional marketing programs.<br /><br />I haven’t listed cost as a separate consideration only because there are often incremental investments that can be made to change a data element’s existence, accessibility, coverage, etc. Those investments would change its value as well. But you will ultimately still need to assess the total cost and value of a particular element, and then compare it with the cost and value of other elements that could serve a similar purpose. This assessment will often be fairly informal, as it was in last week’s blog post. But you still need to do it: while an unexamined life may or may not be worth living, unexamined marketing data will get you in trouble for sure.</p>David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0tag:blogger.com,1999:blog-1380589722433800422.post-77181313727498555482008-11-21T05:56:00.000-08:002008-11-21T11:05:42.271-08:00Twitter Volume for Demand Generation Vendors<p>A comment on my <a href="http://mpmtoolkit.blogspot.com/2008/11/i-recently-wanted-to-measure-relative.html">previous post </a>suggested Twitter mentions as a possible measure of vendor market presence. 
That had in fact occurred to me, but I hadn't bothered to check because I assumed the volume would be too low. But since the topic had been raised, I figured I'd take a peek.<br /><p>The first two Twitter monitoring sites I looked at, <a href="http://www.twitscoop.com/">Twitscoop</a> and <a href="http://www.twitterment.com/">Twitterment</a>, seemed to confirm my suspicion: of the three most popular vendors, <a href="http://www.eloqua.com/">Eloqua</a> had 6 Twitscoop hits and 3 Twitterment hits; <a href="http://www.silverpop.com/">Silverpop</a> had 2 on each; and <a href="http://www.marketo.com/">Marketo</a> had 3 on Twitscoop and none on the other. No point in looking further here.</p><p>But then I checked <a href="http://www.twitstat.com/">Twitstat</a>. In addition to having a slightly less childish name, it seems to either do a more thorough search or look back further in time: for whatever reason, it found 152 hits for Eloqua, 65 for Silverpop, and 133 for Marketo. Much more interesting.</p><p>Alas, the numbers dropped considerably after that, as you can see in the table below. Everything else is in single digits except for two anomalies: <a href="http://www.loopfuse.com/">LoopFuse</a>, with 22 mentions, and <a href="http://www.bulldogsolutions.com/">Bulldog Solutions</a>, with a whopping 217. Interestingly, both those sites also had exceptionally high blog hit numbers on IceRocket. The root cause is probably the same: one or two active bloggers or Twitter users (which seems to be the accepted term; I guess we can't call them Twits) are enough to skew the figures when volumes are so low. More particularly, LoopFuse gets a lot of attention because some of its founders are closely tied to the open source community. Bulldog Solutions just seems to have a group of employees who are seriously into Twitter. 
In fact, I now know more about their lives than I really care to (although there was nothing indiscreet in the posts, I'm pleased to report).</p><p>A couple of side notes:</p><p>- the very short length of the messages does make them easy to read, which paradoxically means you can actually gather more information from Twitter than by scanning blog posts, because reading the blog posts takes too much time. Of course, when we're dealing with such tiny volumes, there is no way to generalize from what you read: Twitter is strictly anecdotal evidence, and perhaps even dangerous for that reason.</p><p>- there seemed to be several Tweets sent expressly for marketing purposes. Nothing wrong with that, and they were quite open about it. Just interesting how quickly some firms have picked up on this. (OK, not so quickly: Twitter has been around since 2006 and very popular for about a year now.)</p><p>Still, the bottom line for the purposes of measuring demand generation vendors is the same as for blogs: too little volume to be a reliable measure of relative market interest.<br /><br /><table cellspacing="0" cellpadding="0" width="520" border="1"><tbody><tr><td valign="top" width="171"></td><td valign="top" width="100"><p align="center">Twitstat</p></td><td valign="top" width="83"><p align="center">Ice Rocket</p></td><td valign="top" width="79"><p align="center">Alexa</p></td><td valign="top" width="88"><p align="center">Alexa</p></td></tr><tr><td valign="top" width="171"><p align="center"></p></td><td valign="top" width="100"><p align="center">twitter mentions</p></td><td valign="top" width="83"><p align="center">blog posts</p></td><td valign="top" width="79"><p align="center">rank</p></td><td valign="top" width="88"><p align="center">share x 10^7</p></td></tr><tr><td valign="top" width="171">Already in Guide:</td><td valign="top" width="100"></td><td valign="top" width="83"></td><td valign="top" width="79"></td><td valign="top" width="88"></td></tr><tr><td 
valign="top" width="171">Eloqua</td><td valign="top" width="100"><p align="right">152</p></td><td valign="top" width="83"><p align="right">286</p></td><td valign="top" width="79"><p align="right">20,234</p></td><td valign="top" width="88"><p align="right">70,700</p></td></tr><tr><td valign="top" width="171">Silverpop</td><td valign="top" width="100"><p align="right">65</p></td><td valign="top" width="83"><p align="right">188</p></td><td valign="top" width="79"><p align="right">29,080</p></td><td valign="top" width="88"><p align="right">30,500</p></td></tr><tr><td valign="top" width="171"><p align="left">Marketo</p></td><td valign="top" width="100"><p align="right">122</p></td><td valign="top" width="83"><p align="right">229</p></td><td valign="top" width="79"><p align="right">68,088</p></td><td valign="top" width="88"><p align="right">17,000</p></td></tr><tr><td valign="top" width="171"><p align="left">Manticore Technology</p></td><td valign="top" width="100"><p align="right">0</p></td><td valign="top" width="83"><p align="right">56</p></td><td valign="top" width="79"><p align="right">213,546</p></td><td valign="top" width="88"><p align="right">6,100</p></td></tr><tr><td valign="top" width="171"><p align="left">Market2Lead</p></td><td valign="top" width="100"><p align="right">5</p></td><td valign="top" width="83"><p align="right">5</p></td><td valign="top" width="79"><p align="right">235,244</p></td><td valign="top" width="88"><p align="right">4,800</p></td></tr><tr><td valign="top" width="171"><p align="left">Vtrenz</p></td><td valign="top" width="100"><p align="right">8</p></td><br /><td valign="top" width="83"><p align="right">53</p></td><td valign="top" width="79"><p align="right">295,636</p></td><td valign="top" width="88"><p align="right">3,600</p></td></tr><tr><td valign="top" width="171"><p align="right">Marketing Automation:</p></td><td valign="top" width="100"><p align="right"></p></td><td valign="top" width="83"><p align="right"></p></td><td valign="top" 
width="79"><p align="right"></p></td><td valign="top" width="88"><p align="right">- </p></td></tr><tr><td valign="top" width="171"><p align="left">Unica Affinium</p></td><td valign="top" width="100"><p align="right">6</p></td><td valign="top" width="83"><p align="right">43</p></td><td valign="top" width="79"><p align="right">126,215</p></td><td valign="top" width="88"><p align="right">8,500</p></td></tr><tr><td valign="top" width="171"><p align="left">Alterian</p></td><td valign="top" width="100"><p align="right">5</p></td><td valign="top" width="83"><p align="right">145</p></td><td valign="top" width="79"><p align="right">345,543</p></td><td valign="top" width="88"><p align="right">2,500</p></td></tr><tr><td valign="top" width="171"><p align="left">Aprimo</p></td><td valign="top" width="100"><p align="right">6</p></td><td valign="top" width="83"><p align="right">139</p></td><td valign="top" width="79"><p align="right">416,446</p></td><td valign="top" width="88"><p align="right">2,200</p></td></tr><tr><td valign="top" width="171"><p align="left">Neolane</p></td><td valign="top" width="100"><p align="right">5</p></td><td valign="top" width="83"><p align="right">64</p></td><td valign="top" width="79"><p align="right">566,977</p></td><td valign="top" width="88"><p align="right">1,690</p></td></tr><tr><td valign="top" width="171"><p align="left">Other Demand Generation:</p></td><td valign="top" width="100"><p align="right"></p></td><td valign="top" width="83"><p align="right"></p></td><td valign="top" width="79"><p align="right"></p></td><td valign="top" width="88"><p align="right">- </p></td></tr><tr><td valign="top" width="171"><p align="left">Marketbright</p></td><td valign="top" width="100"><p align="right">9</p></td><td valign="top" width="83"><p align="right"></p></td><td valign="top" width="79"><p align="right">167,306</p></td><td valign="top" width="88"><p align="right">5,400</p></td></tr><tr><td valign="top" width="171"><p align="left">Pardot</p></td><td 
valign="top" width="100"><p align="right">4</p></td><td valign="top" width="83"><p align="right">33</p></td><td valign="top" width="79"><p align="right">211,309</p></td><td valign="top" width="88"><p align="right">3,600</p></td></tr><tr><td valign="top" width="171"><p align="left">Marqui Software</p></td><td valign="top" width="100"><p align="right">2</p></td><td valign="top" width="83"><p align="right">19</p></td><td valign="top" width="79"><p align="right">211,767</p></td><td valign="top" width="88"><p align="right">4,400</p></td></tr><tr><td valign="top" width="171"><p align="left">ActiveConversion</p></td><td valign="top" width="100"><p align="right">2</p></td><td valign="top" width="83"><p align="right">12</p></td><td valign="top" width="79"><p align="right">257,058</p></td><td valign="top" width="88"><p align="right">3,400</p></td></tr><tr><td valign="top" width="171"><p align="left">Bulldog Solutions</p></td><td valign="top" width="100"><p align="right">219</p></td><td valign="top" width="83"><p align="right">43</p></td><td valign="top" width="79"><p align="right">338,337</p></td><td valign="top" width="88"><p align="right">3,200</p></td></tr><tr><td valign="top" width="171"><p align="left">OfficeAutoPilot</p></td><td valign="top" width="100"><p align="right">2</p></td><td valign="top" width="83"><p align="right">5</p></td><td valign="top" width="79"><p align="right">509,868</p></td><td valign="top" width="88"><p align="right">2,000</p></td></tr><tr><td valign="top" width="171"><p align="left">Lead Genesys</p></td><td valign="top" width="100"><p align="right">1</p></td><td valign="top" width="83"><p align="right">5</p></td><td valign="top" width="79"><p align="right">557,199</p></td><td valign="top" width="88"><p align="right">1,450</p></td></tr><tr><td valign="top" width="171"><p align="left">LoopFuse</p></td><td valign="top" width="100"><p align="right">22</p></td><td valign="top" width="83"><p align="right">43</p></td><td valign="top" width="79"><p 
align="right">734,098</p></td><td valign="top" width="88"><p align="right">1,090</p></td></tr><tr><td valign="top" width="171"><p align="left">eTrigue</p></td><td valign="top" width="100"><p align="right">1</p></td><td valign="top" width="83"><p align="right"></p></td><td valign="top" width="79"><p align="right">1,510,207</p></td><td valign="top" width="88"><p align="right">430</p></td></tr><tr><td valign="top" width="171"><p align="left">PredictiveResponse</p></td><td valign="top" width="100"><p align="right">1</p></td><td valign="top" width="83"><p align="right">0</p></td><td valign="top" width="79"><p align="right">2,313,880</p></td><td valign="top" width="88"><p align="right">330</p></td></tr><tr><td valign="top" width="171"><p align="left">FirstWave Technologies</p></td><td valign="top" width="100"><p align="right">0</p></td><td valign="top" width="83"><p align="right">11</p></td><td valign="top" width="79"><p align="right">2,872,765</p></td><td valign="top" width="88"><p align="right">170</p></td></tr><tr><td valign="top" width="171"><p align="left">NurtureMyLeads</p></td><td valign="top" width="100"><p align="right">0</p></td><td valign="top" width="83"><p align="right">5</p></td><td valign="top" width="79"><p align="right">4,157,304</p></td><td valign="top" width="88"><p align="right">140</p></td></tr><tr><td valign="top" width="171"><p align="left">Customer Portfolios</p></td><td valign="top" width="100"><p align="right">0</p></td><td valign="top" width="83"><p align="right">3</p></td><td valign="top" width="79"><p align="right">5,097,525</p></td><td valign="top" width="88"><p align="right">90</p></td></tr><tr><td valign="top" width="171"><p align="left">Conversen</p></td><td valign="top" width="100"><p align="right">1</p></td><td valign="top" width="83"><p align="right">0</p></td><td valign="top" width="79"><p align="right">6,062,462</p></td><td valign="top" width="88"><p align="right">70</p></td></tr><tr><td valign="top" width="171"><p 
align="left">FirstReef</td><td valign="top" width="100"><p align="right">0</p></td><td valign="top" width="83"><p align="right">0</p></td><td valign="top" width="79"><p align="right">11,688,817</p></td><td valign="top" width="88"><p align="right">10</p></td></tr></tbody></table></p>David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0tag:blogger.com,1999:blog-1380589722433800422.post-18420641428580764922008-11-18T20:23:00.000-08:002008-11-18T21:56:51.471-08:00Comparing Web Activity Measures for Demand Generation VendorsI recently wanted to measure the relative popularity of several demand generation vendors, as part of deciding how to expand the <a href="http://www.raabguide.com/">Raab Guide to Demand Generation Systems</a>. This led to an interesting little journey which I think is worth sharing.<br /><br />I started with a list of 23 marketing system vendors. A couple are fairly large but most are quite small. These were grouped into three categories: five demand generation vendors already in the Guide; four marketing automation vendors with significant demand generation market presence; and fourteen other demand generation vendors. (See <a href="http://www.raabguide.com/">http://www.raabguide.com/</a> for definitions of demand generation and marketing automation.)<br /><br />My first thought was to look at their Web site traffic directly. The easiest way to do this is at <a href="http://www.alexa.com/">Alexa.com</a>, which tracks site visits of people who download its search toolbar. The number of users in this base is apparently a well-guarded secret, or at least well enough guarded that I would have had to look beyond the first Google search page for the answer. Alexa was originally classified by many experts as spyware, and is still somewhat controversial. But it was purchased by Amazon.com in 1999 and has since become more or less grudgingly accepted.<br /><br />Be that as it may. 
I captured two statistics for each of my sites from Alexa: a ranking which basically reflects the number of pages viewed by unique visitors each month (the busiest site gets rank 1, next busiest gets rank 2, etc.); and a share figure that shows the percentage of total toolbar users who visit each site each month. (I think I have that correct; you can check the definitions at Alexa.com.) Ranking on either figure gives the same sequence (except for Pardot; I have no idea why). If you’re creating ratios or an index, the difference in the share figures is probably a better indicator of relative popularity, since a company with twice the share of another has twice as many visitors, but will not necessarily have a rank number that is twice as low. (Lower rank means more traffic.)<br /><br />Here are the ranks I came up with, broken into the three segments I mentioned earlier:<br /><br /><table cellspacing="0" cellpadding="0" width="304" border="1"><tbody><tr><td width="145" colspan="2"></td><td width="159" colspan="2">Alexa - 3 mo average</td></tr><tr><td width="145" colspan="2">Already in Guide:</td><td width="81">rank</td><td width="77">share</td></tr><tr><td width="145" colspan="2"><a href="http://www.eloqua.com/">Eloqua</a></td><td width="81">20,234</td><td width="77">0.0070700</td></tr><tr><td width="145" colspan="2"><a href="http://www.silverpop.com/">Silverpop</a></td><td width="81">29,080</td><td width="77">0.0030500</td></tr><tr><td width="145" colspan="2"><a href="http://www.marketo.com/">Marketo</a></td><td width="81">68,088</td><td width="77">0.0017000</td></tr><tr><td width="145" colspan="2"><a href="http://www.manticoretechnology.com/">Manticore Technology</a></td><td width="81">213,546</td><td width="77">0.0006100</td></tr><tr><td width="145" colspan="2"><a href="http://www.market2lead.com/">Market2Lead</a></td><td width="81">235,244</td><td width="77">0.0004800</td></tr><tr><td width="145" colspan="2"><a href="http://www.vtrenz.com/">Vtrenz</a></td><td 
width="81">295,636</td><td width="77">0.0003600</td></tr><tr><td width="304" colspan="4">Marketing Automation Vendors:</td></tr><tr><td width="145" colspan="2"><a href="http://www.unica.com/">Unica</a> / Affinium*</td><td width="81">126,215</td><td width="77">0.0008500</td></tr><tr><td width="145" colspan="2"><a href="http://www.alterian.com/">Alterian</a></td><td width="81">345,543</td><td width="77">0.0002500</td></tr><tr><td width="145" colspan="2"><a href="http://www.aprimo.com/">Aprimo</a></td><td width="81">416,446</td><td width="77">0.0002200</td></tr><tr><td width="145" colspan="2"><a href="http://www.neolane.com/">Neolane</a></td><td width="81">566,977</td><td width="77">0.0001690</td></tr><tr><td width="304" colspan="4">Other Demand Generation:</td></tr><tr><td width="138"><a href="http://www.marketbright.com/">MarketBright</a></td><td width="88" colspan="2">167,306</td><td width="77">0.0005400</td></tr><tr><td width="138"><a href="http://www.pardot.com/">Pardot</a></td><td width="88" colspan="2">211,309</td><td width="77">0.0003600</td></tr><tr><td width="138"><a href="http://www.marqui.com/">Marqui</a> *</td><td width="88" colspan="2">211,767</td><td width="77">0.0004400</td></tr><tr><td width="138"><a href="http://www.activeconversion.com/">ActiveConversion</a></td><td width="88" colspan="2">257,058</td><td width="77">0.0003400</td></tr><tr><td width="138"><a href="http://www.bulldogsolutions.com/">Bulldog Solutions</a></td><td width="88" colspan="2">338,337</td><td width="77">0.0003200</td></tr><tr><td width="138"><a href="http://www.officeautopilot.com/">OfficeAutoPilot</a></td><td width="88" colspan="2">509,868</td><td width="77">0.0002000</td></tr><tr><td width="138"><a href="http://www.leadgenesys.com/">Lead Genesys</a></td><td width="88" colspan="2">557,199</td><td width="77">0.0001450</td></tr><tr><td width="138"><a href="http://www.loopfuse.com/">LoopFuse</a></td><td width="88" colspan="2">734,098</td><td width="77">0.0001090</td></tr><tr><td 
width="138"><a href="http://www.predictiveresponse.com/">PredictiveResponse</a></td><td width="88" colspan="2">2,313,880</td><td width="77">0.0000330</td></tr><tr><td width="138"><a href="http://www.firstwave.com/">FirstWave Technologies</a></td><td width="88" colspan="2">2,872,765</td><td width="77">0.0000170</td></tr><tr><td width="138"><a href="http://www.nurturemyleads.com/">NurtureMyLeads</a></td><td width="88" colspan="2">4,157,304</td><td width="77">0.0000140</td></tr><tr><td width="138"><a href="http://www.customerportfolios.com/">Customer Portfolios</a></td><td width="88" colspan="2">5,097,525</td><td width="77">0.0000090</td></tr><tr><td width="138"><a href="http://www.conversen.com/">Conversen</a>*</td><td width="88" colspan="2">6,062,462</td><td width="77">0.0000070</td></tr><tr><td width="138"><a href="http://www.firstreef.com/">FirstReef</a></td><td width="88" colspan="2">11,688,817</td><td width="77">0.0000010</td></tr></tbody></table><br />These rankings were more or less as I expected. Within the first group, Eloqua is definitely the largest vendor, while Marketo is probably the most aggressive marketer at the moment. Vtrenz is the second-largest demand generation company, based on number of clients and almost certainly on revenue. But it is a subsidiary of Silverpop, so its traffic is split between Vtrenz.com and visits to Silverpop.com. This means that the Vtrenz.com ranking understates the company’s position, while the Silverpop ranking includes traffic unrelated to demand generation. I’ve therefore tracked both here. Manticore and Market2Lead get much less attention than the other three, so it makes sense that they have much less traffic.<br /><br />Figures for the next group also seem to be ranked about correctly. Unica is certainly the most prominent of this group, with Alterian, Aprimo and Neolane trailing quite far behind. 
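(As an aside, the share-versus-rank point made earlier is easy to demonstrate. Here is a rough Python sketch, with share and rank figures transcribed from the Alexa table above; the code is my own illustration, not anything Alexa itself provides.)

```python
# Index each vendor's Alexa share to the leader (share values transcribed
# from the table above). Shares scale linearly with visitors, so the index
# is directly comparable; rank numbers do not scale the same way.
shares = {
    "Eloqua": 0.0070700,
    "Silverpop": 0.0030500,
    "Marketo": 0.0017000,
    "Manticore Technology": 0.0006100,
    "Market2Lead": 0.0004800,
    "Vtrenz": 0.0003600,
}
ranks = {"Eloqua": 20234, "Silverpop": 29080}

top = max(shares.values())
index = {name: round(100 * s / top) for name, s in shares.items()}
print(index)  # Eloqua = 100; Silverpop indexes at about 43

# Silverpop has roughly 43% of Eloqua's share, but Eloqua's rank is about
# 70% of Silverpop's -- rank ratios understate the traffic difference.
share_ratio = shares["Silverpop"] / shares["Eloqua"]
rank_ratio = ranks["Eloqua"] / ranks["Silverpop"]
print(round(share_ratio, 3), round(rank_ratio, 3))
```

In other words, if you want a popularity index, build it from the share column, not the rank column.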
I would have expected a bit more traffic for Neolane, but it is definitely the new kid on this block and only entered the U.S. market about one year ago. The real surprise here is that this group as a whole ranks so far below the big demand generation vendors, even though the marketing automation firms are in fact larger and probably do more promotion. Perhaps the marketing automation vendors appeal to a smaller number of potential users (primarily, marketers in large companies with direct customer contact, such as financial services, retail, travel and telecommunications) and generate less traffic as a result.<br /><br />I didn’t have much sense of the relative positions of the other demand generation vendors, although I would have guessed that MarketBright and Pardot were near the top. Marqui has had little attention recently, perhaps because they’ve been through financial difficulties culminating in the purchase of their assets by a private investor group this past August. ActiveConversion I do know, only because I’ve spoken with them, and they rank about where I expected given their number of clients. The other names were somewhat familiar but the only one I’d ever spoken with was OfficeAutoPilot, which I knew to be small. Since I had no fully formed expectations, the rankings couldn’t surprise me.<br /><br />In other words, the rankings provided by Alexa seemed generally reasonable given my knowledge of the companies concerned.<br /><br />But Web traffic is just one measure. Where else could I look to confirm or challenge these impressions?<br /><br />Well, there is another Web traffic site that is somewhat similar to Alexa, called <a href="http://www.compete.com/">Compete.com</a>. I actually hadn’t heard of them before but they came up in my research. They apparently use their own toolbar but also some other Web traffic measures such as volumes reported by Internet Service Providers (ISPs). You’d expect them to pretty much match the Alexa figures. 
But do they? Here is a chart comparing the two, with the Alexa multiplied by 10 ^7 to make them more legible.<br /><br /><table cellspacing="0" cellpadding="0" width="321" border="1"><tbody><tr><td width="145"></td><td width="99">Compete.com</td><td width="77">Alexa.com</td></tr><tr><td width="145"></td><td width="99">unique visitors / month</td><td width="77">share x 10^7</td></tr><tr><td width="321" colspan="3">Already in Guide:</td></tr><tr><td width="145">Eloqua</td><td width="99">560,288</td><td width="77">70,700</td></tr><tr><td width="145">Silverpop</td><td width="99">293,580</td><td width="77">30,500</td></tr><tr><td width="145">Marketo</td><td width="99">34,244</td><td width="77">17,000</td></tr><tr><td width="145">Manticore Technology</td><td width="99">15,789</td><td width="77">6,100</td></tr><tr><td width="145">Market2Lead</td><td width="99">10,689</td><td width="77">4,800</td></tr><tr><td width="145">Vtrenz</td><td width="99">5,313</td><td width="77">3,600</td></tr><tr><td width="321" colspan="3">Marketing Automation Vendors:</td></tr><tr><td width="145">Unica / Affinium*</td><td width="99">23,138</td><td width="77">8,500</td></tr><tr><td width="145">Alterian</td><td width="99">4,497</td><td width="77">2,500</td></tr><tr><td width="145">Aprimo</td><td width="99">5,131</td><td width="77">2,200</td></tr><tr><td width="145">Neolane</td><td width="99">3,927</td><td width="77">1,690</td></tr><tr><td width="321" colspan="3">Other Demand Generation:</td></tr><tr><td width="145">MarketBright</td><td width="99">13,993</td><td width="77">5,400</td></tr><tr><td width="145">Pardot</td><td width="99">7,339</td><td width="77">3,600</td></tr><tr><td width="145">Marqui *</td><td width="99">3,282</td><td width="77">4,400</td></tr><tr><td width="145">ActiveConversion</td><td width="99">1,503</td><td width="77">3,400</td></tr><tr><td width="145">Bulldog Solutions</td><td width="99">6,408</td><td width="77">3,200</td></tr><tr><td width="145">OfficeAutoPilot</td><td 
width="99">1,567</td><td width="77">2,000</td></tr><tr><td width="145">Lead Genesys</td><td width="99">2,630</td><td width="77">1,450</td></tr><tr><td width="145">LoopFuse</td><td width="99">1,930</td><td width="77">1,090</td></tr><tr><td width="145">PredictiveResponse</td><td width="99">1,099</td><td width="77">330</td></tr><tr><td width="145">FirstWave Technologies</td><td width="99">-</td><td width="77">170</td></tr><tr><td width="145">NurtureMyLeads</td><td width="99">-</td><td width="77">140</td></tr><tr><td width="145">Customer Portfolios</td><td width="99">-</td><td width="77">90</td></tr><tr><td width="145">Conversen*</td><td width="99">-</td><td width="77">70</td></tr><tr><td width="145">FirstReef</td><td width="99">-</td><td width="77">10</td></tr></tbody></table><br />You don’t need Sherlock Holmes to spot the problem: the Compete.com figures for Eloqua and Silverpop seem much too high compared with the others. I could concoct a theory that this reflects the difference between counting unique visitors in Compete.com and counting page views in Alexa, and throw in the fact that Eloqua and Silverpop/Vtrenz host landing pages for their clients. But the other demand generation vendors also host their clients’ pages, so this shouldn’t really matter. I suspect what really happens is that Compete measures low volumes differently from higher volumes (remember, they use a combination of techniques), and thus the figures for high-volume Eloqua and Silverpop are inconsistent with figures for the other, much lower-volume domains.<br /><br />Anyway, if we throw away those two, the rest of the Compete figures seem more or less in line with the Alexa figures, apart from some small exceptions (Bulldog in particular ranks higher). All told, it doesn’t seem that Compete adds much value to what I already got from Alexa.<br /><br />So much for Web traffic. How about search volume? <a href="http://www.google.com/">Google</a> Keywords will give that to me. 
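(One more aside on the Compete-versus-Alexa table above: the eyeball comparison can be made a bit more formal with a rank correlation, which asks only whether the two sources agree on the ordering of vendors, ignoring their very different absolute scales. Here is a rough Python sketch with figures transcribed from that table, Eloqua and Silverpop excluded for the reasons just discussed; the helper functions are my own, not part of either service.)

```python
def avg_ranks(values):
    """Rank values from 1..n, giving tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over any run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman correlation = Pearson correlation of the rank vectors."""
    rx, ry = avg_ranks(xs), avg_ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Marketo through PredictiveResponse, in table order (Compete unique
# visitors/month vs. Alexa share x 10^7); "-" rows omitted.
compete = [34244, 15789, 10689, 5313, 23138, 4497, 5131, 3927,
           13993, 7339, 3282, 1503, 6408, 1567, 2630, 1930, 1099]
alexa   = [17000, 6100, 4800, 3600, 8500, 2500, 2200, 1690,
           5400, 3600, 4400, 3400, 3200, 2000, 1450, 1090, 330]
print(round(spearman(compete, alexa), 2))  # near 1 = sources agree on ordering
```

A coefficient close to 1 would say the two sources tell the same relative story even where their raw numbers diverge; for these vendors it comes out fairly high, which matches the "more or less in line" impression above.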
Again, we’ll compare to Alexa as a reference:<br /><br /><table cellspacing="0" cellpadding="0" width="365" border="1"><tbody><tr><td width="171"></td><td width="105">Google Keywords</td><td width="89">Alexa</td></tr><tr><td width="171"></td><td width="105">avg search volume</td><td width="89">share x 10^7</td></tr><tr><td width="171">Already in Guide:</td><td width="105"></td><td width="89"></td></tr><tr><td width="171">Eloqua</td><td width="105"><p align="right">1,900</p></td><td width="89"><p align="right">70,700</p></td></tr><tr><td width="171">Silverpop</td><td width="105"><p align="right">1,790</p></td><td width="89"><p align="right">30,500</p></td></tr><tr><td width="171">Marketo</td><td width="105"><p align="right">839</p></td><td width="89"><p align="right">17,000</p></td></tr><tr><td width="171">Manticore Technology</td><td width="105"><p align="right">113</p></td><td width="89"><p align="right">6,100</p></td></tr><tr><td width="171">Market2Lead</td><td width="105"><p align="right">318</p></td><td width="89"><p align="right">4,800</p></td></tr><tr><td width="171">Vtrenz</td><td width="105"><p align="right">752</p></td><td width="89"><p align="right">3,600</p></td></tr><tr><td width="171">Marketing Automation:</td><td width="105"><p align="right"></p></td><td width="89"><p align="right">- </p></td></tr><tr><td width="171">Unica / Affinium*</td><td width="105"><p align="right">6,600</p></td><td width="89"><p align="right">8,500</p></td></tr><tr><td width="171">Alterian</td><td width="105"><p align="right">861</p></td><td width="89"><p align="right">2,500</p></td></tr><tr><td width="171">Aprimo</td><td width="105"><p align="right">1,600</p></td><td width="89"><p align="right">2,200</p></td></tr><tr><td width="171">Neolane</td><td width="105"><p align="right">1,340</p></td><td width="89"><p align="right">1,690</p></td></tr><tr><td width="171">Other Demand Generation:</td><td width="105"><p align="right"></p></td><td width="89"><p align="right">- 
</p></td></tr><tr><td width="171">MarketBright</td><td width="105"><p align="right">186</p></td><td width="89"><p align="right">5,400</p></td></tr><tr><td width="171">Pardot</td><td width="105"><p align="right">210</p></td><td width="89"><p align="right">3,600</p></td></tr><tr><td width="171">Marqui *</td><td width="105"><p align="right">1,300</p></td><td width="89"><p align="right">4,400</p></td></tr><tr><td width="171">ActiveConversion</td><td width="105"><p align="right">46</p></td><td width="89"><p align="right">3,400</p></td></tr><tr><td width="171">Bulldog Solutions</td><td width="105"><p align="right">442</p></td><td width="89"><p align="right">3,200</p></td></tr><tr><td width="171">OfficeAutoPilot</td><td width="105"><p align="right">0</p></td><td width="89"><p align="right">2,000</p></td></tr><tr><td width="171">Lead Genesys</td><td width="105"><p align="right">74</p></td><td width="89"><p align="right">1,450</p></td></tr><tr><td width="171">LoopFuse</td><td width="105"><p align="right">260</p></td><td width="89"><p align="right">1,090</p></td></tr><tr><td width="171">PredictiveResponse</td><td width="105"><p align="right">36</p></td><td width="89"><p align="right">330</p></td></tr><tr><td width="171">FirstWave Technologies</td><td width="105"><p align="right">386</p></td><td width="89"><p align="right">170</p></td></tr><tr><td width="171">NurtureMyLeads</td><td width="105"><p align="right">0</p></td><td width="89"><p align="right">140</p></td></tr><tr><td width="171">Customer Portfolios</td><td width="105"><p align="right">0</p></td><td width="89"><p align="right">90</p></td></tr><tr><td width="171">Conversen*</td><td width="105"><p align="right">170</p></td><td width="89"><p align="right">70</p></td></tr><tr><td width="171">FirstReef</td><td width="105"><p align="right">12</p></td><td width="89"><p align="right">10</p></td></tr></tbody></table><br />If we limit ourselves to the first two groups, the search numbers look mostly plausible. 
The low figure for Manticore could have to do with checking specifically for “Manticore Technology”, since a looser “Manticore” would incorporate an unrelated company and references to the mythical beast. The high value for Unica probably reflects some unrelated uses of the word in other languages or as an acronym. I have no particular explanation for the relatively low value for Alterian or the substantial flattening of the range between Eloqua and its competitors. Perhaps Eloqua’s traffic is less search-driven than other vendors’. Or not. In any event, I think the implicit rankings here are about as plausible as the Alexa rankings.<br /><br />But things get crazier in the Other Demand Generation vendor segment. I understand the Marqui number, which is high because Marqui can be a misspelling of other words (marquis, marque, marquee) and has some unrelated non-English meanings. Similarly, Conversen is a verb form in Spanish. I think that Bulldog Solutions, FirstWave and LoopFuse also gain some hits because of their component words, even though I tried to keep them out of the search results. The bottom line here is that you have to throw away so many terms that the remaining rankings don’t signify much. So, in general, search keyword rankings need close consideration before you can accept them as a meaningful measure of importance.<br /><br />How about Google hits? 
I’ll show them alongside the Google Keywords as well as Alexa rank.<br /><br /><table cellspacing="0" cellpadding="0" width="439" border="1"><tbody><tr><td width="171"></td><td width="87">Google hits</td><td width="96">Google Keywords</td><td width="86">Alexa</td></tr><tr><td width="171"></td><td width="87"></td><td width="96">avg search volume</td><td width="86">share x 10^7</td></tr><tr><td width="171">Already in Guide:</td><td width="87"></td><td width="96"></td><td width="86"></td></tr><tr><td width="171">Eloqua</td><td width="87"><p align="right">118,000</p></td><td width="96"><p align="right">1,900</p></td><td width="86"><p align="right">70,700</p></td></tr><tr><td width="171">Silverpop</td><td width="87"><p align="right">111,000</p></td><td width="96"><p align="right">1,790</p></td><td width="86"><p align="right">30,500</p></td></tr><tr><td width="171">Marketo</td><td width="87"><p align="right">103,000</p></td><td width="96"><p align="right">839</p></td><td width="86"><p align="right">17,000</p></td></tr><tr><td width="171">Manticore Technology</td><td width="87"><p align="right">9,620</p></td><td width="96"><p align="right">113</p></td><td width="86"><p align="right">6,100</p></td></tr><tr><td width="171">Market2Lead</td><td width="87"><p align="right">25,900</p></td><td width="96"><p align="right">318</p></td><td width="86"><p align="right">4,800</p></td></tr><tr><td width="171">Vtrenz</td><td width="87"><p align="right">35,200</p></td><td width="96"><p align="right">752</p></td><td width="86"><p align="right">3,600</p></td></tr><tr><td width="171">Marketing Automation:</td><td width="87"><p align="right"></p></td><td width="96"><p align="right"></p></td><td width="86"><p align="right">- </p></td></tr><tr><td width="171">Unica / Affinium*</td><td width="87"><p align="right">7,750</p></td><td width="96"><p align="right">6,600</p></td><td width="86"><p align="right">8,500</p></td></tr><tr><td width="171">Alterian</td><td width="87"><p 
align="right">262,000</p></td><td width="96"><p align="right">861</p></td><td width="86"><p align="right">2,500</p></td></tr><tr><td width="171">Aprimo</td><td width="87"><p align="right">161,000</p></td><td width="96"><p align="right">1,600</p></td><td width="86"><p align="right">2,200</p></td></tr><tr><td width="171">Neolane</td><td width="87"><p align="right">40,200</p></td><td width="96"><p align="right">1,340</p></td><td width="86"><p align="right">1,690</p></td></tr><tr><td width="171">Other Demand Generation:</td><td width="87"><p align="right"></p></td><td width="96"><p align="right"></p></td><td width="86"><p align="right">- </p></td></tr><tr><td width="171">MarketBright</td><td width="87"><p align="right">34,500</p></td><td width="96"><p align="right">186</p></td><td width="86"><p align="right">5,400</p></td></tr><tr><td width="171">Pardot</td><td width="87"><p align="right">27,600</p></td><td width="96"><p align="right">210</p></td><td width="86"><p align="right">3,600</p></td></tr><tr><td width="171">Marqui *</td><td width="87"><p align="right">1,370,000</p></td><td width="96"><p align="right">1,300</p></td><td width="86"><p align="right">4,400</p></td></tr><tr><td width="171">ActiveConversion</td><td width="87"><p align="right">16,800</p></td><td width="96"><p align="right">46</p></td><td width="86"><p align="right">3,400</p></td></tr><tr><td width="171">Bulldog Solutions</td><td width="87"><p align="right">9,340</p></td><td width="96"><p align="right">442</p></td><td width="86"><p align="right">3,200</p></td></tr><tr><td width="171">OfficeAutoPilot</td><td width="87"><p align="right">777</p></td><td width="96"><p align="right">0</p></td><td width="86"><p align="right">2,000</p></td></tr><tr><td width="171">Lead Genesys</td><td width="87"><p align="right">5,880</p></td><td width="96"><p align="right">74</p></td><td width="86"><p align="right">1,450</p></td></tr><tr><td width="171">LoopFuse</td><td width="87"><p align="right">95,400</p></td><td
width="96"><p align="right">260</p></td><td width="86"><p align="right">1,090</p></td></tr><tr><td width="171">PredictiveResponse</td><td width="87"><p align="right">21,800</p></td><td width="96"><p align="right">36</p></td><td width="86"><p align="right">330</p></td></tr><tr><td width="171">FirstWave Technologies</td><td width="87"><p align="right">13,400</p></td><td width="96"><p align="right">386</p></td><td width="86"><p align="right">170</p></td></tr><tr><td width="171">NurtureMyLeads</td><td width="87"><p align="right">1,050</p></td><td width="96"><p align="right">0</p></td><td width="86"><p align="right">140</p></td></tr><tr><td width="171">Customer Portfolios</td><td width="87"><p align="right">12,200</p></td><td width="96"><p align="right">0</p></td><td width="86"><p align="right">90</p></td></tr><tr><td width="171">Conversen*</td><td width="87"><p align="right">2,790</p></td><td width="96"><p align="right">170</p></td><td width="86"><p align="right">70</p></td></tr><tr><td width="171">FirstReef</td><td width="87"><p align="right">18,100</p></td><td width="96"><p align="right">12</p></td><td width="86"><p align="right">10</p></td></tr></tbody></table><br />Here the impact of limiting Manticore to “Manticore Technology” shows up even more clearly (although Manticore truly doesn’t get much Web attention). I limited the Unica test to “Unica Affinium” since the number of hits is otherwise over 100 million; but this seems to excessively depress the results. Note that the low ranking for Alterian has now been reversed; in fact, Alterian has the most hits of all, and the marketing automation group in general shows more activity than the demand generation vendors. That could be true – those vendors have been around longer. Or it could be a fluke.<br /><br />Once again, the Other Demand Generation group has a big problem with Marqui and perhaps smaller problems with LoopFuse and FirstReef. Even excluding those, the numbers jump around a great deal. 
As with keywords, these figures don’t seem to be a reliable measure of anything.<br /><br />Let’s try one more measure: the blogosphere. Here I tried three different services: <a href="http://www.technorati.com/">Technorati</a>, <a href="http://www.blogpulse.com/">BlogPulse</a> and <a href="http://www.icerocket.com/">Ice Rocket</a>.<br /><br /><table cellspacing="0" cellpadding="0" width="473" border="1"><tbody><tr><td width="171"></td><td width="75">Technorati</td><td width="67">Blogpulse</td><td width="77">Ice Rocket</td><td width="84">Alexa</td></tr><tr><td width="171"></td><td width="75">blog posts</td><td width="67">blog posts</td><td width="77">all posts</td><td width="84">share x 10^7</td></tr><tr><td width="171">Already in Guide:</td><td width="75"></td><td width="67"></td><td width="77"></td><td width="84"></td></tr><tr><td width="171">Eloqua</td><td width="75"><p align="right">130</p></td><td width="67"><p align="right">267</p></td><td width="77"><p align="right">286</p></td><td width="84"><p align="right">70,700</p></td></tr><tr><td width="171">Silverpop</td><td width="75"><p align="right">70</p></td><td width="67"><p align="right">119</p></td><td width="77"><p align="right">188</p></td><td width="84"><p align="right">30,500</p></td></tr><tr><td width="171">Marketo</td><td width="75"><p align="right">3</p></td><td width="67"><p align="right">179</p></td><td width="77"><p align="right">229</p></td><td width="84"><p align="right">17,000</p></td></tr><tr><td width="171">Manticore Technology</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">12</p></td><td width="77"><p align="right">56</p></td><td width="84"><p align="right">6,100</p></td></tr><tr><td width="171">Market2Lead</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">7</p></td><td width="77"><p align="right">25</p></td><td width="84"><p align="right">4,800</p></td></tr><tr><td width="171">Vtrenz</td><td width="75"><p align="right">0</p></td><td 
width="67"><p align="right">30</p></td><td width="77"><p align="right">53</p></td><td width="84"><p align="right">3,600</p></td></tr><tr><td width="171">Marketing Automation:</td><td width="75"><p align="right"></p></td><td width="67"><p align="right"></p></td><td width="77"><p align="right"></p></td><td width="84"><p align="right">- </p></td></tr><tr><td width="171">Unica / Affinium*</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">6</p></td><td width="77"><p align="right">43</p></td><td width="84"><p align="right">8,500</p></td></tr><tr><td width="171">Alterian</td><td width="75"><p align="right">8</p></td><td width="67"><p align="right">119</p></td><td width="77"><p align="right">145</p></td><td width="84"><p align="right">2,500</p></td></tr><tr><td width="171">Aprimo</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">118</p></td><td width="77"><p align="right">139</p></td><td width="84"><p align="right">2,200</p></td></tr><tr><td width="171">Neolane</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">33</p></td><td width="77"><p align="right">64</p></td><td width="84"><p align="right">1,690</p></td></tr><tr><td width="171">Other Demand Generation:</td><td width="75"><p align="right"></p></td><td width="67"><p align="right"></p></td><td width="77"><p align="right"></p></td><td width="84"><p align="right">- </p></td></tr><tr><td width="171">MarketBright</td><td width="75"><p align="right">1</p></td><td width="67"><p align="right">23</p></td><td width="77"><p align="right">33</p></td><td width="84"><p align="right">5,400</p></td></tr><tr><td width="171">Pardot</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">32</p></td><td width="77"><p align="right">33</p></td><td width="84"><p align="right">3,600</p></td></tr><tr><td width="171">Marqui software*</td><td width="75"><p align="right">5</p></td><td width="67"><p align="right">15</p></td><td width="77"><p 
align="right">19</p></td><td width="84"><p align="right">4,400</p></td></tr><tr><td width="171">ActiveConversion</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">6</p></td><td width="77"><p align="right">12</p></td><td width="84"><p align="right">3,400</p></td></tr><tr><td width="171">Bulldog Solutions</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">30</p></td><td width="77"><p align="right">43</p></td><td width="84"><p align="right">3,200</p></td></tr><tr><td width="171">OfficeAutoPilot</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">5</p></td><td width="77"><p align="right">5</p></td><td width="84"><p align="right">2,000</p></td></tr><tr><td width="171">Lead Genesys</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">1</p></td><td width="77"><p align="right">5</p></td><td width="84"><p align="right">1,450</p></td></tr><tr><td width="171">LoopFuse</td><td width="75"><p align="right">4</p></td><td width="67"><p align="right">48</p></td><td width="77"><p align="right">43</p></td><td width="84"><p align="right">1,090</p></td></tr><tr><td width="171">PredictiveResponse</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">0</p></td><td width="77"><p align="right">0</p></td><td width="84"><p align="right">330</p></td></tr><tr><td width="171">FirstWave Technologies</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">5</p></td><td width="77"><p align="right">11</p></td><td width="84"><p align="right">170</p></td></tr><tr><td width="171">NurtureMyLeads</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">1</p></td><td width="77"><p align="right">5</p></td><td width="84"><p align="right">140</p></td></tr><tr><td width="171">Customer Portfolios</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">0</p></td><td width="77"><p align="right">3</p></td><td 
width="84"><p align="right">90</p></td></tr><tr><td width="171">Conversen*</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">2</p></td><td width="77"><p align="right">0</p></td><td width="84"><p align="right">70</p></td></tr><tr><td width="171">FirstReef</td><td width="75"><p align="right">0</p></td><td width="67"><p align="right">0</p></td><td width="77"><p align="right">0</p></td><td width="84"><p align="right">10</p></td></tr></tbody></table><br /><br />Results for all three services are roughly consistent, although Technorati gets many fewer hits and Ice Rocket finds a few more than Blogpulse. The major anomaly is the low value for Unica, but that happens because I actually searched on Unica Affinium, to avoid all the irrelevant hits on Unica alone. Similarly, I searched on Marqui Software to avoid unrelated hits on Marqui. The high values for Bulldog Solutions and LoopFuse are valid (I scanned the actual hits); these two vendors just managed to snag a relatively high number of blog mentions. Remember we are looking at very small numbers here: it doesn’t take much to get 40 blog mentions. Nor, if we trust the Alexa figures, do they translate into much Web traffic. However, the blog hits might explain the relatively high keyword search counts for those two vendors.<br /><br />Well, I hope you enjoyed the trip. This is far from an exhaustive analysis of the issue, but based on the information available, I’d say that Alexa Web traffic is the most useful measure for assessing the market presence of different demand generation vendors, and blog mentions have at least some value. 
Google hits and keyword searches capture too many unrelated items to be reliable.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com7tag:blogger.com,1999:blog-1380589722433800422.post-6443137993333829602008-11-12T16:31:00.000-08:002008-11-13T06:26:51.541-08:00No Silver Bullets for Social Media MeasurementThe editor of my forthcoming book on marketing measurement asked me to add something on social media, which led to several days of research. Although there are many smart and articulate people writing on the topic, the bottom line is, well, you can’t really measure the bottom line.<br /><br />There are plenty of activity measures such as numbers of page views, comments and subscribers. Sometimes there are specific benefits such as reduced costs if technical questions are answered through a user forum instead of by company staff. Sometimes you can compare behavior of social media participants vs. non-participants, although that raises a self-selection problem – obviously those people are more engaged to begin with.<br /><br />But measuring the impact of social media on attitudes in the population as a whole—that is, on brand value—is even harder than measuring the impact of traditional marketing and advertising methods because the audience size is so small. Measuring the impact of brand value on actual sales is already a problem; what you have with social media could be considered the brand value problem, squared.<br /><br />In fact, the closest analogy is measuring the value of traditional public relations, which is notoriously difficult. Social media is more like a subset of public relations than anything else, although it feels odd to describe it that way because social media is so much larger and more complicated than traditional PR. Maybe we'll need to think of PR as a subset of social media.<br /><br />The best advice I saw boiled down to setting targets for something measurable, and then watching whether you reach them. 
This is pretty much the best practice for measuring public relations and other marketing programs without a direct impact on sales. I guess there’s nothing surprising about this, although I was still a bit disappointed.<br /><br />Still, as I say, there is plenty of interesting material available if you want to learn about concrete measurements and how people use them. Just about every hit on the first two pages of a <a href="http://www.google.com/search?hl=en&ie=UTF-8&q=social+media+marketing+measurement&start=0&sa=N">Google search on “social media marketing measurement”</a> was valuable. In particular, I kept tripping across Jeremiah Owyang, currently an analyst with Forrester Research, who has created many useful lists on his <a href="http://www.web-strategist.com/blog/">Web Strategy by Jeremiah</a> blog. For example, the post <a href="http://www.web-strategist.com/blog/2008/02/26/social-media-faq-3-how-do-i-measure-roi/">Social Media FAQ #3: How Do I Measure ROI?</a> provides a good overview of the subject. You can also search his category of <a href="http://www.web-strategist.com/blog/category/social-media-measurement/">Social Media Measurement</a>. Another post I found helpful was <a href="http://www.socialmediaexplorer.com/2008/10/28/what-is-the-roi-for-social-media/">What Is The ROI For Social Media?</a> from Jason Falls’ <a href="http://www.socialmediaexplorer.com/">Social Media Explorer</a> blog.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0tag:blogger.com,1999:blog-1380589722433800422.post-20723480324218999392008-11-07T12:42:00.000-08:002008-11-07T12:51:19.626-08:00Cognos Papers Propose Sales and Marketing Metrics<p>I’ve always felt that defining a standard set of marketing measures is like prescribing medicine without first examining the patient. But people love those sorts of lists, and they offer a starting point for a more tailored analysis. So I guess they have some value. 
<br /><br />Based on that somewhat crotchety premise, I’ll call your attention to a pair of papers from <a href="http://www.cognos.com/">Cognos</a> on “Delivering the reports, plans, & metrics Sales needs” and “Delivering reports, plans, and metrics for better Marketing” (idiosyncratic capitalization in the original). These are widely available on the Internet; you can find both easily if you <a href="http://businessintelligence.ittoolbox.com/topics/t.asp?t=307&p=307&h1=307">run this search</a> at IT Toolbox.<br /><br />Since the whole point of standard measures is to be broadly applicable, I suppose it’s a compliment to say that the measures in this paper are reasonable if not particularly exciting. One point they do illustrate is the difference between marketing and sales, which are often conflated into a single entity but are in fact quite distinct. Let’s look at the metric categories for each:<br /><br />- Sales: sales results; customer/product profitability; sales tactics; sales pipeline; and sales plan variance.<br /><br />- Marketing: market opportunities; competitive positioning; product life cycle management; pricing; and demand generation.<br /><br />It’s surely a cliché, but these measures suggest that marketing is strategic while sales is almost exclusively tactical. That’s a bit blunt but it sounds about right to me.<br /><br />Given my admittedly parochial focus on demand generation these days (see <a href="http://www.raabguide.com/">www.raabguide.com</a>), I couldn’t avoid noticing that Cognos gave demand generation just one of its five marketing slots. That seems a bit underweighted, given that it probably accounts for the bulk of most marketing budgets. But I do have to agree that strategically, marketing should be spending its time on those other topics too.<br /><br />The papers list specific measures within each category. 
It’s going to be as boring to type these as you’ll find it to read them, but I guess it’s worth the trouble to have them readily available for future reference. So here goes:<br /><br />Sales metrics:<br /><br />Sales results<br />- new customer sales<br />- sales growth<br />- sales orders<br /><br />Customer/product profitability<br />- average customer profit, lifetime profit and net profit<br />- net sales<br />- gross profit<br />- customer acquisition and retention cost<br />- sales revenue<br />- units sold<br /><br />Sales tactics<br />- average selling price<br />- direct cost (of sales efforts)<br />- discount<br />- sales calls and sales rep days<br />- sales orders<br />- units quoted<br /><br />Sales pipeline<br />- pipeline ratio (they don’t define this; I’m not sure what they mean. Maybe distribution by sales stage)<br />- pipeline revenue<br />- sales orders and conversions<br />- cancelled order count<br />- active and inactive customers<br />- inquiries<br />- new customers and lost business<br /><br />Sales plan variance<br />- sales order variance<br />- sales plan variance<br />- sales growth rate variance<br />- units ordered and sold variance<br /><br />You’ll notice a bit of overlap across groups, and I’m not sure why “Sales plan variance” is a separate area: I would expect to measure variances against plan for everything. The list is also missing a few common measures such as profit margin (which shows the net impact of decisions regarding product mix, pricing and discounts), actual vs. potential sales (hard to measure but critical), lead-to-customer conversion rates, and win ratios in competitive deals. 
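<br /><br />To make the arithmetic behind those missing measures concrete, here is a minimal sketch in Python. The numbers and field names are toy examples of my own, not anything taken from the Cognos papers:

```python
# Hypothetical quarterly figures; every value here is invented for illustration.
quarter = {
    "net_sales": 2_500_000,
    "gross_profit": 950_000,
    "qualified_leads": 1200,
    "new_customers": 84,
    "competitive_deals": 40,
    "wins": 14,
}

def profit_margin(gross_profit, net_sales):
    """Net impact of product mix, pricing and discount decisions."""
    return gross_profit / net_sales

def conversion_rate(new_customers, qualified_leads):
    """Lead-to-customer conversion rate."""
    return new_customers / qualified_leads

def win_ratio(wins, competitive_deals):
    """Share of competitive deals won."""
    return wins / competitive_deals

print(f"profit margin:   {profit_margin(quarter['gross_profit'], quarter['net_sales']):.1%}")    # 38.0%
print(f"conversion rate: {conversion_rate(quarter['new_customers'], quarter['qualified_leads']):.1%}")  # 7.0%
print(f"win ratio:       {win_ratio(quarter['wins'], quarter['competitive_deals']):.1%}")        # 35.0%
```

None of these ratios is hard to compute; the point is simply that they are relative measures, which is exactly what the raw counts in the lists above lack.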
<br /><br />Marketing metrics:<br /><br />Market opportunities<br />- company share<br />- market growth<br />- market revenue<br />- profit<br />- sales<br /><br />Competitive positioning<br />- competitor growth<br />- competitor price change<br />- competitor share<br />- competitor sales<br />- market growth<br />- market revenue and profit<br />- sales<br /><br />Product life cycle management<br />- new products developed<br />- new product growth, share, & profit<br />- new competitor product sales & growth<br />- market growth<br />- brand equity score<br />- new product share of revenue<br /><br />Pricing<br />- price change<br />- sales<br />- price segment share and growth<br />- discount ($)<br />- discount spread (%)<br />- list price, net price, & average price<br />- price elasticity factor<br />- price segment sales and value<br /><br />Demand generation<br />- marketing campaigns (#)<br />- marketing spend<br />- marketing spend per lead<br />- qualified leads (#)<br />- promotions ROI<br />- baseline and incremental sales<br /><br />If these weren’t two separate papers, I’d say the author had gotten tired by the time she wrote this one. We see even more redundancy (sales appears in three of the five lists) and “brand equity score” sticks out like a moose at a Sarah Palin rally. (Now there’s a joke that will age quickly.) It’s interesting that the competitive measures provide some of the relative performance information that was lacking in the sales metrics, and that reporting on profit addresses to some degree my earlier question about margins. Is the author implicitly suggesting that sales shouldn’t be held accountable for such things? I disagree. On the other hand, measures of customer value or quality are all assigned to sales. 
I think marketing is primarily responsible for that one.<br /><br />Well, that’s interesting: I hadn’t really planned to criticize these measures when I sat down to write this, but now that I look more closely, I do have some objections. It honestly doesn’t seem fair to be harsh, since any list can be criticized. Maybe I’m just crotchety after all. In any event, you can add this list to your personal inventory of metrics to consider for your own business. Maybe something in it will prove useful.<br /> </p>David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0tag:blogger.com,1999:blog-1380589722433800422.post-53452081948308276782008-10-29T11:52:00.000-07:002008-10-29T12:06:53.266-07:00Is Measuring Brand Value Worth the Effort?Hello, blogosphere! Did you miss me?<br /><br />Probably not, but, whatever. Launch of the new <em>Raab Guide to Demand Generation Systems</em> <a href="http://www.raabguide.com/">http://www.raabguide.com/</a> is largely complete, so I can now find some time for this blog. Also, the publisher of my <em>MPM Toolkit</em> book seems to have settled on a January publication date, so I need to pay more attention to this side of the industry.<br /><br />I’ll ease back into this blog with a survey on brand value from the <a href="http://www.ana.net/">Association of National Advertisers</a> (ANA) and <a href="http://www.interbrand.com/">Interbrand</a> consultancy. The press release is available <a href="http://www.ana.net/news/content/1447">here</a> .<br /><br />Key findings of the survey were that 55% of senior marketers “lack a quantitative understanding of brand value” and that 64% said “brands do not influence decisions made at their organizations.” Bear in mind that these are ANA members, who tend to be large media consumers. 
If they can’t measure or use brand value, nobody can.<br /><br />Taken together, these two figures mean that brands don’t influence decisions even at some companies that are able to measure their value. The survey explored this a bit, and found that at companies where brands lack influence, the most common reason (cited by 51%) was that “incentives do not support importance of brand”. In other words, if I interpret that correctly, people are not rewarded for increasing brand value—so they don’t work to do that, even if they do have the ability to measure it. The next most common reason, at 49%, was the more expected “inability to prove brand’s financial benefit”. Other answers ranked at 40% or below.<br /><br />This wouldn’t matter to anyone who's not a brand valuation consultant, except for one thing: 80% of the respondents report that “demands from the C-suite and boardroom were steadily increasing” to demonstrate that branding initiatives add profit. That means even existing branding budgets are at risk.<br /><br />If you accept, as I do, that branding programs do add value, then not being able to justify them is a serious problem. But there’s a difference between knowing something has value and knowing what that value is. As I’ve pointed out <a href="http://mpmtoolkit.blogspot.com/2008/05/one-more-comment-on-ana-s-integrated.html">previously</a> and the good people at <a href="http://www.marketingnpv.com/">MarketingNPV</a> recently <a href="http://www.marketingnpv.com/articles/features/My_Brand_Bigger_Than_Your_Brand_How_NOT_to_Get_Caught_in_the_Brand_Valuation_Trap">wrote</a> at more length, different brand valuation methodologies give widely varying results, and even the same methodology gives different results from year to year. 
<br /><br />This has important practical implications: specifically, <em><strong>brand measurements are not precise enough to guide tactical decisions.</strong></em> Yet that is exactly what the ANA survey says marketers want: 93% felt a quantified understanding would allow “more focused investment in marketing” and 82% felt it would provide “an opportunity to cut out underperforming initiatives”. Frankly, I’d say those are unrealistic expectations.<br /><br />The MarketingNPV paper argues that marketers should not attempt to measure brand value by itself, and instead focus on “quantifying the impact of marketing on cash flows”. That may seem like begging the question: after all, the value of a brand is precisely its impact on future cash flows. But I think of brand value as a residual factor that accounts for future cash flows that cannot be attributed to more direct influences such as promotions, distribution and pricing. So it does make sense to say, first let’s do a better job of predicting the impact on cash flows of those directly measurable items. Once we've taken that as far as we can--and I'd say most firms are nowhere near--then we can spend energy on brand value to explain the rest.David Raabhttp://www.blogger.com/profile/03489754392712536104noreply@blogger.com0
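<br /><br />The residual view of brand value described above comes down to one line of arithmetic: subtract the cash flows you can attribute to direct influences from the total forecast, and call what is left the brand's contribution. A minimal sketch, with every figure and category name invented for illustration:

```python
# "Brand value as residual": whatever part of the cash-flow forecast cannot
# be attributed to directly measurable influences is treated as the brand's
# contribution. All numbers below are hypothetical toy figures.

forecast_cash_flow = 10_000_000  # total projected future cash flow

directly_attributable = {
    "promotions":   3_200_000,
    "distribution": 2_600_000,
    "pricing":      1_900_000,
}

brand_residual = forecast_cash_flow - sum(directly_attributable.values())
print(f"residual attributed to brand: {brand_residual:,}")  # 2,300,000
```

The obvious caveat is that a residual like this also absorbs every error in the forecast and in the direct attributions, which is consistent with the observation that brand estimates swing from year to year.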