Management and measurement

’Tis KPI season — “key performance indicators” — and I need to pick the three or four measures that will drive a web marketing operation.

Any suggestions? If you could track everything with utter precision, what would you track? How would you measure brand strength? Awareness?

I have my theories. I need more.

19 thoughts on “Management and measurement”

  1. My short list

    -Overall CSAT (look and feel, navigation, ability to find what people are looking for, etc.)
    -Overall searches in major search engines for terms related to the brand
    -Mentions & links in social networks (Technorati, Digg, del.icio.us, etc.). Of course you need to somehow learn whether such mentions are positive or negative.
    -Treat your site like a blog: look at who’s linking to it and why; social is everything.
    -“Traditional” measurements: visits, revenue, downloads (in the case of drivers, for example), campaign effectiveness.

    The short explanation:
    Nowadays “social” rules. Besides being in vogue (this might fade with time), I think social networks and their evolutions will be a major player on the web from now on. A web marketer who can take good advantage of them has a low-cost way of advertising and an effective way to raise brand awareness. Of course one must be extra careful: the “masses” can smell the fake… and that can seriously backfire.

    Social networks, forums, discussion groups and the like hold niche groups of people for everything. One must make sure some mentions are made in the right niches.

    That being said, the traditional channels of marketing need not be abandoned, since a brand needs presence in the form of banners or ads in certain key sites.

    I think the overall concept is fairly simple. Be mentioned (linked, quoted) more, in the right places and for the right reasons. There are several means to this end, some traditional, some new… it’s just a matter of learning which work best for each particular case.

  2. Monthly duration of engagement, the actual time and substance of interaction with customers. It can be measured passively (time the customer is indirectly engaged with your product or brand) and actively (time the customer is directly engaged with promotion and advertising), expressed as a ratio to revenue generated.
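
    One way to read that as numbers, as a rough sketch in Python; the engagement minutes and revenue figures below are hypothetical, not data from any real analytics system:

```python
# Hypothetical monthly figures -- illustrative only.
passive_minutes = 120_000    # time indirectly engaged with the product or brand
active_minutes = 45_000      # time directly engaged with promotion and advertising
monthly_revenue = 250_000.0  # dollars

# Duration of engagement expressed as a ratio to revenue:
# minutes of engagement per dollar, split passive vs. active.
passive_ratio = passive_minutes / monthly_revenue
active_ratio = active_minutes / monthly_revenue

print(f"passive engagement: {passive_ratio:.2f} min per $ of revenue")
print(f"active engagement:  {active_ratio:.2f} min per $ of revenue")
```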

  3. Top 3 from my standpoint, with some questions to help you set boundaries.

    1. Conversation index: Is your website a static bunch of pages people view? Or are you having a bidirectional conversation (sorta like comments ON EACH page) so customers can give you feedback on the content you put on each page and share their perspectives? Where are people coming from and what pages are they going to? Simple metrics that you can start to collect and combine to build a macro picture of the conversation with the customer.

    2. Engagement index: How engaged is your audience?

    3. Contribution index: What are they giving you back?

    Take the metrics you already have and “do mashups” to create a higher level of thinking (a rough sketch of one such mashup follows below).

    The old website stuff is that – old.
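
    A minimal sketch of what mashing existing metrics into those three indices could look like; the ratios and numbers below are made-up illustrations, not Mukund’s actual formulas:

```python
# Hypothetical monthly web metrics -- illustrative values only.
metrics = {
    "page_views": 500_000,
    "comments": 1_200,          # on-page feedback left by visitors
    "visits": 150_000,
    "repeat_visits": 60_000,
    "uploads_or_reviews": 300,  # content customers contribute back
}

# Conversation: how often a view turns into two-way dialogue.
conversation_index = metrics["comments"] / metrics["page_views"]

# Engagement: how often visitors come back.
engagement_index = metrics["repeat_visits"] / metrics["visits"]

# Contribution: how much the audience gives back, per thousand visits.
contribution_index = metrics["uploads_or_reviews"] / (metrics["visits"] / 1000)

print(f"conversation index: {conversation_index:.4f}")
print(f"engagement index:   {engagement_index:.2f}")
print(f"contribution index: {contribution_index:.2f} contributions per 1,000 visits")
```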

  4. I like what Esteban had, in conjunction with Mukund’s index idea. For something like awareness, without having actual surveys (which I think we should consider)… you could conjure up an awareness index based on a few factors, like a Technorati search for positive vs. negative comments on 20 important blogs (or something relevant like that).

    We need to find where our target audience hangs out and find out what they say about us or if they know we exist.

    Without more context, one widely applicable, core KPI you should consider is “revenue per thousand”; then segment, dimensionalize, and derive off of that based on your goal.

    You can draw whatever organic shapes and clever derivatives around attention, conversation, engagement, and beyond as long as you center it on something material, like revenue.

    I know that sounds basic, but the revenue signal is harder to detect amidst so much noise.
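
    A back-of-the-envelope sketch of “revenue per thousand” and one way to segment it; the segment names and figures are hypothetical placeholders:

```python
# Hypothetical traffic and revenue by segment -- illustrative only.
segments = {
    "search":   {"uniques": 400_000, "revenue": 180_000.0},
    "direct":   {"uniques": 250_000, "revenue": 220_000.0},
    "campaign": {"uniques": 100_000, "revenue":  60_000.0},
}

def rpm(uniques: int, revenue: float) -> float:
    """Revenue per thousand unique visitors."""
    return revenue / (uniques / 1000)

total_uniques = sum(s["uniques"] for s in segments.values())
total_revenue = sum(s["revenue"] for s in segments.values())

print(f"overall RPM: ${rpm(total_uniques, total_revenue):.2f}")
for name, seg in segments.items():
    print(f"  {name:9s} RPM: ${rpm(seg['uniques'], seg['revenue']):.2f}")
```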

  6. David,

    While not addressing which ones so much, I had some thoughts about the general questions the measures should be chosen to answer: where are you now, and where do you want to be? The metrics chosen should then report progress on your journey from here to there, in a transformational sense.

    I also had some thoughts and questions concerning the efficiency of an organization with regard to how many resources are spent to understand versus how many are available to change. The skinnier that ratio, the nimbler the business will be. Assuming it can still get the right answers.

    http://markitude.wordpress.com/2007/04/05/management-mantra/

  7. I knew you guys would be smart. Waiting for Uncle Fester, the Brainiac MBA, to weigh in — he always pops my trial balloons.

    To summarize the suggestions …

    Juan de Fuca:
    * time on site/segment visits (e.g. “About”, corporate info)

    Esteban:
    * CSAT (aye, but what specifically about customer satisfaction: “rocks or sucks?”)
    * Brand mentions via search: rocks or sucks, and volume
    * Traditional traffic and revenue measures [yup. agreed. remember, I need three gauges, so this exercise is about the three a CMO or CEO needs to operate and drive the business]

    Mitch Ratcliffe:
    * Monthly duration of engagement (time spent on brand interaction — passive or active) expressed as a ratio to revenue. [novel, need to understand better]

    Mukund Mohan:
    * Conversation
    * Engagement
    * Contribution
    [I might argue conversation and contribution are drivers of engagement, but yes, something we need to figure out how to tally and score]

    Jim Hazen:
    * Blend CSAT (Esteban) with “CEC” (Mukund) and external monitoring of target markets in those social areas where they would discuss the brand. [very interesting. The combo of internal measures with external observations is compelling]

    Judah:
    * Ah, yes, the Revenue per Thousand measure. That drove us at IDG. Essentially, for every thousand uniques, what kind of revenue are you generating? Good, but I have a traffic-purification issue when looking at gross inbounds. Some of that is skewed by browser preloads and homepage defaults. We perpetually face a challenge of separating “good” from “specious” traffic. Hazen can elaborate better than I. Media sites can safely assume all inbound traffic is “good” and ratio it against revenue.

    Mark:
    Great post at Markitude. And a great point — overengineering the measurement is like the classic cause of airplane accidents. The captain and copilot obsess about why the dashboard light is blinking and forget to look out the window at the ground rushing up to remind them why altitude matters.

    Okay, my turn soon. Thanks to all for excellent ideas.

  8. After thirty years of corporate life and dealing with measurement requests from executives at many levels, I have learned that there are many metrics and correlations out there to be had, and some have more value than others.

    In order to determine the value of the effort required to measure and analyze, I have always insisted on asking the following before I would commit to the work of fulfilling the measurement request:

    If you had the results in your hand at this very moment – what business decision would you make?

    There can be many attractive measurements that can be obtained, but their value is only realized when they are something that can be acted upon.

  9. Not sure about the brainiac thing, but I’m not sure what the point of your web marketing is. Is it something hard or soft? Hard like: Are you trying to drive sales? Soft like: Build brand awareness, improve customer satisfaction/communication, etc.

    What is the goal of the money spent?

    If it’s revenue, I’d go with the easiest…

    Measure your revenue. Does it change as you throw dollars down the well, er, web?

    If it’s any of the soft goals: you’re screwed. The cost to measure brand impression or customer satisfaction can be more than it’s worth. Surveys, focus groups, and bears, oh my.

    Part of your job is to manage the online fires. I don’t see how one can measure Tony Snow’s performance. You either know he can handle it or he can’t.

    You can make pretty pie charts and graphs and throw letters together to make cute words but in the end:

    revenue either goes up or it doesn’t, and the best web campaign in the world can’t sell if the product isn’t up to date;

    your gut and your pulse on who is talking about what, where traffic is going, etc., is all that will matter.

    Make the charts for management. Allocate the dollars according to your gut.

    So much for that MBA…

  10. We have mocked up something to address measurement and management from a community perspective, just based on this conversation.

    I would appreciate any feedback on it. Please let me know if you want me to send a link by email.

  11. Mukund — please send.
    Fester, I’m bringing you into my next review with the CMO, you can tell him to stuff his metrics.

  12. Promise?

    Instead, we should concoct the “Freedonia” of web metrics. Something so complex and convoluted that it makes Rube Goldberg choke on his cashews. It should have sub-metrics tied to time, the weather, and a “news index” that measures the activity of daily news, on the premise that web users flocking to a large event will be drawn to your marketing. Anna Nicole pushes the metrics off the chart!

    It should comprise no fewer than 14 pie charts, bar charts, and the Nichols plot. It should culminate in a single grade, A-F, for the prior week, which should lead each report.

    If we can incorporate the tidal charts of Nova Scotia, all the better.

    We should back-test it for 10 years, accounting for the change in computing power and the cost of a NYC subway ride (which correlates nicely with a slice of cheese pizza, for those who love these sorts of things). We will initially find an error, as we failed to account for the first Victoria’s Secret fashion show streamed live. It will be known as the Vicky’s error, and there will be two schools of thought on how to correct for it, which will split the Lenovo marketing team in two. All the better for softball season.

    You should be the only one on this planet to understand it while everyone else will pretend to. Your faithful minions will start dropping references to the “Churbuck/Heimlich Regression Slope Impression Scale” all over the web until others start using it.

    Because, in the end, you’re either making revenue or you’re not. Two plot lines: revenue and marketing spend.

    But I like the first way much better.

  13. Funnily enough, I made the same observation on Mark’s post: “follow your gut.” Regrettably, I don’t think managers are too fond of a PowerPoint slide that reads: “as for the next year I’ll measure whether my gut feelings were correct… I’ll get back to you in 12 months.”

    True enough, revenue is simple to follow: these X bucks I spent on advertising yielded those other X bucks in profit. Yet, unless you are eBay or Amazon, the web serves a whole other range of purposes.

    In general you also have to take into account brand awareness and public perception. If you are trying to sell “the best-engineered PCs,” you’d better have “the best-engineered website” to back up your statement. And make sure people notice it.

    Once I was approached by a client to take care of an AdWords campaign. Before I ever logged in to Google I went to their site and then came back to them saying, “you are going to increase traffic, but your public perception is going to go down with this.” See, their site was plain horrible. Of course they thought I wanted to sell them a redesign, so they just ignored me. Off we went with the campaign; visits increased, but goal achievement didn’t go up a single notch. (After that they DID ask me if I wanted to do a redesign.)

    Before you spend a single dollar on advertising, make sure that the place you want to drive people to is great (in “some” sense: great design, great navigation, or great offers…).

    Then a site like Lenovo’s has other areas which need to be looked at closely. Support and Downloads is a sensitive, time- and money-saving piece of the web. If you have a good support section you’ll save a lot of money in call centers. Regrettably this isn’t as easy to measure as revenue. I don’t think there’s a good way to measure how much money you are saving, yet you can see if you are losing money instead. Just have your help desk guys ask “did you visit the support section?” If yes: why couldn’t you find a solution? If no: why didn’t you in the first place? That should give you a pretty good idea (a rough tally of this is sketched below). I have already written in the past that metrics and measurement on the web need to be “completed” with data coming in from non-web channels.

    I could keep writing scenarios for a good while, but I’ll save Mr. Churbuck (and his loyal following) a visit to the eye doctor.
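
    A toy tally of the help-desk question described above; the call records and response buckets are made up for illustration:

```python
from collections import Counter

# Hypothetical call-center log: did the caller try the support section first,
# and if so, why didn't it solve the problem?
calls = [
    {"visited_support": True,  "reason": "couldn't find the right driver page"},
    {"visited_support": True,  "reason": "instructions unclear"},
    {"visited_support": False, "reason": "didn't know it existed"},
    {"visited_support": False, "reason": "prefers the phone"},
    {"visited_support": True,  "reason": "couldn't find the right driver page"},
]

visited = sum(1 for c in calls if c["visited_support"])
print(f"{visited}/{len(calls)} callers tried the support site first")

# Why the site failed (or why it was skipped) -- the input for fixing it.
reasons = Counter(c["reason"] for c in calls)
for reason, count in reasons.most_common():
    print(f"  {count}x {reason}")
```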

  14. While in my heart I agree with Uncle Fester and absolutely love his Freedonia approach, I cannot help but be intrigued by the puzzle of coming up with a credible form of measurement.

    Are we measuring outputs or outcomes? Age-old question. Much easier to measure the former than the latter. Many people – Uncle Fester? – believe only the outcome matters. More sales, more revenue.

    But I do believe in the value of a brand – its impact on business today and the longevity of the brand (i.e. a more predictable revenue stream). I am clearly liking the work of Fred Reichheld on Net Promoter Score, but it takes a while to sell that idea into a company or organization.

    Aside from pure sales measures, web traffic, and online advertising outputs (all important), I am liking the following categories and measures. It is essentially an output measure of a different sort – halfway to the NPS model. It is based upon the belief that WOM, and especially recommendations, are of premium value to a brand:

    Online WOM
    1. Number of online mentions for product or brand: blog posts, message board mentions
    2. Positive, Neutral, Negative
    3. Number and tone of opinion site mentions

    Search
    1. Higher results in pages 1-3 of Google, AOL, Yahoo, MSN and Ask.com
    2. More results (denser) in pages 1-3 of same
    3. How many 3rd Party supporting results
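
    A rough sketch of scoring the online-WOM side of this; the mention records and tone labels are stand-ins for whatever a listening tool or manual review would actually supply:

```python
# Hypothetical month of brand mentions, already labeled by tone
# (in practice the labels come from a listening tool or manual review).
mentions = [
    {"source": "blog",          "tone": "positive"},
    {"source": "message_board", "tone": "negative"},
    {"source": "opinion_site",  "tone": "positive"},
    {"source": "blog",          "tone": "neutral"},
    {"source": "opinion_site",  "tone": "negative"},
]

total = len(mentions)
by_tone = {"positive": 0, "neutral": 0, "negative": 0}
for m in mentions:
    by_tone[m["tone"]] += 1

print(f"total mentions: {total}")
for tone, count in by_tone.items():
    print(f"  {tone:8s}: {count} ({count / total:.0%})")

# A simple net-tone score in the spirit of "positive vs. negative."
net_tone = (by_tone["positive"] - by_tone["negative"]) / total
print(f"net tone: {net_tone:+.2f}")
```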

  15. David
    I will send it from mukund at canvasgroup dot com email, for some reason most of these emails go to the junk folder. 🙂
    Mukund

  16. John Bell,

    While I wish I could pretend to, no, the revenue outcome is but one of the many things that matter. However, in the end, marketing merely supports the rest of the ship. As Esteban pointed out, a crappy site is still a crappy site regardless of the traffic drawn there.

    Marketing iPods is an easier task (in this respect) than marketing CueCats.

    David still has not shared his goals. I assume Lenovo would want to drive revenue. Like all good companies.

    I absolutely love branding campaigns. Especially good ones. And there are a dozen ways to measure buzz or recognition or affection for a brand. I’m no marketer. But whether it is PR or marketing or sponsorship, the end result is, at some point, to increase sales. Even an after-sale campaign to improve customer satisfaction with a product already purchased (e.g. a car) is meant to drive a future sale down the road. If the product sucks, that won’t happen no matter what else is done.

    1. Online mentions are the outliers of the world. Only people who take the time and effort actually write their congressman. Support boards and the like are filled with trolls and fanboys. Neutral? Godwin’s Law rulz.

    2. I like the climb higher on the search rankings. Always better than paying for adwords but they’ll pay anyway because they have to. The Google Mafia extracts their tribute.

    Part of David’s mandate, it seems, is to try and dull the fires that roar. Not spin, but act as a human, engage the unhappy, the unwashed, the dull-brained like me. Can we measure that value? Not sure.

    I guess my Freedonia Metric was meant to highlight that not always what gets presented to bosses and management is what is used by those in the trenches.

    Are we looking for metrics to actually help the decision making process or merely justify the decisions that get made? Watching game film is key for a quarterback but it can’t make me a starting QB. David knows what works and what doesn’t. Maybe he’s a little unsure of apples versus oranges and which is the bigger bang for the buck but in the end it’s a Turing Test and unless it’s a drastic deviation from expectations, it probably won’t change his mind. David’s gut should and will win in the end.

    I’ve rambled too long, and my undergrad stat prof, who begged me to go into the PhD program, would be aghast at the substance of all this. But as with all cost centers, marketing must constantly justify itself. Whenever that happens, accuracy is really no longer the true goal. We know — just know — if Tony Snow does a good job or not. Presidential approval ratings (the revenue) can’t really reflect it. The best measurebaters can still end up like Long Term Capital while Paul Tudor Jones kicks everyone else’s butt.

    But WTF do I know.

  17. Good Lord, I wish the web and you guys had been around in the ’70s when I was doing truck branding and spent an entire summer going from a county fair in Apathy, Idaho, to the State Fair in Tedium, CA. I took my little caravan to 17 different fairs that long, hot, wet summer.
    The highlight of the trip was following the Oscar Mayer Wienermobile from Iowa City to Lincoln, Nebraska.
    This discussion has been more useful than a grad-level Internet marketing class. I have a sneaky feeling that future metrics will be born from this comment string.
    Meanwhile, I gotta prepare my TPS report and find that damn red Swingline stapler.

    Best
    Jim Forbes

  18. Engagement:
    – post views:comments ratio
    - RSS feed subscribes vs. email unsubscribes (those who want to hear what you have to say versus those who don’t)

    Email Marketing:
    - cost per email acquisition vs. profit per email
    Tie emails to specific campaigns to measure acquisition cost; track that email address against online purchases.

    Brand Performance:
    - acquisition:support ratio
    Compare the addresses acquired against the addresses in the customer support DB (phone and online). This is not unlike Esteban’s recommendation.
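
    A quick sketch of the ratios in this list; the field names and figures below are illustrative assumptions, not real data:

```python
# Hypothetical monthly numbers -- illustrative only.
post_views = 80_000
comments = 400
rss_subscribes = 1_500
email_unsubscribes = 300

email_acquisition_cost = 2.50    # dollars spent per address acquired
profit_per_email = 4.10          # dollars of profit attributed per address

addresses_acquired = 10_000
addresses_in_support_db = 1_800  # same addresses later seen in support (phone/online)

print(f"views:comments ratio       {post_views / comments:.0f}:1")
print(f"subscribes:unsubscribes    {rss_subscribes / email_unsubscribes:.1f}:1")
print(f"email profit vs. cost      {profit_per_email / email_acquisition_cost:.2f}x")
print(f"acquisition:support ratio  {addresses_acquired / addresses_in_support_db:.1f}:1")
```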

  19. I would measure:

    Remarkability
    Approachability
    Relevance
    Attentiveness
    Capability of making meaningful customer relationships

    Chris @ rawstylus.wordpress.com
