BOBCM Measurement Round Table


The BCMA ran their first leadership event at Havas on 4th December 2013 with presentations from Tapestry Research’s MD Ian Wright, pointlogic’s MD Tim Foley, Realeyes’s Commercial Director Alex Slater, and Sander Saar, BeOn Product Manager at AOL Networks (now at Maker Studios). The theme was ‘How to: Effectively use market research to make your branded content more effective’, and you can download the presentation here.

I subsequently facilitated a round table discussion with the speakers that will feed into my Future of Branded Content Marketing report in the second edition of the BOBCM ebook. I’m looking at the problems brands are trying to solve with branded content, and also the challenges they are facing. Measurement is a key topic, and the panel had lots to say about it, but to kick things off I asked the panel the following question:

Where is the market at, and where is it heading?

Tim Foley at pointlogic thinks it’s easy for branded content to fall into the trap of being seen as the chairman’s special project, i.e. something that someone is interested in, but without any great expectation of massive reach or real impact. He thinks that as long as branded content stays in this box it will remain interesting, but not huge. For it to become huge, Tim explains, there needs to be both some sort of mechanism for a lot of people to see it (i.e. reach) and a way to value the higher expected impact that’s needed to offset what’s likely to be a higher cost.

The real challenge for Tapestry Research’s Ian Wright relates to what branded content actually is, because it seems such a broad church of things to him. He thinks that for some people it’s about sticking videos up on a YouTube channel, where some work and some don’t and it doesn’t really matter that much. For others, he thinks it’s much more of a core part of their overall marketing strategy, one that includes product placement, Advertiser Funded Programming-type TV productions, etc. As such, he thinks it’s very hard to get a clear answer and generalise about what brands are trying to get from it:

I am not sure that is going to change, and in some ways possibly the digital channels have confused it more than helped it because you have such an easy way of distributing branded content now, so maybe it makes it a sideline rather than core part of things. That’s something I’m not clear on.

In terms of measurement, Ian thinks the more digital it becomes, the more it gets measured in simplistic ways, e.g. the number of YouTube views or tweets. These are just used as proxies of success, but say nothing about what the activity is actually moving:

When you look at some of the rigour that’s applied to TV advertising campaigns, and the whole kind of return on investment and econometric modelling, there’s an awful lot of time and money spent on understanding how that’s working, and the pre-testing of TV ads and so forth. It just seems to be a bit of a free-for-all in branded content, which is a challenge.

I asked Ian whether this was just the result of the kind of budgets being allocated to branded content campaigns at this stage, particularly when compared to TV advertising, where there’s usually a significant media spend:

Which is kind of my point, because with digital distribution it doesn’t cost you very much to get the stuff out there, so why invest so much in measuring it? You might as well just put it out there, and if it gets 2 million YouTube views then that’s been pretty good. No one really knows what it’s doing for the brand, but that’s 2 million YouTube views. That’s just a number, and it’s a challenge for a market research company to say you need more than that, you need to understand what it’s doing and how it’s actually working with other parts of your media mix. So is it actually helping your TV ads, is it underpinning other elements of your media spend in ways you don’t understand?

As Ian explains, companies like Tapestry and pointlogic try to unpick a branded content campaign to see what it’s doing in combination with other things a brand is doing, but he sees little of that kind of sophistication right now and doesn’t think many advertisers are evaluating their campaigns in this way.

Sander Saar from AOL Networks comes across similar problems, with commoditized metrics being used rather than actual value being measured. As he explains, when clients and agencies come to AOL Networks they usually have three challenges or questions:

  1. Content Creation: This is about how brands move from ads to assets, and how they can start using new online formats and the opportunities they bring, rather than just sticking a TV ad online. It’s also about how brands can do something that’s relevant and engaging for their audiences.
  2. Distribution: Brands then want to know how to get to the right audiences, which Sander sees very much as a move from push to pull, e.g. reaching your audience using a completely different model where branded entertainment becomes a click-to-play environment, rather than interrupting audiences by sticking ads as pre-rolls in front of video that people want to see.
  3. Measurement: The third challenge for brands is how they measure all of that. The big change for Sander is the move from views and social engagement metrics to more value- and brand-led metrics, such as purchase intent, brand awareness, brand favourability, etc. That’s because he thinks branded entertainment sits at the top of the funnel, in the awareness phase.

Alex Slater thinks there’s a missing piece of the performance measurement jigsaw right now: measuring the strength of the actual content. This is where he says Realeyes add value, with their platform for measuring how people feel as they view video content; if you look at earned media, people don’t share content that isn’t emotional. His point is that if you are looking at performance, then you need to understand how emotional that content is, and to rank and assess that.

For Realeyes, Alex says it’s all about making the link between how strong a piece of content is creatively and how well it’s going to perform, not just in the social media sphere but also in terms of moving the needle on sales, preference, purchase intent and these types of areas:

We’ve found it very hard to get sales data, as it’s very hard to isolate the effects of a piece of branded content on sales, so what we’ve set about doing is collaborating very closely with brands and their agencies to get hold of that data in a bespoke tailored fashion, and I think we’re six to nine months away from that holy grail connection.

Sander also mentions what he thinks is a related issue: brands often looking at all the different elements of a campaign in isolation:

So they just go to a creative agency, and to understand the performance of the creative they test it with research agencies. Then they go to a media owner and say, hey, we want to get this out there, so give me some views and measure what you do with my content. Then they probably have another agency trying to measure the complex matrix of all the different elements they’ve invested in, and to understand the performance of all these different channels. So in the end it becomes very difficult to put together the whole picture of what’s going on: how does the creative affect your brand lift performance, paired up with your media strategy?

He explains that AOL’s vision is to unify all this and have an end-to-end view of creative performance, from understanding the emotional profile of a piece of video to matching it against the audience that finds it most engaging by comparing it to other content. By measuring the ROI of different channels, he explains, they’ll be able to understand the performance of the creative and apply this to their media syndication strategy – helping brands get the highest possible return because they’ll understand the overall value chain of the funnel.

A predictive and programmatic future?
I asked whether Sander was describing a predictive distribution and programmatic platform. He said that’s the ultimate holy grail, and Alex added that although it’s the way things are heading it’s still a good year away yet.

Sander agreed that programmatic solutions are AOL’s focus, with the future being about automated ways of buying and selling media and the use of very smart computer algorithms to do the job for you, based on relevant KPIs. However, right now he says they have a lot of different solutions (e.g. display advertising, video advertising, branded entertainment, etc), some more automated than others. For example, editorial placement isn’t fully automated yet and still involves human decision making.

However, Sander thinks the predictive and programmatic tools they’re developing will give brands hugely improved performance through better analysis and optimisation of emotionally engaging content, which in turn improves targeting and distribution at scale, as well as a better understanding of the value to brands along the funnel.

The evolution of measurement: It’s all about emotional performance?
Taking a step back, I pointed out that one of the reasons branded content has come to the fore is the change in media habits, with people skipping ads on their PVRs, etc, and their increased use of digital technologies and social media. And so measuring the way we now consume media with the tools we used to measure traditional media could be a bit like trying to fit a square peg into a round hole. I asked whether we should also be changing the way we think about measuring branded content.

Alex thinks this makes a lot of sense, and makes the following comparison:

We can test all forms of video content, and the largest part of that is TV advertising. But for the last 50 years, research into TV advertising’s return on investment has been done in certain ways, and there’s a lot of value wrapped up in companies doing it in those ways. So we’ve found it very hard to smash through that brick wall with this next-generation approach to understanding performance and return on investment. However, when we move into the realm of branded content, no one has got the answer at the moment. Everyone is looking for the answer in terms of ROI, which means we’re getting a great reception and everyone is listening to what we have to say.

Ian adds that if you look at traditional TV advertising research, and go back to the US model on which recall is based, it’s a very cognitive approach to measuring:

So the research would ask whether you could name the brand that was advertised, and that’s why American advertising in particular had a lot of pack shots at the end and reiteration of the brand name, because that meant it tested better, as you’d get a much higher recall score overall.

But as Ian points out, brand recall is a very rational metric, whereas over the last 5 to 8 years there’s been a greater appreciation that it’s actually the emotional response to a piece of creative that has the bigger impact. That’s why he thinks a softer approach is required, using powerful new techniques that help measure our unconscious emotional response to something. He also recommends researching the link between the content and the brand, because there’s no point creating great pieces of content that everyone loves if no one knows they have anything to do with the brand behind them.

Ian thinks there’s probably always a role for traditional survey-based research, but increasingly brands are going to have to get at that emotional response, because that really is the foundation stone:

If you get that emotional response then everything else will start to follow as long as you’ve got sufficient frequency and reach, and you’ve got sufficient attribution to the brand behind it. But I’d start with measuring that before you start layering on the other bits and pieces.

Creative and Media are two sides of the same coin?

Sander points out that it’s not just about creating a really emotionally strong piece of content, because if you then chop it down to a 15-second pre-roll you can ruin the performance you were expecting based on your measurement of the emotional response to the content:

You still have to take the content and be smart in your media buying decisions, and be in the right environments where you can reach the right people – an environment where they are emotionally engaged with the content they are expecting to get. So you still need to be really cautious, and you still need to know what you are going to do with your media buying.

Alex agrees that emotions are not a silver bullet, but they’re an important part of the equation, as are the channels you are pushing your content through:

Some of that emotional response can colour the media strategy, they are not opposing, they work better together, so it’s that old word synergy.

Ian thinks you should still start by measuring the emotional content, because if you have a piece of content that doesn’t resonate emotionally with people then you’ll always be fighting an uphill battle.

What role does Dr Heath’s Cognitive and Emotional Power Test play now?
I asked whether the new models of measuring emotional performance still used Dr Heath’s Cognitive and Emotional Power (CEP) Test that the BCMA’s Andrew Canter discussed in the first edition of the BOBCM book (see here).

Ian explained that when they launched the CEP Test back in 2006, the likes of biometrics and brain scans would cost you a six-figure sum, which was beyond the reach of most people. The CEP Test, however, would cost only a fraction of that and was a good proxy for measuring emotional response. But he thinks new tools like Realeyes provide real-time measurement of emotional response for not much more money, and are arguably more precise than any survey-based metric:

It’s not to say that the CEP Test is wrong; it’s possibly just a slightly cheaper, slightly inferior technique compared to something like Realeyes, where you are tuning in second by second to that content.

But is measuring the emotional performance of creative a double-edged sword for agencies?

One of the challenges Alex mentions in response to Ian’s comments is that the Realeyes tool can end up showing that a piece of creative is not very strong emotionally:

If the emotional score is 2s and 3s, there is an argument that says hold onto your purse and don’t put any money behind it, because you are pissing against the wind. But it takes a very brave agency to turn round to their client and say don’t spend any money with us. So there is a bit of friction there.

How is emotional data about content mapped to distribution channels?
I wanted to try and join the dots, because I can understand how measuring the emotional performance of a piece of content can in theory help you improve the planning of your media strategy, as well as show whether you’re likely to improve your earned media performance. But I’m less clear about how the emotional data is mapped onto the distribution channels.

Sander mentioned that they’ve been working with Realeyes for more than a year to map the emotional profile of video content to its performance data (i.e. shares, likes, comments, click-throughs, and whether people watched through the content). This forms part of their research into how creative and media work together, the aim being easy, simple-to-understand reporting that helps brands see in five minutes how their content is performing across a range of relevant metrics. For example, if you have 5 videos and want to run them in 8 markets, the reporting structure lets you see which videos work in each market and which don’t resonate. You can then allocate your budget accordingly, based on which work best and which don’t.
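As a rough illustration of the kind of allocation decision Sander describes, the sketch below splits a per-market budget across videos in proportion to a per-market engagement score. Everything here is hypothetical and invented for illustration – the video names, the scores, and the simple proportional rule – and is not AOL’s or Realeyes’s actual model:

```python
# Hypothetical engagement score (e.g. a 0-1 composite of shares, likes,
# and watch-through) for each video in each market. Illustrative only.
scores = {
    "market_1": {"video_a": 0.8, "video_b": 0.2, "video_c": 0.5},
    "market_2": {"video_a": 0.1, "video_b": 0.9, "video_c": 0.4},
}

def allocate(budget_per_market, scores):
    """Split each market's budget across videos in proportion to score."""
    plan = {}
    for market, video_scores in scores.items():
        total = sum(video_scores.values())
        plan[market] = {
            video: round(budget_per_market * s / total, 2)
            for video, s in video_scores.items()
        }
    return plan

plan = allocate(10_000, scores)
# Videos that resonate in a market get the larger share of that market's
# budget: video_a dominates in market_1, video_b in market_2.
```

Here a video’s budget share is simply proportional to its score; a real system would also weight by reach, cost, and channel ROI, as the panel discusses.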

For Sander, having data that you can act on is very valuable because you can start delivering greater value back to the client, and that’s how this comes together: taking emotional performance data and applying it to a media planning or syndication strategy. That, he thinks, will help clients get the highest possible return, because that’s how the whole system they’ve been putting together has been designed.

But is the mapping of emotional performance data to distribution strategies based on demographics or psychographics?
What I was really trying to understand was whether the mapping of the emotional performance data to the AOL Network was based on demographic or psychographic data.

Sander says it’s firstly about the demographic data of their audiences across the Network, but it’s also about understanding the target profile of the video and linking the two. Knowing the emotional profile of the video allows them to optimise the distribution of the content to those audiences that find it most engaging, which helps improve earned views through sharing, etc.

As Alex points out, it’s about media efficiency, and as Sander adds, the model will become even stronger and more valuable when they start adding more data points, like behavioural profiles, in future. Nonetheless, Alex says they still have something valuable to brands, despite being at the “foothills of this mountain that they are climbing up”.

But who measures what, when and where?
I wanted to get some idea of what the panelists do that’s different from each other measurement-wise, and what’s the common ground:

Ian started by pointing out that for Alex and Sander it’s very much about optimising in-market: you’ve got a piece of creative, and it’s about making sure it goes to the right audience so you can get the maximum value from that content. Whereas he thinks what he and Tim do is look at campaign optimisation after it’s finished:

So did you have a fantastic piece of creative that you just under-utilised with your media strategy, or did you do a fantastic media strategy with a crap piece of content that was a lead weight around media’s leg? That’s what we want to optimise against: what could you have done differently, and therefore how can you apply this going forward.

That’s why Ian thinks the starting point for Alex and Sander is trying to understand the creative effect, i.e. were the dice in your favour or against you. From there, they start looking at the overall media strategy, and then Tapestry look at other parts of the media campaign as well, or the marketing campaign in terms of what else was working alongside the branded content, before moving on to start dissecting it all. So although what Alex and Sander do might sound slightly different, Ian explains that they’re also trying to address the same question that Tapestry and pointlogic look at: how can the advertiser get more out of their marketing spend?

Ian thinks Tapestry sit very much in the middle, and that’s why they want to utilise Realeyes within their market research surveys: it gives them that creative understanding, so they can make sure the data they produce is going to be actionable in the real world. But they also need pointlogic to apply planning principles, bringing in things like reach and cost to convert Tapestry’s market research data – heavily focused on one particular audience – into the real world with real costs, etc. That’s why Ian sees Tapestry in the middle, with Tim at pointlogic very much at the back end, and Alex and Sander at the front end, starting with the creative.

For Tim it’s about context in a modelling sense: although good creative is obviously better than bad creative, what they look at is whether it was effective or not. Clearly both creative and distribution are important factors in measuring efficacy, but for Tim modelling is really about putting things in context, because you can have a good piece of creative that only talks about product range but doesn’t move another KPI, like value. So it’s also about understanding the effect of different creative on different KPIs.

Alex sees the measurement of emotions through facial coding as a really new dimension of data, and an area very few companies globally are working in – making what they are doing very different from the other panelists:

What we are interested in at the moment is empowering brands, agencies, and media companies generally, helping them enhance their existing offerings, which tend to be rationally based, with some of this gold dust. We think it’s a very good way of getting to understand the dark matter of content through aggregated emotional profiles.

At AOL, Sander says it’s all about simplifying advertising among premium audiences at scale, offering clients an end-to-end solution: understanding the creative by partnering with the best technology providers in this area, applying that data to media syndication, which is their core strength, and lastly measuring along the way by partnering with different research providers.

But what opportunities and challenges lie ahead?
Lastly, I asked what the panel thought would be different about what they’ll be doing in 5 years, and what they thought would stay the same:

Ian thinks we can end up looking at a smaller picture to a degree, because the marketplace, in terms of the media landscape, is more complex now:

To understand just what all these potential channels are doing for an advertiser, we’re increasingly having to add them all up together to say what’s the net effect and how to maximize them. The problem for brands with lots of different properties is the lack of any holistic understanding of how they work together. So we are going to have this dual challenge of really understanding, at a micro level, how individual channels or touch points work, but also, at a more combined holistic level, how they all fit together.

For Ian that’s a real challenge from a market research point of view, and that’s where he thinks we are heading.

Tim thinks that market research will get turned on its head by the explosion of more data. He predicts we won’t be aggregating audiences around their demographics, but will instead be “valuing individuals based on purchase probabilities” as a result of media being able to be delivered on an IP or individual basis; and as he points out, “this changes everything in terms of how media works and who should be on the team to deliver and evaluate it.”

Alex predicts that Emotional Data will become ubiquitous:

We’ve got a stream at the moment that will become a waterfall. Right now, when we are testing people for their emotional reaction, it is very controlled. The tests are relatively small, even with 300 completes. But as the pace steps up through advances in technology, and as we build up our platform, we’ll be able to test people as they go about their online day in a live sense, rather than in a controlled or test sense.

And he thinks this is going to go big, because it will provide a lot more emotional data across many different devices, helping brands and their partners better understand the visceral response to their messages and the other information they put out.

Sander thinks formats are going to change and become more native. He predicts that technology will drive real-time delivery, with measurement also changing as a result of more data points becoming available. What he thinks won’t change is the need for brands to connect with their consumers and users, and that’s where he sees AOL fitting in, with new and better tools to help advertisers reach their audiences at scale in premium environments.

You can read more about Dr Heath’s Cognitive and Emotional Power (CEP) Test in an article about measuring the success of branded content by the BCMA’s Andrew Canter, originally published in the first edition of the BOBCM book.