Mark Fletcher, Director of ShopScience, discusses with Darren his view on the difference between data and Big Data, the way marketers are using it to gain insights into consumer behaviour, and the new role of market research in validating those insights and revealing the underlying motivations.
You can listen to the podcast here:
Follow Managing Marketing on Soundcloud or iTunes
Transcription:
Darren:
Welcome to Managing Marketing and today I’m joined by Mark Fletcher, Director of ShopScience. Welcome, Mark.
Mark:
Thanks Darren, it’s very good to see you again.
Darren:
Well, we have, as you pointed out earlier, known each other for many years.
Mark:
It might even be more than twenty, it’s sad to say.
Darren:
Well let’s not go into numbers because that just makes us feel older but I think a bit of silver in the hair is testament to the fact that we’ve been around for a while, huh?
Mark:
Indeed.
Darren:
And in that time, I think we’ve both seen a lot of changes happening in the advertising and marketing industry.
Mark:
Yeah, it’s certainly a case of either evolve or die to be part of that world now, because virtually every paradigm I experienced starting in the industry over twenty years ago has pretty much broken up. Very few of the old rules remain, except fundamental things like: if people aren’t happy, they’re not going to buy things. Almost everything else has changed.
Darren:
Well I say to people, one of the great things is human beings don’t change.
Mark:
True.
Darren:
They evolve.
Mark:
True.
Darren:
But technology is changing and I think that’s really the big issue in our careers. I think I remember a time when you were given a computer for the first time to actually write copy and do things on rather than giving it to the copy typist.
Mark:
Yep!
Darren:
And now I don’t think there’s anyone that knows how to write with a pen because everyone does it on their tablets or phones or desktops.
Mark:
Yep.
Darren:
So technology clearly has a big impact and we hear things like the industry is being disrupted and there’s transformation happening and things like that, but I think a lot of people don’t really take the time to understand how big this is.
Mark:
Look, it’s hard, it’s impossible to disagree with you. What we find in our work is that they don’t take the time and a lot of times, they don’t even want to know because it is too challenging.
You’ll talk to a senior manager and they’ll say, “Oh, that Facebook, I know it’s getting bigger but I can’t really get my head around it. My daughter does, she’s great on Facebook and she shows me stuff every now and again.” They won’t appreciate that in some industries it’s a principal method of communication, people conversing on Facebook and exchanging ideas.
So to them that particular mechanism is just a kids’ one, and a lot of those people don’t want to face up to the change.
Darren:
Mark it’s interesting you choose Facebook because my eighty-year-old father has discovered Facebook, but unfortunately he thinks that it’s actually like online email.
Mark:
Right.
Darren:
So he puts things on there that I have to respond to and then delete because you know, it’s actually very personal and quite revealing about him but he’s actually not clued into the fact that about 800 million active users around the world can probably see what he’s writing.
Mark:
And enjoy it.
Data versus Big Data
Darren:
Well, I exaggerate because I have set his privacy settings, so at least only all of my friends and his friends would be able to see it. But one of the big areas, and I think probably the most exciting area of technology, is the fact that the data technology can collect is giving us better insight, in real time, into the behaviours of customers and consumers than ever before.
Was it last year or the year before, everyone was going, “Big data, big data, big data” but I wouldn’t mind exploring this with you because I think it is a huge transformational change that’s happening for marketers. So big data and data, what are they and what are the differences from your perspective?
Mark:
Sure. Look, in really simplistic terms, big data is a mess. It’s massive, it’s like grains of sand: individually those grains may be interesting, but they can’t do anything on their own, so you can’t build a sandcastle with them.
Darren:
Or you can but it will get washed away at the high tide.
Mark:
Indeed. Let’s not go that far into the analogy. I mean, one of the ones we use a lot is to say that you take your kids down to a beach and they go, “Wow, look at all the sand”, and you say, “Okay, let’s build a sand castle”. What do we need to take these grains and make them into something useful like a sand castle?
You need two things. One is the raw material, sand, but you also need some tools, a spade or whatever, and some water, because if you just try to build with sand without those tools, you’ve got a heap, you don’t have a castle. On the other dimension, if you like, unless you have a plan, a pretty good idea of what it’s going to end up like at the end, then again you might have a beach, you might have a big flat surface.
You’ve got to have a plan of where you want to go. So in the simple sense, big data is that beach, that huge collection of dots of grains of sand, but to make some sense out of them, then what you’ve got to have is the tools and the vision to do it.
What that castle represents in this very laboured analogy is the data. Data in my view, data is the three or four dot points on the presentation slide that management can actually walk away and remember. Big data is, as we were talking before, the forty or fifty pages of appendices that no one ever reads.
Darren:
Well, it’s interesting that you use that because I actually love the distinction which is, there is data, there is information, there is insight and then there is wisdom.
Mark:
Absolutely, yes.
Darren:
And so data is just a collection of information.
Mark:
Sure.
Darren:
Points, numbers, whatever. Then there is insights or information.
Mark:
Information, then.
Darren:
Information then collects that data into packages that you can analyse and consume, and from information come insights, and this is where the real value comes from the whole process. Because it’s insight that gives you an opening to an opportunity that you hadn’t seen before, and hopefully that other people hadn’t seen before, because that gives you a potential competitive advantage in the marketplace.
Then over time, knowledge and wisdom will grow, which will start to inform you of what to look for in information to get insights in the first place, because you start to learn the ways that patterns emerge.
Information, Knowledge and Wisdom
Mark:
Couldn’t agree more Darren, the only comment I’d make is that in our experience, the most common place where that breaks down is actually at the insight, when you’re going from the insight stage to, what was your next one? Wisdom?
Darren:
From insights to knowledge and wisdom, yeah.
Mark:
Knowledge and wisdom, okay. There are some great people out there who can take big data and turn it into data, right? That’s largely a scientific process. And then there are lots of people who can extract the data into information.
Darren:
And information would be 52% of people do this.
Mark:
Absolutely.
Darren:
Or, the people doing this has increased by 10%.
Mark:
Absolutely.
Darren:
This is all information. Because really it doesn’t give me anything other than information. Insight would be then to say, “Well, this is a trend and the motivation or underlying motivations could be this and therefore if I find ways of encouraging that, I could drive the trend further because it could be a positive, or it is a positive for my business”.
Mark:
I completely agree. Ideally it should go from all those steps to the end where an organisation makes or invents fabulous new products or innovates in its services or whatever, but typically the place it fails is where the insights person can’t actually communicate effectively with the organisation.
Because the organisation is coming from one paradigm where they’ve got costs, where they’ve got infrastructure, where they’ve got business objectives and all those type of things and they’re very rooted in their organisation, in their very being.
Then you’ve got these insight people who quite often run around going, “Oh, you should build something over here, you should do that” but the organisation goes, “I don’t know how to use that information. It’s kind of interesting but it’s unusable”.
So we see that an enormous amount in market research. Market research, which is basically just a poor man’s version of big data, will create segments, right, segmentation. And it will be fantastic. You’ll look at them and you’ll go, “Wow, that’s so insightful”, and particularly with the ones which are psychographically based, you go, “My neighbour is just like that one and my daughter is just like that one.” Then you take it to the organisation and they go, “Oh, interesting, but we can’t use that.
We can’t leverage those segments because they’re psychographically based or something, and we can’t find those people. We might know what to say to them because we’ve got some idea of their hits, but we can’t find them, we can’t measure them if we’re actually working with them and getting more money out of them.” So that to me would be a great example where the segment, where the insight, can’t interface with the organisation.
Darren:
It reminds me, and I can’t remember if it was Coca-Cola, it was one of those beverage brands, and someone had done a huge amount of segmentation and they said, “Seriously, all we need to find is any bastard with a mouth.” That’s the segment, any bastard with a mouth, that’ll be fine.
Mark:
Absolutely, yeah.
Information versus Market Research
Darren:
But you raised market research in there and I think it’s really interesting, because I’ve had a number of conversations with very senior marketers at very large organisations where the data analytics team, which has access to huge amounts of big data and is busy chopping it into data and then information and then insights, turns around and says things like, “Because we’ve got all this data, we don’t need to do market research anymore.” And you did say it’s a poor man’s version of big data.
Mark:
Sure, absolutely.
Darren:
So is this the death of market research? Do you need market research in your opinion?
Mark:
Absolutely you do, but it’s the death of the industry as we knew it. Twenty-plus years ago when I started in the industry, it was at that stage called market research, and really the only way you could understand much about anyone was to go out and ask them, or do a survey of them, or something like that.
So it had to fulfil a whole range of functions. It had to measure behaviour. It had to measure perceptions. It had to measure needs. It had to do all of these different things, and it was a master of none, really. And that’s leaving aside all of the psychological research which says that when people answer a question, they’re answering a question; they’re not actually telling you what they feel and think and do.
Leaving all that aside, you’re trying to use this one tool to fulfil a vast range of functions. So, not surprisingly, it didn’t do any of them very well, and it was inordinately expensive. So we tended to simplify the things we tried to find out, because it would’ve been way too expensive to run a survey that was an hour and a half long, a census of someone. So we cut it down to ten-minute modules or surveys.
Darren:
Sound bites.
Mark:
Sound bites, yes, which aren’t really representative of people. So anyway, that’s a long story. Those were all the failings of the industry, and that was in its heyday. Well, along come all these other things: social media, which you could argue is a form of big data in some ways.
Darren:
It is; it generates a huge amount of data.
Mark:
Data, exactly.
Darren:
Around relationships, opinions, all sorts of things.
Mark:
All that type of stuff.
Darren:
Just ask Facebook.
Mark:
Indeed. You were talking about behavioural data of customers and transactions and all those type of things, so any sort of rational person would say, “Okay, there are strengths and weaknesses in all of those mechanisms.
Let’s try to leverage the strengths, be aware of the weaknesses, and potentially join up as many of them as we can to get a holistic view, an understanding of the situation.” So that doesn’t cut out market research. It just means that it’s got a particular role, and we’d argue, it’s one of our basic ethos, that market research works most effectively as a tool to explore things which you probably already have some hypotheses or ideas about.
It can put sense and meaning around those. But if you just use it the old way, which was, “I’ve got this ocean of consumers out there, we’ll run some focus groups and try to lift out the insights from that”, well, who knows what fell through the net, and who knows if you were fishing in the right spot in the ocean. So the likelihood of catching the right trout, or whatever you’re after, is very, very low with traditional market research.
The science of data and marketing
Darren:
I like that approach because we both have science backgrounds. So for me it talks straight to the scientific method.
Mark:
Absolutely.
Darren:
And in some ways the traditional, classical scientific method was observe the natural world. So observing the natural world is looking at behavioural data.
Mark:
Yep.
Darren:
Looking for trends or behaviours and then making a hypothesis from your observations.
Mark:
Yes.
Darren:
And then setting up an experiment.
Mark:
To test that.
Darren:
To actually test the hypothesis, with a control, to get a positive, negative, or neutral outcome. And so in some ways I explain to people, because there is this real trend away from market research, that market research, properly constructed and executed, is a great way of testing a particular hypothesis without committing a lot of money to actually going to the market to test it.
Mark:
Absolutely. That’s a great way of putting it because that, to our view, is how market research should be used and it doesn’t mean that it always has to be qualitative, quantitative can be effective as well.
It’s just, you know, the right tool for the job. Coming back to big data, though, and we had this conversation before about the traditional way of doing market research: it would take you a month or six weeks to get the answer from your survey or your focus groups or whatever, and it would cost you tens of thousands of dollars.
So even if you did develop some hypotheses early in the process of a market research study, you’ve got a limited amount of time and money in which to sort of explore and develop those.
With big data, we can develop hypotheses so quickly and so cost-effectively, through analysis and segmentation and all those type of things, that we can generate thirty or fifty hypotheses by manipulating the data. And not just data in terms of numbers in a customer database, but data in a broader sense, taking into account census data and trends, taking a very broad view and saying, “Oh, I think it might be this, this, this, and this”, doing a little bit more qualification and then taking some really good things into research.
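The quick hypothesis generation Mark describes, cutting the same data along different dimensions to see what candidate explanations pop out, can be sketched with nothing more than grouped averages. A minimal Python illustration; the records, dimensions, and figures are entirely invented:

```python
from collections import defaultdict

# Hypothetical transaction records: (age_band, region, spend)
rows = [
    ("18-34", "metro", 42.0), ("18-34", "regional", 18.5),
    ("35-54", "metro", 61.0), ("35-54", "regional", 55.0),
    ("55+",   "metro", 30.0), ("55+",   "regional", 29.0),
    ("18-34", "metro", 48.0), ("55+",   "regional", 12.0),
]

def mean_spend_by(key_index):
    """Average spend per level of one dimension -- each cut is a
    candidate hypothesis to qualify before taking into research."""
    totals, counts = defaultdict(float), defaultdict(int)
    for row in rows:
        totals[row[key_index]] += row[2]
        counts[row[key_index]] += 1
    return {k: totals[k] / counts[k] for k in totals}

# Two cheap cuts of the same data; each suggests something to explore.
for name, idx in [("age_band", 0), ("region", 1)]:
    print(name, mean_spend_by(idx))
```

Each `print` line is a candidate hypothesis ("35-54s spend more", "metro outspends regional"), generated in seconds rather than the month and tens of thousands of dollars of a traditional study.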
Darren:
Yeah.
Mark:
So very quickly and cheaply you can develop some good hypotheses.
Darren:
And even in a way, filter or sense check them without actually testing them.
Mark:
Absolutely.
Darren:
From the point of view of research, and you mentioned qual and quant, the fact is that whether you use qual or quant, they should always be used together.
Mark:
Absolutely.
Darren:
Or if you’re looking at psychographic testing, what you’re really looking for is the underlying motivations of people.
Mark:
Yes.
Darren:
Because one of the limitations of most of the data, and we started off this conversation saying technology is driving this, is that most technology can tell you where I am, and possibly what I’m doing, but the one thing it can’t do is actually read my mind. Yet.
Mark:
Indeed.
Darren:
To tell you why I’m doing it.
Mark:
Indeed.
Quality versus Quantity Market Research
Darren:
I think that’s why there is still a role for, and I’ll say it, good market research. Because there is a lot, isn’t there? There is a lot of crappy, cheap research out there that is like band-aiding problems.
Mark:
It takes all these different forms of band-aids but yeah, it is a bit terrifying.
I have this discussion with clients a lot where they show me previous research and some of it’s beautifully done and obviously they’ve had a graphic designer involved and all those type of things and I literally ask the question, “You’ve got this 20-page report and it’s fantastic and looks great, so how many times have you looked past the first two pages of the executive summary, and, if I’d just written those points on a piece of paper, would they have given you as much insight?”
So out of the $30,000 on the market research study, $20,000 would have been wasted in stuff that they didn’t actually value beyond that moment when they handed over the report and went, “Oh, that looks great”, or they handed it to their boss. So to me, I think that’s all crappy market research personally.
Darren:
And that’s actually what’s dragging down the perception of market research because there is so much of this, it’s either poorly constructed or it’s based on a poor proposition to test anyway.
I mean, I think anyone who spends a dollar on market research just to get the answer they want to justify their decision, is wasting their time and money doing that. Because to me, market research has always been about either finding or proving the insights that you’ve identified, the hypothesis, because that’s really where you generate value.
Mark:
That’s right. You’re so right and yeah, that’s a very hard paradigm for a lot of people in market research to accept. It hasn’t been that way traditionally in the industry. So I often find that the people who are best and most useful in these type of situations are ones who have come from a different background. A really good science background.
We worked with a guy who was, what’s the term, a theologian, someone who’d studied religions. He was fantastic, because he was great at wading through masses of disparate information and different ways of expressing things, and trying to come to the essence of things. And that person was remarkably good in this other context because his skill set was so powerful.
So coming back to a basic point: there is so much crappy market research, and that’s partly because most people are coming at it from the same paradigm, if you like, as what they’ve done previously. We’ve always done surveys, we’ve tracked the same thing over time. Who cares if it’s not relevant anymore, it’s the measure we use in here!
Darren:
Or the organisation that says, “We’re not going to back anything until you’ve done the market research”, so it reduces it to a rubber stamp.
Mark:
Yes.
Positive and Negative Outcomes are Valuable
Darren:
In a way it becomes a gate, when in actual fact any sort of research is an iterative process. I always say to people, for six years I was working in medical research and it was publish or perish.
Mark:
Right.
Darren:
But you would put a grant application in to get your funding and even if you got to the end of the process, if the experiments were properly constructed, a negative outcome was still an insightful outcome.
Mark:
Absolutely.
Darren:
Whereas I think too often in business, negative…..
Mark:
Means you did something wrong.
Darren:
It means you’ve failed. Negative is fail. No, negative just says the proposition has been disproved. You know, it’s a bit like, I love watching that show, “MythBusters”.
Mark:
Yes!
Darren:
They say, “It’s disproved”; well, that’s just as interesting as the one that’s been proven, and we should take that same approach.
Mark:
No. Thank you, I’ll use that medical science example because it’s a really good one. Look, I think there have been two factors that have led organisations not to take that rational approach.
One of those is that it would seem to take more time to go through and disprove a lot of things. Can’t we just focus on those things that we think are going to be right? It’s easier.
There’s this perception that having some things disproved and some proved is just a long way to get to the same point. That’s number one, and there’s the ancillary cost aspect there. But I think if you’re a middle manager, it’s very hard to go to your boss and say, “This great idea I had, it doesn’t actually fly.” It’s not seen as a promotion vehicle.
So there’s a whole ethos around ideas that don’t fly. What’s that saying? You never take bad news to your boss. It’s that type of thing.
Now, it’s interesting, I just heard from a fellow from Atlassian this morning. From day one, thirteen years ago, they’ve really tried to create a different type of culture, where they do things like encourage their staff to work from home because that’s often more productive. But they have also worked very hard at a culture of open communication, not just in a corporate-speak way but in a way that’s effective, where they value the things which don’t work, and they record those things that don’t work.
So those become learnings along the way, instead of just failures that were put in the bottom drawer and never talked about. That’s one company that I think has a terrific approach to leveraging those failures, leveraging those things which didn’t fly, and probably, over time, they will be more cost-effective than the traditional organisation which says, “We’re only going ahead if it meets those narrow criteria that we set, and if it doesn’t meet those, don’t tell me about it.”
Darren:
Yeah, well, I think it was Edison with the light bulb who said, “I tried 5,682 different filaments to find the right one.” Often the biggest developments, the biggest leaps forward, can take a lot of negative outcomes, “failures”, to actually move forward.
I think there is a level of cynicism in the marketplace about the promise of data and big data. Has it become a tool for big tech companies to sell technology to handle it? And what are the ways marketers could use it, or even not use it?
The promise of Big Data and the reality
Mark:
Sure. On that point, I keep hearing that example of Target, the US Target.
Darren:
Oh yes, who found out a girl was pregnant before her parents knew.
Mark:
Yes. That’s become the catch cry of big data. And they don’t mention the two hundred million times when they sent the wrong product to someone, because they used these simplistic algorithms which said that just because Darren buys gin, he must like tonic.
Well actually, you’re buying it for your wife, you’re not buying it for you. So yes, there absolutely is this myth building around the use of big data to solve all sorts of problems. So it’s like any new invention that’s going to fix everything. So there’s inevitably that.
I think it’s still so early in its development that we really haven’t developed the mechanisms for integrating it into what a business should do. In simple terms, as of now, I’ll always advise people, “Don’t let your data scientists, or whoever, loose on the data and just say ‘see what you can find’”, right?
The data science isn’t sophisticated enough yet to do that. They just don’t know enough. They’re finding correlations everywhere and they don’t know if they’re causal or not; they’re just correlations. So at this point in time, until the data science industry becomes more expert and more sophisticated, the best thing a marketer can do is go to the data team and say, “Look, we’re hearing from our sales reps that this thing is happening; can you conduct an A/B test on the data and actually work out whether that’s happening or not?”
So you direct the data to test things, like we were talking about before. But I think there’s a great tendency at the moment for marketers to go, “Oh, I won’t have to think anymore if I just get the data scientist to come up with stuff”, and it’s a bit like the example before about the segmentations which are useless.
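The directed approach Mark describes, taking a specific hunch to the data rather than trawling for correlations, is essentially a classical hypothesis test. A minimal sketch of an A/B comparison in Python, using a standard two-proportion z-test; the conversion counts are invented for illustration:

```python
import math

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test: do groups A and B convert at different rates?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)   # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the normal CDF, expressed with erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical sales-rep hunch: customers shown offer B buy more often.
z, p = two_proportion_z_test(success_a=120, n_a=2400, success_b=165, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) says the difference is unlikely to be chance, which is exactly the "work out whether that's happening or not" step, rather than letting an unguided trawl declare every correlation meaningful.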
Darren:
I think, with those Freakonomics books, it is so easy to get cause and effect mixed up if you’re just looking at data trending. You know, if you start seeing two points that seem to be trending the same way, then making some quantum leap to say there’s cause and effect here can lead you down the very wrong path. But when should marketers probably consider not using data? Are there such times?
Mark:
Absolutely. Here are a couple of dangerous times. It’s been well proven in a lot of academic research that first adopters actually have different behaviours to the early majority. Right?
First adopters are a weird breed, and what they love to do is find the latest thing, then get bored with it and go and find the next latest thing. So it’s really, really easy for a marketer to launch a product and say, “Oh, we’ve got instant take-up, and look, the data is showing this trend”, and interpret that as meaning it’s going to be a success. You’re reacting.
Darren:
Mark, I’ve got my head in my hands because I recently opened a box and it’s got the Palm V and the Vx, with all the bits there, because, you know, I think for a while I was the early adopter, and this box has got a lot of other technology in there.
I’m thinking, “No one will want to buy it so I’ll open a little museum to technology” and I think it’s the only value that it has left. But yes, early adopters are a unique breed.
Mark:
They are.
Darren:
And in fact, a lot of the evidence is showing that you’ve got to not look for the early adopters but the first wave.
Mark:
Yes.
Darren:
The edge of the first wave is where you move to a mass market, because there have been things that have been successful with the early adopters that never moved on.
Mark:
Yes.
Darren:
Sony Betamax.
Mark:
Yes, thank you!
Darren:
Because it was the best quality video recorder.
Mark:
At the time.
Darren:
But VHS, which was poorer technical quality, was the one that took off, because it was more widely available in all of the video libraries.
Mark:
Yes.
Darren:
Oh my God, video libraries, remember that?
Mark:
No.
Darren:
That’s how old you and I are.
Mark:
Great place to meet girls.
Darren:
Well, Quentin Tarantino got his start there, he just sat there watching movies.
Mark:
Oh right!
Darren:
Until he started coming out with ideas for his own scripts.
Mark:
Cool. Sorry, just to finish the thought: that’s one great danger of big data, using the data to instantly make a judgement about something.
Another time that I think a marketer shouldn’t use big data is when they don’t interrogate it properly.
I’ll give you a quick example. A company sent me their data; they’d been doing some NPS tracking, net promoter scores. They ask these distributors of theirs to rate things, and when you do a net promoter score, you ask for a score from 0 to 10 and then, “Why did you say that?” So they had over 2,000 pieces of data, each with a rating and a comment, and they said, “Oh, we can’t interpret the comments, there’s too many of them.”
I said, “That’s fine, why don’t we just, for nothing, run it through word cloud software; it creates this big mosaic of words.” They all looked at it and said, “Oh, that doesn’t really tell us much, that’s just a collection.” And I said, “Why don’t we separate those who said 9 and 10 versus those who said 1 to 6 and run the software again”, and suddenly, out of the mass, when we looked at the 9s and 10s, we got three or four names of service reps.
Darren:
Right.
Mark:
Right. But they were really clear.
Darren:
That were popping out.
Mark:
That popped out because you’d taken it from this vast sea of data and cut it into some rivers that made sense. So I think.
Darren:
You started segmenting it looking at those individual pieces of data.
Mark:
That’s right.
Darren:
As pools, rather than as the whole. I mean, from a mathematics point of view, the problem with very large numbers is that the more sample points you get, the more everything converges on the mean, and more and more of it falls within the mean because of the normal distribution.
Mark:
Absolutely.
Darren:
It just gets higher and higher and higher. So suddenly you’re overcome: everything is average and there are no insights.
Mark:
No.
Darren:
In the average. It’s the outliers that actually create the insights.
Mark:
Exactly, yeah. So in simple terms, don’t look too early to spot trends just because you’ve got the data to do it. And secondly, never look at the whole of the big data; it’s too big and too averaged. Always try to break it up, and try different ways of breaking it up, preferably ones that you’ve got an opportunity to respond to. States or whatever.
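Mark’s earlier NPS example, splitting comments by score band before counting words, is a concrete case of this “break it up” rule. A minimal Python sketch; the responses, names, and stop-word list are invented:

```python
import re
from collections import Counter

# Hypothetical NPS responses: (score 0-10, free-text comment)
responses = [
    (10, "Great service, Dave was fantastic"),
    (9,  "Dave sorted our order quickly"),
    (2,  "Delivery was late again"),
    (3,  "Late delivery and no callback"),
    (10, "Fantastic support from Dave"),
    (5,  "Order arrived late"),
]

STOPWORDS = {"was", "and", "our", "the", "from", "no", "a", "an"}

def top_words(scores):
    """Word frequencies for comments whose score falls in `scores`."""
    words = Counter()
    for score, comment in responses:
        if score in scores:
            words.update(w for w in re.findall(r"[a-z]+", comment.lower())
                         if w not in STOPWORDS)
    return words

promoters = top_words({9, 10})            # the 9s and 10s
detractors = top_words(set(range(0, 7)))  # the low scorers
print(promoters.most_common(3))
print(detractors.most_common(3))
```

Counted as one undifferentiated pool, nothing stands out; split by score band, a service rep’s name dominates the promoters and “late” dominates the detractors, which is the “rivers” effect Mark described.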
Darren:
So if I get what you said earlier, the best way forward if you’re using data is to make an observation and have a hypothesis, and then get a data scientist or a data analyst to actually look at the data to see if it supports the hypothesis or not.
Mark:
Yes, that’s right.
Darren:
To see whether it appears to be a legitimate observation. Then to look at finding ways of testing that hypothesis.
Mark:
Yes.
Darren:
Beyond the data alone.
Mark:
Yes.
Darren:
To be able to get a secondary validation for it, yes?
Mark:
Validate. Yes, precisely.
Darren:
Okay.
Mark:
If we could only educate marketers.
The next step in data and market research
Darren:
What’s beyond this then? What’s going to be the next step beyond this process? Or do you think just getting everyone to this process is going to be enough?
Mark:
I think that’s going to take the retirement of a whole generation of baby boomers, really, to get there.
Darren:
Possibly.
Mark:
Arguably artificial intelligence will be the next step. Learning machines.
Darren:
Of course, yeah.
Mark:
They will potentially.
Darren:
Which is wisdom.
Mark:
It is.
Darren:
It’s the ability to run a process over and over again and through each iteration gather knowledge and wisdom of what to do to find the insights.
Mark:
Yes, indeed. Now, the thing that worries me about that, again from our science background: we know that the fundamental way evolution happens is that any given species is always varying itself. Its genetics bring up tall people and short people, and people with brown skin and white skin.
Darren:
And then mutations, spontaneous mutations.
Mark:
Absolutely, and some of those are more suited to the environment than others, so they succeed, right? So evolution is always an ongoing process. When we look at it in a series of pictures, it just looks like a constant thing, but what’s actually happening is it’s constantly varying back and forth, and eventually it gets to the end. Now, what worries me with artificial intelligence is that I think it will be constantly moving in.
Darren:
Normalising.
Mark:
Thank you. It’ll be normalising. It would take an extraordinarily brave machine to spontaneously break out the way that each baby that’s born is different from another.
Darren:
But ultimately the measure of artificial intelligence is its ability to replicate human behaviour, because the test is: I feel like I’m actually conversing or working with a human being. And the thing about human beings is our ability to make quantum leaps and find new patterns. So if it’s purely artificial intelligence fitting things to a particular pattern, then you’re absolutely right. But let’s hope that, as they move forward, they find ways of looking for that almost spontaneous generation.
Mark:
Yep. I hope so too.
Darren:
To look for the mutation and to encourage the mutation because in actual fact, that’s creativity.
Mark:
Yes.
Darren:
Creativity is constantly looking for ways that will break the pattern.
Mark:
Yes, or else it wouldn’t be creative by definition.
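The variation-and-selection loop Mark describes can be sketched as a toy evolutionary algorithm. Everything here is illustrative: the bit-string genome, the fitness function, and the mutation rate are assumptions for the sake of the example, not anything named in the conversation. Selection alone would "normalise" the population; mutation is what keeps injecting the new variation that Darren likens to creativity.

```python
import random

def evolve(fitness, length=10, pop_size=20, mutation_rate=0.1, generations=100):
    """Toy evolutionary loop: selection narrows the population
    (Mark's 'normalising'), while random mutation keeps injecting
    the variation that lets it escape a settled pattern."""
    random.seed(42)  # fixed seed so the sketch is reproducible
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]   # selection: keep the fittest half
        children = []
        for parent in survivors:
            # mutation: each gene flips with small probability
            child = [(1 - g) if random.random() < mutation_rate else g
                     for g in parent]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Illustrative fitness: count of 1s, so "all ones" is the optimum.
best = evolve(fitness=sum)
```

Without the mutation step, the survivors would simply be copied each generation and the population would converge on its initial best guess, which is the normalising trap Mark is worried about.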
Darren:
Yeah, that’s right. Now speaking of creativity, because my bugbear in market research for my fifteen years as a copywriter was concept testing.
Mark:
Yes.
Darren:
And particularly Millward Brown Link Testing.
Mark:
Okay, let’s just name names here.
Darren:
No, no. They sell it as Millward Brown Link Testing, you know, where it links the execution to the particular messages that they want to take out and it comes back with, “You need to change”. It actually tells you how to change the ad and is very directional in doing that.
Now, I mean, knowing the limitations of asking people questions as opposed to the way they process it, what do you think of concept testing?
Mark:
I am totally opposed to concept testing within a structured environment.
So any tool, Millward Brown or there’s a number of those proprietary tools, is by definition coming from a particular paradigm alright? And it says, “This is the way the world works”.
Well, I’m sorry, no set of 30 questions, no matter how well constructed, can actually represent all of the reactions that people have to a complex stimulus like advertising. So it’s inevitably a limited paradigm, and the things that succeed or fail through it will be very limited.
And we all know that a new ad can come out with a really left-field execution and suddenly everyone’s going, “That’s fantastic”, and as you said, that’s creativity. So conceptually, I disagree entirely with those structured tools for assessing advertising.
But even more basically, people are very, very poor at telling you what they want. They’re even relatively poor at telling you what they like. Because what they like in the one or two minutes of viewing the ad might be totally different from how they feel about it after they’ve been sitting at home for a couple of months watching this ad coming through, and they’ve got the jingle in their head and it’s become almost a friend in their lounge room.
So however you view that ad, again in a structured testing environment, they might say, “Oh, I don’t like that woman because she’s got a pink dress on”, and Millward Brown says, “Actually if you change it to a green dress, it will be much more successful”. But it might be that over time the consumer comes to love that pink dress and it becomes part of the execution.
Darren:
Yeah.
Mark:
So does that answer your question?
Darren:
Absolutely, thank you.
Mark:
We’re in agreement.
Darren:
I appreciate that because, you know, especially with a lot of the consumer goods companies that have link testing as part of their process, the number of times I’ve heard that an ad won the link testing but when it was finally made, it failed. It didn’t deliver any of the results and I go, “Because the link testing is wrong”.
Mark:
Yep.
Darren:
Concept testing is not the way to test.
Mark:
No.
Darren:
I think, you know, I share with you the view that you can actually put a whole lot of messages out there online and test which message gets a reaction. That’s real-time testing.
Mark:
Absolutely.
Darren:
And low cost and quick and very, very targeted against a particular demographic that you want to test it on. So if you’re going to do anything, that would be the way to do it.
Mark:
Yep, completely, and again that’s kind of the essence of scientific methodology, A/B testing or any of those things. You’re just putting an ad into a practical context, but it’s exactly the same thing. And sadly…
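The real-time message testing Darren and Mark land on amounts to comparing response rates between variants. A minimal sketch, using a standard two-proportion z-test with made-up click counts purely for illustration (neither of them names a specific tool or method):

```python
from math import sqrt, erf

def ab_ztest(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: did variant B's response rate
    differ from variant A's by more than chance would allow?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value via the normal CDF, Phi(x) = (1 + erf(x / sqrt(2))) / 2
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical campaign: message A got 120/2000 clicks, message B 170/2000.
z, p = ab_ztest(120, 2000, 170, 2000)
```

In practice you would also fix the sample size in advance (or correct for repeated peeking at the results), since checking the p-value continuously inflates the false-positive rate.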
Darren:
Well Mark, I’m afraid we’re out of time but it’s been terrific catching up and we’ll do it again very soon.
Mark:
Cool.
Darren:
But before you go, I’ve got a question for you, because you raised it before, and that is, what’s your favourite ad at the moment?
Mark:
Oh gosh.