Dear MBD reader,
I am here with you today to stand on a soapbox.
You have clicked on this article titled "The Science of Marketing".
And I can be fairly sure in assuming ...
That you are expecting something certain, specific, and measured about how to create outcomes with marketing.
That's the subtext that "science" tends to communicate.
Measured, accurate, precise, known.
BUT
That is not what science is.
(and, turns out, that's not what marketing is either)
So I'm here today, on my soapbox, to try to disabuse you of the notion that the marketing campaigns/ideas you are hoping to implement are at all certain.
Those who have contact with reality, who regularly engage the field[1], already know this, whether consciously or not.
That any new marketing is an experiment.
It's a hypothesis which is put to the test in reality.
Yet
(And this is the reason I'm on my soapbox)
Time and time again I see people directly influenced by marketers who would make you believe that the outcome they are promising is going to happen for sure. They might not go so far as to make that complete claim, but they sure would like you to believe it.
And you would probably like to believe it as well.
After all, it's much easier to spend money on a marketing campaign if you know the outcome is going to be what you want.
However ... your expectation and reality are unlikely to meet.
And if they DO meet, it creates a further problem.
Imagine this:
You want to achieve a 20% revenue increase in the next 3 months through email, expanding your email strategy in a way you've never done before.
One person makes a proposal, and says that it's really not clear whether you'll get that in the timeline you want.
Another person makes a proposal, and tells you that you'll absolutely get the result you want within 3 months (they might even throw in a nice-sounding money-back guarantee to tip you over).
You choose the second, because they sound more certain about what they are doing.
One of two things is going to happen.
Either,
You won't get the result you want in the 3 months, and you'll be quite displeased with the person who told you it would happen. You'll probably let them go and be back to starting from scratch.
Or,
You DO get the result you want in 3 months, and you think the person you picked is an absolute genius, and now you pay them a lot more to do a lot more.
I'll preface the conclusion by admitting that, even after reading the argument, your thoughts are probably still: "Yeah, if someone gets the results they say they're going to get, of course they're good! Of course I'm going to trust them!"
The problem here is our built in biases.
We are highly likely to trust and believe someone if they demonstrate beforehand that they know what is going to happen (this is a core component of Allegiance Capital[2], by the way - we just don't use it in a scummy way).
BUT, someone can have a long string of accurate guesses through pure luck.
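To see how easy it is for luck to look like skill, here's a toy simulation (the numbers are made up purely for illustration):

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

marketers = 1_000  # a hypothetical crowd of predictors
calls = 8          # each one "calls" the outcome of 8 campaigns

# Each call is a pure coin flip: right half the time, wrong half the time.
perfect_records = sum(
    all(random.random() < 0.5 for _ in range(calls))
    for _ in range(marketers)
)

print(f"{perfect_records} of {marketers} called all {calls} outcomes "
      f"correctly by luck alone (expected: about {marketers / 2**calls:.0f})")
```

On average, about four of the thousand will have a spotless eight-for-eight record - and those four are exactly the ones whose track record looks most convincing.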
So we have to be clear about what really matters in The Science of Marketing for our own businesses.
Before there is any data, there is no reasonable way to draw a conclusion about what will happen. The person promising a specific outcome prior to data is either delusional and doesn't realize the uncertain nature of applying complex systems to new environments ...
Or they know exactly what they are doing and they are gambling that they will be correct, knowing that if they are right, you'll trust them for a long time.
Any conclusion about the outcome in that state is useless (and may actually be harmful).
So,
If you go into a new marketing campaign, test, etc. (where you don't have any of YOUR historical data to draw on), with the expectation of a particular outcome, you are either going to be pleased or disappointed. BUT none of that has any bearing on the legitimacy and usefulness of the actual marketing.
If you are expecting to do X and get Y outcome ... run Facebook ads and get 5x ROI on your ad spend (for example) ... and you have no personal data to back your expectation ... you are stepping into an experiment with a gambler's mindset.
It's easy to look at marketing campaigns, tactics, and copywriting tools, see what's worked for other people, even see what's worked repeatedly, and say ...
"Do X to get Y outcome"
THE PROBLEM IS ...
There's another variable in the equation.
YOU.
Everything about who you are, your business, how it runs, how you do things, why you do things, who you speak to, why you speak to them, etc., can take a perfectly effective strategic and tactical application - proven effective in other similar businesses and scenarios - and produce completely opposite results.
At least, initially.
The only way you can have certainty in an outcome of a marketing campaign (any marketing test, an ad, a line of text, an email, etc), is if you've done it before.
Certainty comes from data.
Data comes from action.
This is why, in The Guardian Academy, we talk about using The Rear View Mirror[3].
Because when we seek certainty of a particular outcome, the only way to have that certainty is to look back at ways you've achieved that outcome before, and repeat what you did.
HOWEVER
It's also not infallibly certain.
Because even though you've done something before, you may be different now (and the context you're in may also be different).
This is all very conceptual, but also quite practical.
When you are looking to apply something new (like a new approach to email, or a new approach to paid ads), accepting that the outcome can't be predicted puts you in a much better place to leverage and properly value the effort and the work being put in.
This is why Laurel says "90 Days to Data[4]." Because when you start out with any system, you don’t know what that system is actually going to do for you. Even if you predict good results because you've seen other people do the same.
This is why we talk about being your own control.
None of this is to say that you shouldn't try out strategies and tactics which appear to work well for other people in other businesses.
It's to say that if you choose to do that, recognize you have no historical base data from YOURSELF to know how it's all going to turn out.
You must put into action your ideas specifically to collect the data, not to get the outcome.
Because it's from that data that you will form more accurate assumptions and ideas, which you will then put into action to get further data, from which you will form even more accurate assumptions and ideas.
That's the learning process.
Observe
Reflect
Engage
Reflect
Repeat
That's The Science of Marketing.
Make a hypothesis - that a certain ad campaign may give you the ROI and the type of engaged customer that you want - and then put it into action long enough to gather data to see what happens. Then with that data, start from the beginning with your campaign idea. Modify. Beat established controls.
Rinse. Repeat.
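If it helps to make that loop concrete, here's a minimal sketch of the decision logic in Python (the function name, the sample threshold, and the example numbers are all hypothetical placeholders - in practice, the data comes from your own ad platform or email tool):

```python
import statistics

def evaluate_test(control_rates, variant_rates, min_samples=30):
    """Decide what to do with a campaign test, using only observed data.

    Both arguments are lists of observed conversion rates (one per day,
    per send, per ad set, etc.). min_samples is a made-up stand-in for
    "run it long enough to gather real data before concluding anything".
    """
    if len(variant_rates) < min_samples:
        return "keep gathering data"  # no conclusion before the data exists
    if statistics.mean(variant_rates) > statistics.mean(control_rates):
        return "variant beats the control - it becomes the new control"
    return "control wins - revise the hypothesis and test again"

# Five data points isn't enough to conclude anything yet:
print(evaluate_test(control_rates=[0.021, 0.019, 0.020],
                    variant_rates=[0.025, 0.030, 0.018, 0.022, 0.027]))
```

The point isn't the statistics (a real test would want a proper significance check, not a bare comparison of means); the point is that the decision comes after the data, never before.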
The only thing you should ever expect is data - and that those working with the data analyze it well and put it to good use.
Now, if you'd like some useful and effective campaigns,
Which you can implement NOW, to start testing and gathering data ...
The Cash Now Campaigns outlined in the blitz workshop are a great place to start.
These techniques are a fantastic and effective way to *test the field,* to enact *real* scientific marketing, with tight feedback loops and data gathering which can help you get to the outcomes you desire much sooner.
Be Useful. Be Present. Love the Journey.
Joseph Robertson, CMO Man Bites Dog
Ready to Step Into The Arena?
Ready to engage the field? Man Bites Dog paid subscribers have comment access unlocked below. (They’re also sent a killer welcome package in the mail with all kinds of opportunities that are not available in digital format)
Here are some other options:
Get on the waitlist to join the Arena: engagethefield.com
Check out the Engaging The Field Handbook
Grab your own copy of the R3 system (it’s a book and it’s not cheap)