
Your Attribution Model May Be Data Driven – But It’s Still Not the Right Answer

Future-proofing outdated practices is the best way forward


The concept of attribution had noble and necessary beginnings – entering the marketing landscape like a newly anointed royal ready to wave away our collective blindness to which ads were working and which were not. Even with its rules-based and/or data-driven origins, "attribution" has always been a misnomer. While it has been a reasonable success at understanding basic ad performance, it is also one of the greatest distractions of our time.

Our industry-wide obsession with building complex multi-touch attribution (MTA) models became an unnecessary science and essentially wasted energy for the ecosystem. This isn't to say that MTA doesn't have a place in modern measurement toolboxes; it does. But it cannot be the sole focus, or be treated as an unchallengeable oracle of results.

So, what's at the root of this disconnected distraction, and what's the solution? There are a few things to understand. The way forward is more "back-to-basics."

The Walled Gardens Have Gotten Worse


For one, marketers tend to choose the path of least resistance to reach scale, so they can't help but continue to rely on the giants. But there's no connecting the dots on data in an environment where walled gardens have become walled forests. It's time to accept that there's simply not enough data forthcoming, and any data one may glean is questionable at best.

For instance, even Google stopped using impressions as factors in its MTA models – quietly doing so a few years ago, without the fanfare it usually puts into major announcements. Other critical data has been hidden inside data safe havens, inaccessible without further reliance on the partners themselves. The trap on the path speaks for itself.

Everyone Thinks They Are Doing What Is Best for Consumers


All of this has been further exacerbated by numerous legal and privacy-focused factors. While there's no doubt that the industry has taken advantage of unconsented data for years, there is also a healthy appetite among consumers to engage with content and materials that speak directly to them.

There are also companies that see this as an opportunity to insert themselves into the discussion in the name of privacy and protection, but behind the scenes simply use it as a way to pad their own wallets. As an example of this inherent conflict, Apple positions itself as a champion of the people while conveniently upselling privacy features as a product to a very captive audience.

Therefore, the data, insights, and actionability that come from these giants are rife with gaps. Trying to extract meaningful information to develop a custom solution creates new gaps and challenges instead of solving existing ones. Optimizing against this flawed data in a vacuum is not a logical way to chart the course.

As we accept this reality, we can and should start to exorcise these legacy notions that channel X is better than channel Y, or that the upper funnel is not valuable to driving business. We see successful companies whose media mix hinges primarily on a walled garden partner, and others that are able to diversify their spend into new and uncharted spaces.

No matter which way you go, the common thread is that savvy marketers have started to accept that there's no perfect measurement solution, and that both big-"A" and small-"a" attribution solutions play a specific role in their media success rather than serving as the ultimate decision-making function.

The A-Ha Moment: The Workaround Is Not a Workaround


Guess what? The workaround that this conflicted situation calls for is a familiar best practice – "test and learn" – but with an update to an old favorite. Testing and learning today and into the future requires marketers to embrace the concept of incrementality: understanding "the unseen" factors and focusing ultimately on the customer journey.
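To make "the unseen" concrete, here is a minimal sketch of how an incrementality read might come out of a simple holdout test: withhold a control group from the channel, compare conversion rates, and treat the difference as the channel's incremental contribution. The function name, group sizes, and conversion counts are hypothetical illustrations, not figures from any specific platform or the author's own tooling.

```python
# Minimal sketch: estimating incremental conversions from a holdout test.
# All numbers below are hypothetical placeholders.

def incremental_lift(test_users, test_conversions, control_users, control_conversions):
    """Estimate incremental conversions and relative lift from a simple holdout split."""
    control_rate = control_conversions / control_users
    # Conversions the exposed (test) group would likely have produced with no exposure
    baseline = control_rate * test_users
    incremental = test_conversions - baseline
    relative_lift = incremental / baseline if baseline else float("inf")
    return incremental, relative_lift


if __name__ == "__main__":
    # Hypothetical test: 500k exposed users vs. 400k held-out users
    incremental, lift = incremental_lift(500_000, 6_500, 400_000, 4_400)
    print(f"Incremental conversions: {incremental:.0f}")   # ~1,000
    print(f"Relative lift vs. baseline: {lift:.1%}")        # ~18.2%
```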

Additionally, understanding how those results are applied is a practice that back-end teams regard as a kind of "calibration-oriented" measurement. We know that the truth of performance is grayer, living somewhere between what Facebook says it produced and what your attribution solution is able to credit. Of course, this approach may not guarantee a right, let alone perfect, answer – but it brings you far closer to one than trusting either number on its own.

At the core of the solution is the need to identify the different measurement tools you have available and ensure they are defined properly. Not every company will have the resources or aptitude to stand up every capability, and that's the beauty of this calibration approach: an always-on test-and-learn practice supplements whatever tools you do have, focusing attention on the gaps and key learnings needed to make a decision.

As a practical example, most organizations struggle to value upper-funnel channels like display and social, especially given the dynamics highlighted earlier in this article. That data is hard to get at, if it can be accessed at all, and requires a lot of lifting to assemble once accessed. Teams often decide to cut that budget because the value is unseen, which prompts pushback from marketing, using UI reporting to say, "Look, it does work!"

So, the solution is to orchestrate an incrementality test, determine the lift, and then take those results and either work them into the UI reporting to temper it toward a more accurate representation (this is the correlation analytics) or layer them on top of an attribution solution to highlight the true impact. Often this takes the form of a multiplier or a specific value, but ultimately it drives a better understanding of performance so you can make the right decision, not just the knee-jerk one.
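As a sketch of what that multiplier step might look like in practice, the lift measured in testing can be expressed as a per-channel factor and applied on top of what the attribution solution or platform UI reports. The channel names, credited conversion counts, and multiplier values below are hypothetical, not outputs of any particular tool.

```python
# Minimal sketch: calibrating attributed conversions with incrementality multipliers.
# All channel names and figures are hypothetical placeholders.

# What the platform UI / attribution solution credits to each channel
attributed_conversions = {"paid_social": 12_000, "display": 8_000, "search": 20_000}

# Multipliers derived from incrementality testing: measured incremental conversions
# divided by the conversions the same channel was credited during the test window
incrementality_multipliers = {"paid_social": 0.45, "display": 0.30, "search": 0.85}

calibrated = {
    channel: conversions * incrementality_multipliers.get(channel, 1.0)
    for channel, conversions in attributed_conversions.items()
}

for channel, value in calibrated.items():
    print(f"{channel}: reported {attributed_conversions[channel]:,} -> calibrated {value:,.0f}")
```

The point of a sketch like this is not precision; it is simply to keep the incrementality signal attached to the reported numbers so budget decisions reflect measured lift rather than the platform's self-graded homework.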

This moment for the modern marketer is all about coming to terms with reality: Attribution has never been a silver bullet. The fact is, if you're not measuring incrementality, if you're not testing and obsessively focusing on calibration, you really have no sense of what is going on in your business. This is especially true if you want a more omnichannel, holistic picture of your business's success or challenges via marketing.

Truly evolved measurement leaders can help their organizations think beyond the numbers and understand the factors of connectivity as much as the base calculations themselves. That's because the true outcome of any of these measurement tools is the valuable action it pulls all the way through, not the hollow number.


The views and opinions expressed are solely those of the contributor and do not necessarily reflect the official position of the ANA or imply endorsement from the ANA.


Jordan Cardonick is VP of analytics and technology at New Engen.
