The Best Attribution Uses a Surprising Secret Sauce You Need to Know
With the move away from third-party cookies, some predict the end of multi-touch attribution. But to borrow a famous remark from Mark Twain, reports of MTA's death are greatly exaggerated.
Cookies, of course, revolutionized marketing attribution models, making it possible to track each step along the customer journey in the digital world. With the rise of MTA, marketers gained a more sophisticated way to understand, learn from, and even optimize campaigns in-flight, ensuring the highest possible return for their efforts. And when combined with the more traditional insights of marketing mix modeling (MMM), marketers had a holistic view of measurement across all on- and offline channels, including closed digital platforms.
Consumer concerns over data privacy have led to the decline of third-party cookies and mobile ad IDs, leaving advertisers fretting about the future of attribution and marketing analytics. The statistical methods of MMM have evolved to provide insights across platforms but lack the user-level measurement marketers desire and don't provide the granularity and timeliness required for proactive campaign optimization.
Rest assured. MTA is here to stay, thanks to advances in data and analytics. Providing attribution in a privacy-friendly way is critical as more publishers become closed digital platforms and there is a growing scarcity of quality consumer data.
Even as technology creates challenges in data privacy, it can also be part of the solution. Recent innovations in data science have enabled new ways of thinking about privacy.
One such approach, known as differential privacy, brings a rigorous mathematical process to protecting confidential information about individuals while maintaining sufficient data accuracy for high-precision analytics.
In its simplest sense, differential privacy is a process that protects individuals' privacy by injecting a small amount of random data, or "noise," into queries over a dataset. The injected noise prevents anyone from identifying with certainty any user housed within the dataset. In other words, when a differentially private algorithm is used, one cannot tell whether an individual's data was included in the original dataset or not. Using such differential privacy principles, advertisers get an approximation of the answers they need to perform a measurement analysis or deliver a targeted ad, without exposing any user-level information.
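To make the idea concrete, here is a minimal sketch of the classic Laplace mechanism applied to a counting query. The function names, the example data, and the epsilon values are illustrative, not part of any particular product:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon=1.0):
    """Answer "how many records match?" with differential privacy.

    A counting query has sensitivity 1 (adding or removing one person
    changes the true count by at most 1), so Laplace noise with scale
    1/epsilon yields epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical exposure log: True = this user saw the ad.
exposures = [True] * 150 + [False] * 50
noisy = private_count(exposures, lambda saw: saw, epsilon=0.5)
# 'noisy' hovers near the true count of 150, but the published answer
# never reveals whether any one user's record was in the dataset.
```

Smaller epsilon means more noise and stronger privacy; larger epsilon means more accuracy. The marketer tunes that trade-off per query.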
Advances in privacy are needed because traditional safeguards, such as anonymization (the removal of identifiable attributes like names, addresses, and social security numbers), have been found to be ineffective. In one glaring example, Harvard University professor Latanya Sweeney bought a year's worth of supposedly anonymized patient hospital discharge data from Washington state. She and her team re-identified 43% of 81 samples by comparing the information to news blotters, according to an article by Inside Digital Health.
Differential privacy and other modern privacy-preserving methodologies emerged from a long line of work applying algorithmic ideas to sensitive data. The technique is now used by Apple, Uber, and other big tech companies to protect customer data. Differential privacy is a game-changer for MTA because it gives marketers access to valuable user data in a privacy-safe way, including from hard-to-reach places like closed digital platforms. Advertisers have a love-hate relationship with these platforms. They have changed the way marketers interact with leads and customers, bringing brand recognition and loyalty to new heights. But the platforms also withhold much of the user data necessary to show relevant ads, assess the accuracy of campaigns, and maximize revenue.
A modern marketing analytics infrastructure needs best-in-class data assets across all channels, especially closed digital platforms, which capture just under 70% of digital ad dollars, according to eMarketer. Gaining access to the marketing insights held by major platforms with large repositories of first-party data requires these enhanced privacy methods.
Neustar MTA pioneers a unique privacy methodology that leans on the principles of differential privacy. Our advanced data science approach leverages our leading consumer identity graph to provide highly accurate MTA without requiring individual-level advertising impression data. People are grouped based on advertising exposure to avoid sharing user-level data. For example, advertisers will know that 150 out of 200 people saw an ad on a social media platform and clicked through to their website, but not which 150 people. This ensures brands can measure advertising performance across closed media platforms in a privacy-centric way without relying on third-party cookies and MAIDs.
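As an illustration only (not Neustar's actual implementation), the 150-of-200 example above can be sketched as a simple exposure-based aggregation, where user IDs are consumed on the way in but never appear in the output, and cohorts below a minimum size are suppressed:

```python
from collections import Counter

def cohort_counts(events, min_cohort_size=100):
    """Aggregate user-level exposure events into cohort-level counts.

    events: iterable of (user_id, exposure_path) pairs, where
    exposure_path is a tuple such as ("social_ad", "click").
    Returns a count of distinct users per exposure path, suppressing
    any cohort smaller than min_cohort_size so small groups cannot
    be singled out. User IDs never appear in the output.
    """
    # Deduplicate so each user counts once per path, even with repeat events.
    distinct = {(uid, path) for uid, path in events}
    per_path = Counter(path for _uid, path in distinct)
    return {path: n for path, n in per_path.items() if n >= min_cohort_size}

# Hypothetical log: 150 users saw the ad and clicked through,
# 50 users saw it without clicking.
log = [(f"u{i}", ("social_ad", "click")) for i in range(150)] + \
      [(f"u{i}", ("social_ad",)) for i in range(150, 200)]

print(cohort_counts(log))
# → {('social_ad', 'click'): 150}
# The advertiser learns that 150 users saw-and-clicked, but not
# *which* 150; the 50-user cohort is suppressed as too small.
```

The minimum cohort size plays the same role as the 100-user floor mentioned below: an aggregate is only released once enough users share the same exposure pattern.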
Neustar is also working with the largest closed digital platforms to build measurable cohorts of users (100 or more) defined by similarities in ad exposure. These cohorts are synced with Fabrick, which enables Neustar to incorporate exposure data directly into our multi-touch attribution analytics.
The result is a powerful attribution solution that brands can count on for the granular and holistic user-level analytics they need to effectively measure across their entire marketing mix -- inclusive of closed digital platforms -- in today's privacy-first world. No more blind spots. No more wasted media spend.
So we can stop planning MTA's funeral over the loss of third-party cookies. Truth be told, you can get a complete picture of your customers' activities in a privacy-preserving way. And as Twain also said, "If you tell the truth, you don't have to remember anything."