Why Your Business Needs to Take Notice of Recent Data Privacy Concerns
Recent revelations that Google’s Performance Max AI may have inadvertently been showing ads to children on YouTube are concerning, not only from a data privacy perspective, given the potential breach of the Children's Online Privacy Protection Act (COPPA), but also from the perspective of advertisers wanting to maximize their returns.
For those unfamiliar with Performance Max (PMax), it’s a Google Ads product whereby you hand Google your ad budget and let its AI decide where to spend it across the Google landscape, the premise being that Google’s AI can do the hard work on behalf of advertisers. Unfortunately, PMax doesn’t offer the same level of transparency as Google’s other ad products, and that opacity has stirred up real concern.
Your business could be liable for personal data it wasn’t allowed to collect but collected regardless
Taking someone who clicks on your ad to your website is standard practice, and once there, a website typically collects as much data as possible with the aim of marketing to them and converting them into a customer. None of this is a revelation, nor is it cutting edge. What has become apparent, however, is that collecting all visitor data this way, regardless of how the visitor found your site, may open businesses up to a myriad of problems and liabilities.
Take PMax, for example: if an ad was inadvertently shown to a child who clicked through to your website, and that child’s data was automatically collected, then, depending on the jurisdictions involved, you may be breaching data privacy laws such as COPPA.
To be clear, this is not simply a PMax issue, nor is it simply a Google issue. Advertisers need to take steps to distinguish between the data they can and cannot collect, and that distinction is not easy to manage.
Unfortunately for advertisers, placing the blame on a third party for sending you the wrong types of visitors may not be enough. A choice was made to advertise on a platform, and a choice was made to automatically collect and retain data. Should an authority look into a business’s data collection practices, ignorance will not be bliss! Pragmatically, it’s unlikely that any business could ascertain with 100% confidence which visitors it can and cannot collect data on, but at the very least, businesses should demonstrate, through their processes and technology, that they take the matter seriously and are making a real effort to do so.
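What might that effort look like in practice? As a rough sketch only (this is not TrafficGuard’s implementation, and helpers such as classifyVisit and loadMarketingTags are hypothetical), a site could look at how a visit arrived, for instance the referrer and the gclid click identifier that Google Ads appends when auto-tagging is on, before deciding whether to load its marketing tags at all:

```typescript
// Illustrative only: classify the inbound visit before any tags fire.
// classifyVisit and loadMarketingTags are hypothetical helpers, not a real API.

type VisitClass = "collect" | "withhold";

function classifyVisit(url: URL, referrer: string): VisitClass {
  const params = url.searchParams;
  const cameFromAd = params.has("gclid") || params.get("utm_medium") === "cpc";
  const cameFromYouTube = referrer.includes("youtube.com");

  // Example policy only: withhold collection for ad-driven YouTube traffic
  // until further signals (engagement, consent) raise confidence in the visit.
  return cameFromAd && cameFromYouTube ? "withhold" : "collect";
}

function loadMarketingTags(): void {
  // Inject your analytics and retargeting scripts here (e.g. via a tag manager).
  console.log("marketing tags loaded");
}

if (classifyVisit(new URL(window.location.href), document.referrer) === "collect") {
  loadMarketingTags();
}
```

Even a simple gate like this turns data collection into a deliberate decision rather than a default, which is the kind of demonstrable effort described above.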
You wouldn’t want this data anyway
The recent concerns around how PMax’s AI operates lead us to another conundrum that every advertiser needs to consider: whether this data is helpful or damaging. If it turns out PMax is indeed advertising to children, and those children click on ads more often than adults (a plausible scenario when you consider accidental clicks or children engaging with ads they don’t understand), logic suggests the algorithm would recognize this high level of engagement and start optimizing towards that audience. It’s entirely possible that this AI behavior could kick off a negative feedback loop that wastes ad budget on advertising to more children rather than your actual target market, all the while believing it’s improving your return on ad spend (ROAS).
Moving further downstream, these children are now visiting your website, only to hit the back button or close their browser window. Your website has now collected their data, adding them to the list of visitors to retarget, diluting genuine visitor data, reducing confidence in your analytics and ultimately damaging your ROAS. Ethical and liability implications aside, advertisers should go to great lengths to avoid this scenario as it undermines everything they’re striving to achieve around transparency, focus, insights and advertising investment. You really don’t want this data.
TrafficGuard’s Data Collection Filter
This scenario is what drove TrafficGuard to build its Data Collection Filter (DCF): a decisioning engine that sits at the very beginning of your website’s data chain and decides whether or not inbound visitor data is allowed to propagate further downstream and be collected by your marketing and analytics tools.
Website visitors won’t experience any difference once DCF is set up. DCF doesn’t change where your ads are shown and doesn’t stop visitors from arriving at your site; it simply stops visitors who don’t meet a confidence threshold from having their data collected by your downstream marketing and analytics tools, reducing your exposure to visitor data you don’t want and improving the quality of the visitor data you do.
Right now, DCF automatically filters out data from visitors redirected from YouTube who don’t go on to engage with your website, for example those who press back or close their browser window. If a YouTube visitor begins to meaningfully engage with your site, DCF allows their data to be collected downstream on the premise that they’re a genuine visitor.
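To make the idea concrete, here’s a minimal sketch of that kind of engagement gate in browser TypeScript. It makes no claims about TrafficGuard’s actual implementation: the engagement signals, the ten-second threshold and helpers such as track and flushToAnalytics are illustrative assumptions only.

```typescript
// Illustrative only: hold analytics events for YouTube-referred visitors until
// they show a genuine engagement signal, then release them downstream.

type AnalyticsEvent = { name: string; payload: Record<string, unknown> };

const fromYouTube = document.referrer.includes("youtube.com");
let engaged = !fromYouTube;            // non-YouTube visitors are collected as normal
const buffer: AnalyticsEvent[] = [];

function flushToAnalytics(events: AnalyticsEvent[]): void {
  // Forward to your real marketing and analytics tags (e.g. via a tag manager).
  events.forEach((e) => console.log("collect", e.name, e.payload));
}

function track(event: AnalyticsEvent): void {
  if (engaged) {
    flushToAnalytics([event]);
  } else {
    buffer.push(event);                // hold until the visit earns our confidence
  }
}

function markEngaged(): void {
  if (engaged) return;
  engaged = true;
  flushToAnalytics(buffer.splice(0));  // release everything held so far
}

// Treat a click, scroll, keypress or ten seconds on the page as meaningful engagement.
["click", "scroll", "keydown"].forEach((signal) =>
  window.addEventListener(signal, markEngaged, { once: true, passive: true })
);
window.setTimeout(markEngaged, 10_000);

// A visitor who bounces before engaging never has their buffered events
// forwarded, so nothing about them reaches your downstream tools.
track({ name: "page_view", payload: { path: location.pathname } });
```

A YouTube visitor who bounces straight away is never collected, while one who clicks, scrolls or stays around is treated as genuine and tracked as normal.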
DCF is a new product and we have big plans for it. TrafficGuard’s data scientists and engineers have spent years building industry-leading technology that prevents outright ad fraud, misattribution of (often organic) customers and low-value traffic in general. We’ve saved businesses countless millions of dollars in ad spend, and we intend to apply that know-how to the very real problem of deciding whether a business should collect data on a site visitor at all, helping businesses adhere to data privacy regulations while improving the quality of their future marketing campaigns.
Mathew Ratty, Co-Founder and Chief Executive Officer at TrafficGuard
Get started - it's free
You can set up a TrafficGuard account in minutes, so we’ll be protecting your campaigns before you can say ‘sky-high ROI’.
Subscribe
Subscribe now to get all the latest news and insights on digital advertising, machine learning and ad fraud.