What’s Changing With Google Ads Conversion Data?

In the ever-evolving world of paid media, technical shifts can have massive ripple effects on campaign performance and measurement. To help us navigate a significant recent change, we’re joined by Anastasia Braitsik, a global leader in SEO, content marketing, and data analytics. She brings her deep expertise in digital marketing to help us understand Google’s latest move to tighten conversion data rules within its Ads API and what it means for advertisers and developers. This interview will explore the practical consequences of these API changes on reporting and automated bidding, outline the necessary migration path to the new Data Manager API, and look ahead at what this signals for the future of Google’s measurement ecosystem.

When conversion imports lose context from session attributes or IP data, what specific breakdowns occur in reporting and automated bidding? Could you share an example of how this data loss could degrade a campaign’s performance over time?

It’s a critical issue that can quietly sabotage your efforts. When that richer context is stripped away, two things happen. First, conversions may be rejected outright and never recorded at all, which is obviously a disaster for reporting. Second, even when a conversion is recorded, it loses important attribution data. Imagine a Smart Bidding campaign that relies on understanding the full user journey: if it suddenly stops receiving session-level signals, it is left optimizing in the dark. Over time, the algorithm’s decisions become less and less effective because its learning is based on incomplete, degraded data, leading to wasted ad spend and a steady decline in ROI.

For developers currently using the allowlisted fields, what is a realistic migration timeline? Please describe the key technical hurdles and resource allocation challenges they should anticipate when shifting from the Ads API to the Data Manager API.

While Google is allowing existing users to continue for now, the writing is on the wall. The migration is no longer optional; it’s the required path forward, and I’d advise teams not to delay. The main technical hurdle is a two-part code update: developers must first implement the new Data Manager API to send session and IP data, and then simultaneously remove that same data from their existing Ads API calls. The biggest challenge here is resource allocation. This isn’t a simple switch-flip. It requires dedicated developer time to learn the new API, build the integration, test it rigorously to prevent data loss, and then deploy it. For teams already stretched thin, carving out the resources for this migration needs to be prioritized immediately to ensure measurement continuity.
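The two-part update described above can be sketched as a simple payload split: strip the rich-context fields out of the record bound for the Ads API, and route those same fields to the Data Manager API instead. This is a minimal illustration only; the field names (`session_attributes`, `ip_address`) and the flat-dict payload shape are assumptions for the sketch, not the exact schema of either Google API.

```python
# Hypothetical sketch of the two-part migration. The field names and
# payload shape below are illustrative assumptions, not the real
# Google Ads API / Data Manager API schema.

RICH_CONTEXT_FIELDS = {"session_attributes", "ip_address"}

def split_conversion(conversion: dict) -> tuple[dict, dict]:
    """Separate a legacy conversion record into (a) the slimmed-down
    payload that stays on the Ads API and (b) the rich-context payload
    destined for the Data Manager API."""
    ads_payload = {
        k: v for k, v in conversion.items() if k not in RICH_CONTEXT_FIELDS
    }
    data_manager_payload = {
        k: v for k, v in conversion.items() if k in RICH_CONTEXT_FIELDS
    }
    # Carry an identifier so the two records can be joined downstream.
    if "gclid" in conversion:
        data_manager_payload["gclid"] = conversion["gclid"]
    return ads_payload, data_manager_payload

legacy = {
    "gclid": "abc123",
    "conversion_action": "customers/123/conversionActions/456",
    "conversion_value": 19.99,
    "session_attributes": "encoded-token",
    "ip_address": "203.0.113.7",
}

ads, dm = split_conversion(legacy)
```

Doing the split in one place like this makes it easy to test that no rich-context field ever leaks back into the Ads API call path during the transition.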

If a developer sees a CUSTOMER_NOT_ALLOWLISTED_FOR_THIS_FEATURE error, what are the immediate, step-by-step actions they must take? Please outline the ideal process for both resolving the error and ensuring long-term measurement continuity.

Seeing that specific error is a clear signal to act now. The first, most immediate step is to temporarily remove the session attributes and IP address data from your Ads API conversion imports. This is a stopgap measure to prevent further rejections and keep your basic conversion data flowing. Concurrently, your development team must start updating the code to send this richer data through the Data Manager API instead. This is the core of the long-term fix. Once that new integration is fully built, tested, and you’ve confirmed it’s working flawlessly, you can then fully discontinue the old Ads API conversion imports. Following this phased process is crucial to avoid any blind spots in your data during the transition.
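The stopgap step can be automated: on an allowlist rejection, strip the rich-context fields and resend so basic conversion data keeps flowing while the Data Manager integration is built. In this sketch, `upload_conversions`, the dict payload shape, and the field names are hypothetical placeholders standing in for your real upload code; only the error string comes from the source.

```python
# Minimal sketch of the stopgap described above. `upload_conversions`
# and the payload field names are hypothetical placeholders, not the
# real client library.

RICH_CONTEXT_FIELDS = ("session_attributes", "ip_address")

def upload_with_fallback(upload_conversions, conversions):
    """Try the full upload once; on an allowlist rejection, strip the
    rich-context fields and resend so basic conversions keep flowing."""
    try:
        return upload_conversions(conversions)
    except RuntimeError as err:
        if "CUSTOMER_NOT_ALLOWLISTED_FOR_THIS_FEATURE" not in str(err):
            raise  # unrelated failure: surface it
        slimmed = [
            {k: v for k, v in c.items() if k not in RICH_CONTEXT_FIELDS}
            for c in conversions
        ]
        return upload_conversions(slimmed)

# Usage with a fake uploader that rejects rich payloads:
def fake_uploader(batch):
    if any("ip_address" in c for c in batch):
        raise RuntimeError("CUSTOMER_NOT_ALLOWLISTED_FOR_THIS_FEATURE")
    return {"accepted": len(batch)}

result = upload_with_fallback(
    fake_uploader,
    [{"gclid": "abc", "conversion_value": 5.0, "ip_address": "203.0.113.7"}],
)
```

Treat this fallback as temporary instrumentation: once the Data Manager path is live and verified, the retry branch should never fire, which makes it a useful canary during the phased cutover.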

Google is repositioning its Data Manager API as the central place for rich user data. What does this strategic consolidation signal about Google’s future priorities for measurement infrastructure, attribution, and user privacy signals?

This is a very telling strategic move. By creating a dedicated home for complex data payloads like session-level attributes, Google is signaling a major push towards a more specialized and robust measurement infrastructure. They are essentially separating the day-to-day campaign management functions of the Ads API from the heavy lifting of ingesting and processing rich user data. This positions the Data Manager as the long-term foundation for more sophisticated attribution and handling of privacy-centric signals. It tells us that Google is building a future where deep, nuanced data is central to its ecosystem, but it will be managed through dedicated, purpose-built tools rather than a one-size-fits-all API.

What is your forecast for Google’s API ecosystem? Do you expect to see more specialized APIs like the Data Manager, and how should development teams prepare for this ongoing trend of consolidation and specialization?

My forecast is that this trend is only going to accelerate. Google is clearly moving away from a monolithic API approach and toward a more modular ecosystem of specialized tools. We’re seeing the Ads API being refined to focus on core campaign and conversion workflows, while new APIs like the Data Manager are being spun up to handle specific, complex tasks like rich data ingestion. For development teams, the key to preparation is agility. They need to stop thinking of the Ads API as their only touchpoint and be ready to integrate with a suite of different services. This means fostering a culture of continuous learning, closely monitoring Google’s developer announcements, and building their internal systems in a way that can easily connect with new, specialized APIs as they are rolled out. It’s the only way to ensure their measurement stack remains robust and future-proof.
