Why Your Adobe Analytics and CJA Data Don't Match

ADOBE ANALYTICS · CUSTOMER JOURNEY ANALYTICS (CJA)

Pradeep Jaiswal

1/5/2026 · 5 min read

Because the underlying data models differ, and because data is transformed at different times and in different ways, perfectly matching numbers should not always be expected.

Here are the most common reasons for variance between Adobe Analytics and CJA data collection metrics.

  1. Date range: It sounds obvious, but mismatched date ranges (or inclusive/exclusive boundaries) are a frequent cause. Time zone differences make this even trickier, because the “same” calendar day in AA may not equal the same day in CJA. Always validate the exact start/end timestamps and time zone used.

  2. Segments/filters: AA segments and CJA filters may look similar but can behave differently due to sessionization, identity, and how components are defined in the Data View. Also, filters can be applied at multiple levels (panel, visualization, Data View) and stack unexpectedly. One extra filter (like excluding bots/internal traffic) can explain a large discrepancy.

  3. Time zone differences: AA uses the Report Suite time zone, while CJA uses the Data View time zone. If these don’t align, sessions and daily totals can move between dates (especially around midnight). This typically shows up as “same week totals match, but day-by-day doesn’t.”

  4. Lift/shift vs. optimization: A Web SDK migration is rarely a 1:1 “copy” of the old AppMeasurement setup. Events may fire at different moments, with different conditions, or through Edge Network rules, changing what gets collected. Even small tagging changes can shift page views, clicks, and conversion counts.

  5. Graph-based stitching: CJA can stitch identities (ECID, login ID, CRM ID, etc.) using an identity graph and lookback settings. That can merge multiple devices into one person, impacting people counts, session counts, and attribution. AA is more commonly constrained to cookie/device-level identity unless you’ve implemented cross-device features similarly.

  6. Session timeout: Default ~30 minutes. Session definitions can differ between AA and CJA (e.g., inactivity timeout, how new sessions start, campaign resets). If CJA Data View sessionization settings aren’t aligned to AA, you’ll see differences in visits/sessions and session-based metrics. This also affects funnels and conversion rates.

  7. Bot filters/rules: AA may apply IAB bot rules and any configured bot filtering at processing time. CJA relies on Data View / dataset handling for bot exclusion, which might not match AA settings. If one side filters more aggressively, page views and visits can diverge noticeably.

  8. Internal IP filters: AA often excludes employee/office traffic via IP filters in the report suite. In CJA, you may need separate dataset filtering, Data View filters, or segmentation to achieve the same exclusion. If only one system excludes internal traffic, totals will differ.

  9. Processing rules vs derived fields: AA Processing Rules transform incoming variables before they’re stored (e.g., set eVar from query string, normalize values). CJA “Derived fields” and Data View components can replicate some logic, but not always in the same way or at the same stage. Any mismatch in transformation logic creates different dimension values.

  10. VISTA rules: VISTA rules in AA can do powerful server-side rewrites (persisting values, reclassifying hits, custom logic). If your AA implementation relied on VISTA, CJA won’t automatically inherit those transformations. Unless you recreate the logic upstream (ETL / AEP / Edge), CJA will reflect the raw(er) data.

  11. Consent tracking: Consent setups can differ between legacy AA tagging and Web SDK/AEP consent models. If one implementation suppresses hits/events under “no consent” more often, you’ll see gaps in page views and conversions. It can also affect identity stitching if IDs aren’t set when consent is denied.

  12. Variable allocation/expiration: AA eVars use allocation (original/last/linear) and expiration (visit/time period/purchase/etc.) at processing time. CJA typically uses attribution models + lookback windows in Workspace/Data View, which can lead to different “credit” distribution. Even if event totals match, attributed conversions by channel/campaign can differ.

  13. Currency conversion: AA may convert currency during processing based on configured settings and rates at that time. CJA can rely on dataset values or configured conversions that may be applied differently (often closer to reporting/query time). If exchange rate sources/timing differ, revenue won’t match exactly.

  14. Order ID deduplication: AA can deduplicate purchases using Purchase ID (Order ID) logic to prevent double counting. In CJA, deduplication depends on how the event dataset is modeled and whether you’ve implemented/standardized an order identifier consistently. If dedupe logic isn’t equivalent, purchase and revenue counts can diverge.
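
The time-zone effect from item 3 can be sketched in Python. This is a hypothetical illustration of day-boundary shifting, not Adobe's actual processing logic; the time zones chosen are assumptions for the example.

```python
from datetime import datetime, timezone, timedelta

# Hypothetical event collected at 03:30 UTC.
event_utc = datetime(2026, 1, 5, 3, 30, tzinfo=timezone.utc)

# Assume the AA Report Suite uses US Eastern (UTC-5 in January)
# while the CJA Data View uses UTC.
report_suite_tz = timezone(timedelta(hours=-5))  # AA side (assumed)
data_view_tz = timezone.utc                      # CJA side (assumed)

# The same event lands on different calendar days in each system.
aa_day = event_utc.astimezone(report_suite_tz).date()
cja_day = event_utc.astimezone(data_view_tz).date()

print(aa_day, cja_day)  # 2026-01-04 2026-01-05
```

Daily totals disagree for events near midnight, but a full week's total still matches, which is exactly the "week matches, day-by-day doesn't" symptom described above.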
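
The inactivity-timeout sessionization from item 6 can be sketched as a simple gap-based algorithm. This is a minimal illustration of the general concept, not how AA or CJA actually compute visits (campaign resets and other session-start rules are omitted).

```python
from datetime import datetime, timedelta

def sessionize(timestamps, timeout=timedelta(minutes=30)):
    """Assign a session number to each hit: a new session starts
    whenever the gap since the previous hit exceeds the timeout."""
    sessions = []
    session_id = 0
    prev = None
    for ts in timestamps:
        if prev is not None and ts - prev > timeout:
            session_id += 1
        sessions.append(session_id)
        prev = ts
    return sessions

hits = [
    datetime(2026, 1, 5, 9, 0),
    datetime(2026, 1, 5, 9, 20),  # 20-min gap -> same session
    datetime(2026, 1, 5, 10, 0),  # 40-min gap -> new session
]
print(sessionize(hits))                              # [0, 0, 1]
print(sessionize(hits, timeout=timedelta(hours=1)))  # [0, 0, 0]
```

The same hits produce two sessions under a 30-minute timeout but only one under a 60-minute timeout, which is why mismatched timeout settings alone can move visit counts.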
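
The attribution difference from item 12 can be sketched by comparing last-touch and linear credit over the same journey. This is a generic illustration of attribution models, not Adobe's implementation; the channel names and revenue figure are made up.

```python
def attribute(touchpoints, revenue, model="last"):
    """Split conversion revenue across channel touchpoints.
    'last' gives all credit to the final touch; 'linear' splits evenly."""
    credit = {}
    if model == "last":
        credit[touchpoints[-1]] = revenue
    elif model == "linear":
        share = revenue / len(touchpoints)
        for ch in touchpoints:
            credit[ch] = credit.get(ch, 0) + share
    return credit

journey = ["email", "search", "search", "display"]
print(attribute(journey, 100.0, "last"))    # {'display': 100.0}
print(attribute(journey, 100.0, "linear"))  # email 25, search 50, display 25
```

Total revenue is identical under both models, yet per-channel credit differs completely, which is why channel and campaign reports can diverge even when event totals match.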
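
The order-ID deduplication from item 14 can be sketched as "keep the first purchase per order ID." This is a simplified illustration of the general dedupe concept, not AA's purchase-ID logic; the event records are hypothetical.

```python
def dedupe_orders(events):
    """Keep only the first purchase event per order ID. Events with
    no order ID are kept as-is, which itself can inflate counts."""
    seen = set()
    kept = []
    for e in events:
        oid = e.get("order_id")
        if oid is None or oid not in seen:
            if oid is not None:
                seen.add(oid)
            kept.append(e)
    return kept

events = [
    {"order_id": "A100", "revenue": 50.0},
    {"order_id": "A100", "revenue": 50.0},  # page reload resends purchase
    {"order_id": "A101", "revenue": 20.0},
]
deduped = dedupe_orders(events)
print(len(deduped), sum(e["revenue"] for e in deduped))  # 2 70.0
```

If one system applies this logic and the other counts the raw events, purchases and revenue diverge by exactly the duplicated orders.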

If you’re seeing differences between Adobe Analytics and CJA during (or after) migration, it doesn’t automatically mean one platform is “wrong.” In most cases, the variance comes from intentional differences in data modeling and processing: identity stitching, sessionization, attribution, filtering, and transformation rules that are configured differently across a Report Suite vs. a CJA Data View.

The best way to troubleshoot is to standardize your comparison: align time zones, date ranges, bot/internal filters, session settings, and consent behavior; then validate a small set of “anchor metrics” (hits/events/orders) before moving to attribution-heavy views (channels, campaigns, entry/exit, etc.). Once those foundations match, any remaining gaps usually point to specific implementation or configuration differences you can address.

Finally, treat the migration as an opportunity to optimize and future-proof measurement, not just replicate legacy reports. Document what changed, why it changed, and which system should be the source of truth for each KPI, so teams can trust the numbers and move forward with confidence.

Have you migrated from Adobe Analytics to Adobe Customer Journey Analytics (CJA), or are you in the process of migrating? Have you noticed that metric numbers or dimension values in CJA don’t match your Adobe Analytics data? You are not alone.

Adobe Analytics (AA) reports primarily from a single report suite where data is processed at collection time using AA-specific concepts (eVars/props, processing rules, VISTA, bot rules, visit definitions, etc.).

Adobe Customer Journey Analytics (CJA) is built on Adobe Experience Platform datasets and reports through a Data View where many things (sessionization, attribution, identity stitching, some conversions) are configured at query/reporting time.

Migrating from Adobe Analytics to Customer Journey Analytics is more than just a technical shift; it’s an opportunity to future-proof your data strategy, unlock deeper insights, and align your measurement with modern customer journeys. But as you’ve seen, even small misalignments in sessionization, identity stitching, or attribution can lead to frustrating discrepancies, and lost confidence in your data.

At Shiftlytic, we specialize in Adobe MarTech solutions, helping brands like yours seamlessly transition to CJA, validate data accuracy, and optimize for actionable insights. Whether you’re struggling with metric variances, need help aligning your Data Views, or want a strategic review of your migration plan, our team of Adobe-certified experts is here to guide you.

Don’t let data discrepancies slow you down. Contact us today to schedule a free consultation and ensure your migration is smooth, accurate, and aligned with your business goals. Let’s turn your data into a competitive advantage together.

Need Expert Help Navigating Your Adobe Analytics to CJA Migration?