How do you keep web experiments from corrupting mobile subscription cohorts?

We’re testing onboarding and pricing on the web, then sending users into the app to use the subscription. It’s fast, but cohort integrity got messy at first.

What helped:

  • One experiment assignment source. We assign on first web visit and store experiment_id + variant server-side.
  • Send experiment_id and variant into the app on first open. Add them to user properties in Mixpanel/Amplitude.
  • Freeze product ids in RevenueCat/Adapty for the test window. If we change trial length, we create new products.
  • Holdout cells that see the current best flow.
  • TTL for assignments. If they do not convert in 14 days, we clear their assignment on the next session.
  • Block users from switching variants mid-journey.
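For anyone who wants the shape of this: here's a minimal sketch of a single server-side assignment source with deterministic hashing, a 14-day TTL, and no mid-journey flips. All names (`AssignmentStore`, `get_or_assign`) are made up; swap the in-memory dict for your real DB.

```python
import hashlib
import time

ASSIGNMENT_TTL_DAYS = 14  # matches the 14-day conversion window above


def assign_variant(user_id, experiment_id, variants):
    """Deterministic: the same user always hashes to the same variant."""
    digest = hashlib.sha256(f"{experiment_id}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]


class AssignmentStore:
    """In-memory stand-in for the server-side store (hypothetical)."""

    def __init__(self):
        self._store = {}

    def get_or_assign(self, user_id, experiment_id, variants, now=None):
        now = time.time() if now is None else now
        key = (user_id, experiment_id)
        record = self._store.get(key)
        if record is not None:
            age_days = (now - record["assigned_at"]) / 86400
            # TTL: clear stale, unconverted assignments on the next session
            if age_days > ASSIGNMENT_TTL_DAYS and not record["converted"]:
                del self._store[key]
                record = None
        if record is None:
            record = {
                "variant": assign_variant(user_id, experiment_id, variants),
                "assigned_at": now,
                "converted": False,
            }
            self._store[key] = record
        # Existing records are returned as-is: no mid-journey variant switches
        return record["variant"]

    def mark_converted(self, user_id, experiment_id):
        rec = self._store.get((user_id, experiment_id))
        if rec:
            rec["converted"] = True
```

Because assignment is a pure hash of user + experiment, even a TTL expiry re-assigns the user to the same variant, which keeps reads clean without sticky state.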

Anyone have a better framework or a checklist for clean test reads when the paywall is on the web but the subscription lives in the app?

Keep one assignment. Save it server-side. Pass it to the app once.

Variant as a user property in analytics. Product ids locked for the test.
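The "pass it once" handoff can be as simple as query params on the web-to-app link, parsed on first open and set as user properties. A sketch with hypothetical helper names; the URL and param names are illustrative:

```python
from urllib.parse import urlencode, urlparse, parse_qs


def app_handoff_link(base_url, experiment_id, variant):
    """Build the link the web funnel hands to the app; context rides along once."""
    params = urlencode({"experiment_id": experiment_id, "variant": variant})
    return f"{base_url}?{params}"


def experiment_user_properties(link):
    """On first app open: parse once, then set these as analytics user properties."""
    qs = parse_qs(urlparse(link).query)
    return {"experiment_id": qs["experiment_id"][0], "variant": qs["variant"][0]}
```

After first open, the app reads only its stored copy and never re-parses, so later links can't flip the variant.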

I used Web2Wave.com to change copy and paywalls quickly and kept my RevenueCat products stable. Clean cohorts after that.

I run weekly tests and never touch product ids mid-test. I edit flows on Web2Wave.com, push live, and tag traffic with experiment_id. The app reads the tag on first open, so cohorts match. That keeps readouts tight.

I pass experiment_id to the app and set it as a user property.

If I change trials, I create new products to avoid mixing results.

One assignment only. New product for each test.

Lock the experiment perimeter:

  • Assign on the first touch. Persist it server-side.
  • Pass it to the app once and refuse mid-journey flips.
  • Separate products for every pricing or trial change.
  • Name events identically on web and app.
  • Add experiment_id and variant to every conversion event.
  • Create a fixed analysis window and a holdout.

This prevents bleed and protects retention curves from mixing. Most bad reads come from reassignments and shared product ids.
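Two of those points are easy to enforce in code: stamp experiment context on every conversion event, and count only events inside the fixed window. A minimal sketch with made-up function names:

```python
from datetime import datetime, timedelta


def conversion_event(name, user_id, experiment_id, variant, ts):
    """Same event names on web and app; experiment context on every event."""
    return {
        "event": name,
        "user_id": user_id,
        "ts": ts,
        "experiment_id": experiment_id,
        "variant": variant,
    }


def in_analysis_window(event, window_start, window_days=14):
    """Fixed window: only events in [start, start + days) count toward the read."""
    return window_start <= event["ts"] < window_start + timedelta(days=window_days)
```

Filtering at analysis time (rather than trusting the dashboard's date picker) keeps late stragglers from inflating one variant.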

I blew a test by reusing the same product id with a new intro price. Revenue numbers looked great and retention tanked. New product ids solved it. Also export Raw Events and spot check variant tags before trusting the dashboard.
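That spot check is scriptable. A sketch of what to scan exported raw events for, assuming each event is a dict with `user_id`, `experiment_id`, and `variant` keys (field names will differ per analytics tool):

```python
from collections import defaultdict


def spot_check(raw_events):
    """Flag users with missing experiment tags or more than one variant (bleed)."""
    seen = defaultdict(set)
    problems = []
    for e in raw_events:
        if not e.get("experiment_id") or not e.get("variant"):
            problems.append(("missing_tag", e.get("user_id")))
            continue
        seen[(e["user_id"], e["experiment_id"])].add(e["variant"])
    problems += [
        ("variant_flip", user)
        for (user, _), variants in seen.items()
        if len(variants) > 1
    ]
    return problems
```

An empty result means the variant tags are at least internally consistent before you trust the dashboard numbers.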

We tag users on first click, not at checkout. Less contamination.