
Cohort analysis

Cohort analysis is a behavioral analytics method that groups users who share a common characteristic — typically a time-bound event such as registration date, first deposit, or first bet — and tracks how each group’s behavior evolves over subsequent time intervals. 

Rather than averaging metrics across the entire user base, the method isolates each “cohort” on its own timeline, revealing patterns in retention, monetization, and engagement that aggregate reporting obscures. 

In iGaming, cohort analysis is the primary tool for answering questions like “are players acquired this month retaining better than last month’s?” or “which traffic source delivers the highest LTV at Day 90?” — questions that cannot be answered reliably with site-wide averages.

What is cohort analysis?

A cohort is a group of users who share a defining attribute within a specific time window. Cohort analysis tracks a chosen metric — such as retention rate, cumulative revenue, or deposit frequency — for each cohort across equal time intervals (days, weeks, or months) after the defining event. The output is typically a matrix (sometimes called a retention table or triangle): rows represent cohorts, columns represent time intervals since the cohort’s origin event, and cell values show the metric at each intersection.

The method belongs to the broader family of behavioral analytics and is distinct from simple segmentation. Segmentation groups users by a static attribute (e.g., country or device); cohort analysis adds a time dimension, aligning every user to a common starting point so that behavioral trajectories become directly comparable. This temporal alignment is what makes cohort analysis especially powerful for iGaming operators, where the player lifecycle — from registration through first deposit, peak activity, and eventual dormancy or churn — unfolds over time and varies significantly by acquisition channel, product vertical, and promotional context.

Two broad types of cohorts are used in practice. Acquisition cohorts (also called time-based cohorts) group users by when they first interacted with the platform — registration week, first-deposit month, or first-bet date. Behavioral cohorts group users by an action they took, regardless of when they joined — for example, players who completed KYC within 24 hours, or players who placed a first bet on live casino.

How does cohort analysis work?

The workflow for building and reading a cohort analysis follows a consistent sequence, whether executed in a spreadsheet, a BI tool, or a dedicated product-analytics platform such as Amplitude or Mixpanel.

Step 1 — Define the cohort event and granularity. Choose the event that anchors each cohort (e.g., first deposit date) and the time grain (day, week, or month). In iGaming, the most common cohort events are: registration date (tracks top-of-funnel quality), first deposit / FTD date (tracks monetized-player quality), and first bet date (tracks product engagement from the moment of real-money activity).

Step 2 — Select the metric and time horizon. Decide what to measure across intervals. Retention rate (the share of the cohort still active at Day N) is the default, but revenue-oriented teams also track cumulative ARPU, cumulative ARPPU, deposit count, or NGR per player. Common interval checkpoints in iGaming are D1 (day after origin), D7, D14, D30, D60, and D90.

Step 3 — Build the cohort matrix. Assign each user to a cohort row based on the cohort event date. For each row, calculate the metric at each subsequent time column. A retention matrix cell at row “Week 12” and column “D30” would show the percentage of players who deposited in Week 12 and were still active 30 days after their first deposit.
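Steps 1–3 can be sketched in a few lines of pandas. This is a minimal illustration with invented data, using weekly cohorts and D7 / D30 within-window retention; the helper name retained_within and the sample events are hypothetical, not a reference implementation.

```python
import pandas as pd

# Hypothetical event log: one row per player activity event (illustrative data).
events = pd.DataFrame({
    "player_id": [1, 1, 1, 2, 3, 3],
    "event_date": pd.to_datetime([
        "2024-01-02", "2024-01-05", "2024-02-10",   # player 1
        "2024-01-03",                               # player 2 (never returns)
        "2024-01-10", "2024-01-20",                 # player 3
    ]),
})

# Step 1: anchor each player to a cohort by the week of their first event.
first_seen = events.groupby("player_id")["event_date"].min().rename("cohort_date")
events = events.join(first_seen, on="player_id")
events["cohort_week"] = events["cohort_date"].dt.to_period("W")

# Steps 2-3: days elapsed since the origin event, then the metric per checkpoint.
events["days_since"] = (events["event_date"] - events["cohort_date"]).dt.days

def retained_within(group: pd.DataFrame, horizon: int) -> float:
    """Share of the cohort that returned within `horizon` days of the origin event."""
    returned = group.loc[
        (group["days_since"] > 0) & (group["days_since"] <= horizon), "player_id"
    ].nunique()
    return returned / group["player_id"].nunique()

# One row per cohort, one column per checkpoint: a minimal retention matrix.
matrix = events.groupby("cohort_week").apply(
    lambda g: pd.Series({f"D{h}": retained_within(g, h) for h in (7, 30)})
)
print(matrix)
```

In this toy data the first weekly cohort (players 1 and 2) retains one of two players at both D7 and D30, while the second cohort (player 3) misses the D7 window but returns by D30 — exactly the kind of row-by-row pattern the matrix is built to expose.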

Step 4 — Visualize and interpret. Heatmap shading (darker = higher retention or revenue) makes patterns visible at a glance. Analysts look for: the shape of the retention curve (steep early drop followed by a flattening tail is typical); cohort-over-cohort trends (are newer cohorts retaining better than older ones?); and anomalies (a sudden drop in D7 retention for a specific cohort may signal a broken onboarding flow or a low-quality traffic spike).

Step 5 — Act and iterate. Insights feed product, CRM, and acquisition decisions: adjust bonus structures for cohorts with steep early churn, reallocate spend away from traffic sources whose cohorts flatten at a low revenue ceiling, or trigger reactivation campaigns for cohorts approaching dormancy thresholds.

Examples of cohort analysis

Retention by FTD week. An operator groups all players by the week of their first deposit and tracks D7 / D30 / D90 retention. Weeks where the operator ran a high-wagering-requirement welcome bonus show 15% lower D30 retention than weeks with a lower-barrier offer — evidence that the aggressive bonus attracted short-lived bonus hunters rather than sustainable depositors.

Cumulative ARPU by acquisition source. Players are cohorted by the affiliate or paid-media channel that drove their registration. At D30, Affiliate A’s cohort shows an ARPU of $85 while Affiliate B’s cohort shows $42 — even though Affiliate B delivered twice the volume. The analysis reveals that optimizing for CPA alone would have shifted budget toward a source with inferior lifetime monetization, making an LTV-based evaluation essential.

Reactivation lift measurement. An operator sends a targeted reactivation campaign to players who have been dormant for 30+ days. A behavioral cohort of “reactivated players” is created, and their post-reactivation D7 and D30 activity is tracked. Comparing this cohort’s retention curve against the original acquisition cohort’s curve at the same lifecycle stage shows whether the campaign produced genuine re-engagement or a one-session bounce.

Benefits of cohort analysis

Eliminates the “average of averages” problem. Site-wide metrics like overall retention rate blend new and mature players, masking whether recent cohorts are performing better or worse. Cohort analysis isolates each wave of players on its own timeline, giving an honest read on trajectory.

Connects acquisition quality to downstream value. By linking cohort origin (channel, campaign, geo) to long-term metrics like LTV, ARPU, and churn rate, operators can shift budget toward sources that deliver durable value — not just cheap volume.

Enables early-warning signals. A decline in D7 retention across successive cohorts is a leading indicator of product or onboarding problems — visible weeks before it shows up in aggregate monthly metrics.

Measures the true impact of changes. Product releases, bonus-policy changes, or CRM campaigns can be evaluated by comparing cohorts that were exposed to the change against prior cohorts that were not, controlling for lifecycle stage.

Informs LTV forecasting. Cohort retention and revenue curves are the empirical foundation of LTV models. Plotting cumulative revenue per cohort over time reveals the “shape” of the LTV curve — how quickly a cohort pays back its acquisition cost and where the revenue ceiling sits.
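One common (simplified) way to turn an observed cohort revenue curve into a forward projection: cumulative revenue per player often flattens roughly logarithmically, so a log-linear fit can extrapolate to a later checkpoint. The figures below are invented and the log model is an assumption, not the only valid curve shape.

```python
import numpy as np

# Hypothetical observed cumulative revenue per player for one cohort (USD).
days = np.array([1, 7, 14, 30, 60, 90])
cum_revenue = np.array([4.0, 11.0, 15.0, 21.0, 27.0, 31.0])

# Assume revenue ~ a * ln(day) + b, fit by least squares, extrapolate to Day 180.
a, b = np.polyfit(np.log(days), cum_revenue, 1)
ltv_d180 = a * np.log(180) + b
print(f"Projected D180 value per player: ${ltv_d180:.2f}")
```

Comparing the projected value against the cohort’s blended acquisition cost then shows where (and whether) the cohort pays back its spend.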

Challenges

Choosing the wrong cohort event. Using registration date instead of first-deposit date can inflate cohort size with never-deposited registrants, diluting retention and ARPU signals. The cohort anchor should match the business question being asked.

Ignoring cohort size. A cohort of 50 players with 80% D7 retention is not statistically comparable to a cohort of 5,000 with 60%. Small cohorts produce volatile metrics that can mislead if taken at face value.

Survivorship bias in revenue cohorts. Tracking cumulative ARPU only among players who are still active overstates monetization potential. Always distinguish between ARPU (revenue divided by all cohort members, including churned) and ARPPU (revenue divided by paying members only).
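The ARPU/ARPPU distinction is easy to make concrete with a toy calculation (all figures illustrative):

```python
# Hypothetical cohort: 1,000 players acquired in the same week,
# of whom 250 made at least one deposit, generating $20,000 total revenue.
cohort_size = 1_000
paying_players = 250
revenue = 20_000.0

arpu = revenue / cohort_size       # revenue over ALL cohort members, churned included
arppu = revenue / paying_players   # revenue over paying members only

print(arpu, arppu)  # ARPU understates per-payer value; ARPPU ignores non-payers
```

Here ARPU is $20 while ARPPU is $80 — reporting only the latter would overstate what an average acquired player is actually worth.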

Inconsistent interval definitions. “D7 retention” can mean “returned exactly on Day 7” (bounded/on-day retention) or “returned at any point within Days 1–7” (unbounded/within-window retention). Mixing definitions across reports creates false comparisons. Define the method once and apply it consistently.
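The two definitions can diverge sharply on the same data. A minimal sketch with invented activity records, showing both calculations side by side:

```python
import pandas as pd

# Hypothetical activity log: days since each player's origin event.
activity = pd.DataFrame({
    "player_id":  [1, 1, 2, 3, 3],
    "days_since": [3, 7, 7, 2, 5],
})
cohort_size = activity["player_id"].nunique()  # 3 players

# Bounded / on-day D7: returned exactly on Day 7.
on_day = activity.loc[activity["days_since"] == 7, "player_id"].nunique()
bounded_d7 = on_day / cohort_size

# Unbounded / within-window D7: returned at any point in Days 1-7.
in_window = activity.loc[activity["days_since"].between(1, 7), "player_id"].nunique()
unbounded_d7 = in_window / cohort_size

print(bounded_d7, unbounded_d7)
```

On this data the bounded definition yields 2/3 while the within-window definition yields 3/3 — a report mixing the two would compare incompatible numbers.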

Over-aggregation of time grain. Monthly cohorts smooth out weekly variation that may contain actionable signals — such as a specific promotional week that drove an unusually high-quality cohort. Start with the finest grain available and aggregate upward only when the pattern is stable.

Attribution complexity. In iGaming, a single player may interact with multiple campaigns before depositing. The cohort’s acquisition label depends on the attribution model (first-touch, last-touch, or multi-touch), and different models can assign the same player to different cohorts.

Tips / Best practices

Anchor cohorts to the highest-signal event. For monetization questions, use first-deposit date; for product-engagement questions, use first-bet date; for top-of-funnel quality, use registration date. Build parallel cohort views if different teams need different anchors.

Standardize D1 / D7 / D30 / D60 / D90 checkpoints. These intervals are widely used across iGaming and allow internal benchmarking over time as well as directional comparison with industry norms.

Layer behavioral dimensions on top of time-based cohorts. Slice each acquisition cohort by product vertical (sportsbook vs. casino vs. poker), deposit method, device type, or geo to surface sub-cohort patterns invisible in the top-level view.

Integrate cohort data into CRM triggers. Use cohort analysis not just for retrospective reporting but as a real-time input: when a player falls below their cohort’s expected D14 activity curve, auto-trigger a retention touchpoint before they lapse.
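A trigger of this kind can be as simple as comparing a player’s activity against a per-cohort benchmark. The function name, benchmark figures, and 60% threshold below are all hypothetical placeholders for whatever the operator’s CRM system actually uses:

```python
# Hypothetical per-cohort benchmark: expected sessions by Day 14.
expected_sessions_by_d14 = {"2024-W01": 5.2, "2024-W02": 4.8}

def should_trigger_retention_touch(cohort: str, sessions_by_d14: float,
                                   threshold: float = 0.6) -> bool:
    """Trigger when a player's D14 activity falls below 60% of the cohort benchmark."""
    return sessions_by_d14 < threshold * expected_sessions_by_d14[cohort]

# A player from the 2024-W01 cohort with only 2 sessions by D14 gets a touchpoint;
# one tracking at 5 sessions does not.
print(should_trigger_retention_touch("2024-W01", 2.0))
print(should_trigger_retention_touch("2024-W01", 5.0))
```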

Report ARPU and ARPPU side by side. ARPU shows the average value of the entire cohort including non-payers; ARPPU shows the value of converted players. The gap between them indicates conversion efficiency — a narrow gap means most cohort members are paying, a wide gap means a small paying minority is subsidizing the average.

Revisit cohorts longitudinally. A cohort that looks weak at D30 may reactivate at D60 after a dormancy-targeted campaign. Extend the observation window long enough to capture reactivation effects before writing off a traffic source or campaign.

Wrap-up

Cohort analysis is not a single metric but a methodological lens — one that transforms retention, ARPU, LTV, and churn from static snapshots into dynamic, time-aligned trajectories. Its power lies in making lifecycle patterns visible: which acquisition waves are strengthening, where onboarding is leaking value, and how product or CRM changes propagate through player cohorts over time. 

The method is most effective when it is embedded into weekly operational cadences rather than treated as an ad-hoc exercise — when every acquisition, CRM, and product decision is backed by a cohort curve, not a blended average. 

FAQ

What is the difference between cohort analysis and segmentation? Segmentation groups users by a static attribute (geo, device, VIP tier) at a single point in time. Cohort analysis adds a time dimension, aligning users to a shared starting event and tracking how their behavior evolves over subsequent intervals. Cohort analysis reveals lifecycle dynamics that segmentation alone cannot.

Which cohort event should an iGaming operator use? It depends on the question. Registration date measures funnel-top quality; first-deposit date measures monetized-player quality; first-bet date measures product engagement. Many teams maintain parallel cohort views anchored to each event.

What tools support cohort analysis? General-purpose product analytics platforms — Amplitude, Mixpanel, Google Analytics 4 — offer built-in cohort and retention reports. iGaming-specific platforms such as affiliate management systems and BI suites connected to the operator’s data warehouse can build custom cohort matrices from transactional event data.

How does cohort analysis relate to LTV? Cohort retention and revenue curves are the empirical inputs to LTV models. By plotting cumulative revenue per cohort at each time interval, analysts can fit curves that project future value — converting observed cohort behavior into a per-player LTV estimate.

How often should cohort reports be reviewed? Weekly review of the most recent 4–8 acquisition cohorts (at D1, D7, D14 milestones) catches emerging problems quickly. Monthly review of older cohorts at D30, D60, D90 checkpoints informs longer-term strategic and budgeting decisions.