+
+
+
+
+
+ {/* Empty script tag as chrome bug fix, see https://stackoverflow.com/a/42969608/943337 */}
+
+
+
diff --git a/pages/guides/guides-by-use-case/benchmarks.mdx b/pages/guides/guides-by-use-case/benchmarks.mdx
new file mode 100644
index 0000000000..e0b1ccee3e
--- /dev/null
+++ b/pages/guides/guides-by-use-case/benchmarks.mdx
@@ -0,0 +1,27 @@
+import { Cards } from 'nextra/components'
+
+# Benchmarks & Analytics Maturity
+Understand performance benchmarks and assess your organization’s analytics maturity. Whether you’re setting goals, comparing performance against peers, or looking for ways to advance your analytics practice, these resources will help guide your next steps.
+
+## Benchmarks
+See product performance baselines across conversion, retention, and engagement. Use these as north stars to calibrate goals and identify outliers.
+
+
+
+
+
+## Benchmarks by Industry
+Drill into industry-specific norms to set context-rich targets.
+
+
+
+
+
+
+
+## Analytics Maturity
+Assess where your org sits on the analytics maturity curve and chart the next moves to unlock impact.
+
+
+
+
diff --git a/pages/guides/guides-by-use-case/continuous-innovation.mdx b/pages/guides/guides-by-use-case/continuous-innovation.mdx
new file mode 100644
index 0000000000..5345d82185
--- /dev/null
+++ b/pages/guides/guides-by-use-case/continuous-innovation.mdx
@@ -0,0 +1,240 @@
+import { Callout, Steps } from 'nextra/components'
+
+# Mixpanel and the Continuous Innovation Loop
+The **Observe → Analyze → Decide → Act (OADA)** loop is the framework behind how great teams use Mixpanel to continuously innovate.
+
+
+
+This loop represents the cycle teams follow to turn data into action by first observing what users do, then analyzing why it happens, deciding what to do next, and finally acting on those insights to drive better outcomes.
+
+Each phase represents a key moment in the product decision cycle—and Mixpanel provides the tools to complete that loop faster with every iteration.
+
+
+**Why it matters:** Teams that move through the loop quickly learn what’s working, align faster, and deliver more impactful product improvements.
+
+---
+
+## Why Continuous Innovation Matters
+
+Building great products isn’t just about speed—it’s about learning continuously.
+
+Markets shift, user expectations evolve, and what worked last quarter might not work tomorrow. The teams that win are the ones that turn data into a *habit of improvement* by observing what users do, analyzing why it happens, deciding what to do next, and acting with confidence.
+
+Teams that move through the OADA loop quickly:
+- Learn what’s working and what’s not
+- Align on priorities faster
+- Deliver more impactful product improvements with each iteration
+
+
+**Pro tip**: Make closing the loop part of every sprint—observe, analyze, decide, act.
+
+
+Digital innovation isn’t a one-time project. It’s a continuous cycle of learning and improvement.
+
+---
+
+## How Different Industries Use the Continuous Loop
+
+Mixpanel powers the OADA loop across every industry—helping teams turn data into confident action.
+
+Whether you’re optimizing user onboarding, increasing checkout conversions, or improving content engagement, the same continuous loop applies.
+
+Below are examples of how teams in different industries use Mixpanel to measure, learn, and grow faster.
+
+
+💼 SaaS: Improving Onboarding and Activation
+
+ A SaaS team observes where new users drop off during onboarding, analyzes key behaviors to uncover friction points, decides which improvements will reduce time-to-value, and acts by testing guided experiences that drive activation and retention.
+
+---
+
+**Observe**
+The team monitors early-user activity using **Session Replay**, **Heatmaps**, and **Autocapture** to see how new users interact with the onboarding flow: *Account Created → Tutorial Completed → Key Action Taken*. They set up **Alerts** to track sudden drops in completion.
+
+---
+
+**Analyze**
+Using **Funnels**, **Flows**, and **Cohorts**, the team identifies where users are stalling and compares completion across personas. Behavioral trends show that users who skip the advanced configuration step activate faster.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels & Flows** | 45% drop-off between “Account Created” and “Tutorial Completed” | Users struggle with early onboarding complexity. |
+| **Cohorts** | Different activation rates by persona | Simpler onboarding correlates with higher early success. |
+| **Session Replay** | Confusion at advanced setup screens | Users hesitate when asked to complete optional steps too soon. |
+
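Conceptually, a funnel like the one above is an ordered sequence match over each user’s event stream. Here is a toy sketch of that idea — an illustration only, not Mixpanel’s actual Funnels engine, and `funnel_counts` is a hypothetical helper:

```python
from typing import Dict, List

def funnel_counts(user_events: Dict[str, List[str]], steps: List[str]) -> List[int]:
    """Count how many users reach each step of an ordered funnel.

    A user is counted at step i only after their (time-ordered) event
    stream has already matched steps 0..i-1.
    """
    counts = [0] * len(steps)
    for events in user_events.values():
        step = 0
        for event in events:
            if step < len(steps) and event == steps[step]:
                counts[step] += 1
                step += 1
    return counts

users = {
    "u1": ["Account Created", "Tutorial Completed", "Key Action Taken"],
    "u2": ["Account Created", "Tutorial Completed"],
    "u3": ["Account Created"],
}
print(funnel_counts(users, ["Account Created", "Tutorial Completed", "Key Action Taken"]))
# [3, 2, 1]
```

Dividing adjacent counts gives the step-to-step conversion rate — the drop-off percentages shown in the table.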
+---
+
+**Decide**
+The team uses **Metric Trees** to understand which metrics—like “First Value Reached”—drive long-term retention. They decide to move optional setup later in the journey and emphasize the first “aha” moment sooner.
+
+---
+
+**Act**
+They test this change through **Experiments**, rolling out a new guided flow to a subset of users. When the experiment shows faster activation and higher retention, they launch it to all users using **Feature Flags**.
+
+---
+
+**✨ Result:** Activation improves 15%, and new users reach value faster with fewer drop-offs.
+
+
+
+
+
+
+🛍️ eCommerce: Increasing Checkout Conversion
+
+ An eCommerce team observes shopper behavior throughout checkout, analyzes patterns to uncover mobile friction, decides which optimizations will improve conversion, and acts by testing streamlined checkout designs.
+
+---
+
+**Observe**
+Using **Session Replay**, **Heatmaps**, and **Autocapture**, the team tracks the path from *Product Viewed → Added to Cart → Checkout Started → Purchase Completed* to see where users abandon the flow.
+
+---
+
+**Analyze**
+Through **Funnels**, **Cohorts**, and **Retention**, the team identifies a 40% drop-off at payment on mobile devices.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels** | 40% mobile drop-off at payment | Checkout fields are too dense for mobile screens. |
+| **Session Replay & Heatmaps** | Users zoom and misclick on payment fields | UI not optimized for mobile input. |
+
+---
+
+**Decide**
+The team leverages **Metric Trees** to link checkout completion to revenue impact. They decide to simplify payment forms and surface the most-used payment options first.
+
+---
+
+**Act**
+They deploy a **Feature Flag** to release the redesigned checkout to 50% of traffic and use **Experiments** to confirm conversion improvements before rolling it out universally.
+
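A 50% split like this is typically implemented by hashing each user into a stable bucket so the same user always sees the same variant. A generic sketch of that technique — an illustration of the concept, not Mixpanel’s Feature Flags implementation:

```python
import hashlib

def in_rollout(user_id: str, flag: str, percent: float) -> bool:
    """Deterministically bucket a user into a percentage rollout.

    Hashing user_id together with the flag name gives each user a
    stable, flag-specific bucket in [0, 1).
    """
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0x100000000  # map first 32 bits to [0, 1)
    return bucket < percent / 100.0

# A user's assignment never changes between requests.
assert in_rollout("user-42", "new-checkout", 50) == in_rollout("user-42", "new-checkout", 50)
```

Because assignment is deterministic, ramping from 50% to 100% only adds users to the treatment group — no one already in the new experience is switched back.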
+---
+
+**✨ Result:** Checkout completion rises 20%, and mobile shoppers complete purchases faster with fewer errors.
+
+
+
+
+
+
+🎬 Media & Entertainment: Boosting Viewer Retention
+
+ A streaming platform observes viewer engagement across content, analyzes which experiences retain audiences, decides how to personalize recommendations, and acts by iterating on what drives continued watching.
+
+---
+
+**Observe**
+The team uses **Session Replay**, **Autocapture**, and **Alerts** to track *Episode Started → Episode Completed → Next Episode Started* events, identifying drop-offs by series and genre.
+
+---
+
+**Analyze**
+They turn to **Retention** and **Cohorts** to learn which viewers come back. **Funnels & Flows** show most viewers stop after Episode 2, and qualitative data confirms weak recommendations at that point.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Retention & Cohorts** | Only 30% return for Episode 3 | Low content continuity beyond early episodes. |
+| **Funnels & Flows** | Drop-off after Episode 2 | Weak recommendations between episodes. |
+
+---
+
+**Decide**
+Using **Metric Trees**, the team connects “Episode Completion Rate” to “Viewer Retention.” They decide to insert a personalized “Up Next” prompt and ratings flow to strengthen recommendations.
+
+---
+
+**Act**
+They run an **Experiment** on the “Up Next” prompt. When results show that viewers who see the new recommendations have longer sessions and higher continuation rates, they expand the rollout to all viewers using **Feature Flags**.
+
+
+---
+
+**✨ Result:** Viewer retention improves 25%, with stronger engagement across new series launches.
+
+
+
+
+
+
+💰 Fintech: Increasing Feature Adoption and Retention
+
+ A fintech product team observes user engagement with budgeting tools, analyzes setup friction, decides how to improve adoption, and acts by optimizing flows that drive retention.
+
+---
+
+**Observe**
+They track customer actions with **Session Replay** and **Autocapture**, focusing on *Account Linked → Budget Created → Spending Reviewed → Budget Adjusted*. **Alerts** notify them of sudden drops in budget creation.
+
+---
+
+**Analyze**
+Using **Funnels**, **Cohorts**, and **Retention**, they find that users linking smaller financial institutions often fail to complete setup.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels & Cohorts** | Users stop after linking a bank account | Authentication errors block progress. |
+| **Retention Reports** | Users who finish setup return 2× more often | Early success predicts long-term retention. |
+
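The 2× figure above is simply a ratio of return rates between two cohorts. A toy sketch with invented numbers (the cohort data here is made up for illustration; `return_rate` is a hypothetical helper):

```python
from typing import Dict

def return_rate(cohort: Dict[str, bool]) -> float:
    """Share of users in a cohort who returned (True = came back)."""
    return sum(cohort.values()) / len(cohort) if cohort else 0.0

# Invented cohorts, split on whether budget setup was completed.
finished_setup = {"u1": True, "u2": True, "u3": False, "u4": True}
skipped_setup = {"u5": False, "u6": True, "u7": False, "u8": False}

lift = return_rate(finished_setup) / return_rate(skipped_setup)
print(f"{lift:.1f}x")  # 3.0x
```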
+---
+
+**Decide**
+The team uses **Metric Trees** to link budgeting feature adoption to retention KPIs. They decide to improve error handling and prompt users to set alerts immediately after creating a budget.
+
+---
+
+**Act**
+They run an **Experiment** on a new “Set Alert” flow. After positive results, they expand rollout to all customers using **Feature Flags**.
+
+---
+
+**✨ Result:** Budget feature adoption increases 30%, and retention rises 15% as users set up alerts sooner.
+
+
+
+
+
+
+🎮 Gaming: Driving Player Engagement and In-App Purchases
+
+ A gaming studio observes player behavior across levels, analyzes progression data to uncover friction, decides what changes will improve engagement, and acts by launching prompts that drive completion and purchases.
+
+---
+
+**Observe**
+The team uses **Session Replay**, **Heatmaps**, and **Autocapture** to track *Level Started → Level Completed → In-App Purchase Made*, identifying the points where players churn.
+
+---
+
+**Analyze**
+They turn to **Funnels** and **Cohorts**, alongside player feedback, to understand why drop-offs occur. Players who use “Power-Ups” progress faster, and replays show many ignore in-game hint icons.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels & Cohorts** | High player drop-off after Level 3 | Repeated failures at one level lead to early churn. |
+| **Session Replay & Heatmaps** | Players miss hint icons | Poor visual placement of key gameplay aids. |
+
+---
+
+**Decide**
+Using **Metric Trees**, the team links early-level completions to retention and in-app purchases. They decide to introduce Power-Up tutorials earlier to help players progress.
+
+---
+
+**Act**
+They run an **Experiment** on a new “Use Power-Up” tutorial and measure results. When they see improved completion and monetization metrics, they expand it to all players using **Feature Flags**.
+
+---
+
+**✨ Result:** Level completion improves 35%, and in-app purchases increase 20%, driving sustained engagement.
+
+
+
+No matter your industry, the OADA loop helps you turn insights into action—and Mixpanel gives you the tools to complete that loop faster with every iteration.
+
+---
+
+## Learn More
+
+Want to understand the strategy behind continuous innovation? Check out our blog on [How Digital Continuous Innovation Drives Sustainable Enterprise Growth](https://mixpanel.com/blog/digital-continuous-innovation/) to see how leading enterprises use the OADA framework to connect data, decisions, and action—and build a culture of sustainable growth.
diff --git a/pages/guides/guides-by-use-case/mixpanel-introduction.mdx b/pages/guides/guides-by-use-case/mixpanel-introduction.mdx
new file mode 100644
index 0000000000..2d89248b66
--- /dev/null
+++ b/pages/guides/guides-by-use-case/mixpanel-introduction.mdx
@@ -0,0 +1,118 @@
+# Mixpanel Introduction
+
+Mixpanel is a **digital analytics platform** that helps teams continuously improve their products by turning data into action. At its core, Mixpanel supports a **continuous innovation loop**—helping you observe what users do, analyze why it happens, decide what to do next, and act on those insights.
+
+> **Why it matters:** Teams use Mixpanel to learn faster, align decisions across functions, and measure the impact of every product change.
+
+---
+
+## See Mixpanel in Action
+
+
+
+
+ {/* Left: video with a min width so it wraps when narrow */}
+
+
+
+
+
+
+ {/* Right: text with a flexible basis */}
+
+
+
+
+---
+
+## Mixpanel Features That Power Continuous Innovation
+
+
+
+
+Mixpanel is built around a simple but powerful framework for continuous improvement: **Observe → Analyze → Decide → Act (OADA)**.
+
+Each stage in the OADA framework connects directly to Mixpanel’s tools, helping you move from data to observation to action—all in one platform.
+
+
+
+
+| 👀 **Observe** | 📊 **Analyze** | 💡 **Decide** | 🚀 **Act** |
+|---|---|---|---|
+| See what’s happening in your product with [Session Replay](/docs/session-replay), [Heatmaps](/docs/session-replay/heatmaps), [Autocapture](/docs/tracking-methods/autocapture), and [Alerts](/docs/features/alerts). | Explore [Insights](/docs/reports/insights), [Funnels](/docs/reports/funnels), [Flows](/docs/reports/flows), [Retention](/docs/reports/retention), and [Cohorts](/docs/users/cohorts) to find what moves your metrics. | Align on what to change next with [Metric Trees](/docs/metric_tree), [Boards](/docs/boards), [Annotations](/docs/features/annotations), and shared insights. | Measure impact with [Experiments](/docs/experiments) and ship improvements with [Feature Flags](/docs/featureflags). |
+
+All of this is powered by Mixpanel’s modern data foundation—bringing together AI-assisted analysis through features like [MCP Server](/docs/features/mcp), robust [data governance](/docs/data-governance) for accuracy and trust, and built-in collaboration tools that help teams move from insight to action faster.
+
+**Learn more:** [Go deeper on the OADA Loop →](/guides/oada-loop)
+
+---
+
+## Mixpanel Data Model
+
+Everything in Mixpanel starts with **events**—the building blocks of your data model.
+
+An event represents something a **user** does (like *Signed Up*, *Viewed a Product*, or *Completed Purchase*). Each event can include **properties** that add context, such as the user’s plan type, location, or device. Optionally, you can analyze data at the group level using [Group Analytics](/docs/data-structure/group-analytics).
+
+
+
+Together, these events and properties form a flexible data model that mirrors how people actually use your product. Once instrumented, you can analyze this data instantly—without writing SQL or waiting on an analyst.
+
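In code, that model is small: an event name plus a dictionary of properties. A minimal sketch — `make_event` is a hypothetical helper, and the field names mirror Mixpanel’s ingestion format; use the official SDKs for real tracking:

```python
import json
import time

def make_event(event_name: str, distinct_id: str, **properties) -> dict:
    """Build an event: a name, plus a property bag that always
    records who did it (distinct_id) and when (a Unix timestamp)."""
    return {
        "event": event_name,
        "properties": {
            "distinct_id": distinct_id,
            "time": int(time.time()),
            **properties,
        },
    }

event = make_event("Completed Purchase", "user-123", plan="Premium", device="iOS")
print(json.dumps(event, indent=2))
```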
+**Learn more:** [Dive deeper into how Mixpanel structures data →](/docs/data-structure/concepts)
+
+---
+
+## Keep Learning
+
+Keep building your Mixpanel expertise with these resources designed to help you learn, connect, and put insights into action.
+
+| Resource | Purpose | Link |
+|-----------|--------------|------|
+| **Community** | Connect with other Mixpanel users, share ideas, and learn how peers are tackling similar challenges. | [Open →](https://community.mixpanel.com/) |
+| **Developer Docs** | Build and extend Mixpanel with SDKs, APIs, and advanced implementation guides. | [Open →](https://developer.mixpanel.com/reference/overview) |
+| **Docs** | Explore product capabilities, setup guides, and detailed feature references. | [Open →](https://docs.mixpanel.com/) |
+| **Events** | Join live sessions and webinars to explore new features, use cases, and expert-led best practices. | [Open →](https://mixpanel.com/events) |
+| **Guides** | Apply Mixpanel best practices to real-world workflows and use cases. | 📍 You are here |
+| **Mixpanel University** | Follow guided learning paths to validate your skills and earn certifications. | [Open →](https://mixpanel.com/university/) |
+
+Wherever you are in your Mixpanel journey, these resources will help you keep learning, stay connected, and keep improving.
diff --git a/pages/guides/guides-by-use-case/mixpanel-lp-old.mdx b/pages/guides/guides-by-use-case/mixpanel-lp-old.mdx
new file mode 100644
index 0000000000..e42938971d
--- /dev/null
+++ b/pages/guides/guides-by-use-case/mixpanel-lp-old.mdx
@@ -0,0 +1,406 @@
+# Inside Mixpanel: The OADA Loop
+Mixpanel helps product teams **see what’s happening**, **understand why**, **decide what to do next**, and **measure impact**—fast. This continuous cycle is the **OADA loop**: Observe → Analyze → Decide → Act.
+
+---
+
+## See Mixpanel in Action
+
+
+
+
+ {/* Left: video with a min width so it wraps when narrow */}
+
+
+
+
+
+
+ {/* Right: text with a flexible basis */}
+
+
+
+
+---
+
+## Close the Loop with Mixpanel
+
+See how each step in the loop connects to Mixpanel features and Guides that help you put it into action.
+
+| 👀 **Observe** | 📊 **Analyze** | 💡 **Decide** | 🚀 **Act** |
+|---|---|---|---|
+| See what’s happening in your product by tracking the events that matter.<br />**[Build a tracking strategy →](https://docs.mixpanel.com/guides/guides-by-use-case/build-a-tracking-strategy)** | Explore funnels, retention, and drivers to find what moves your metrics. | Align on what to change next with dashboards, annotations, and shared insights. | Ship improvements and measure impact with experiments and launch tracking.<br />**[Drive product innovation →](https://docs.mixpanel.com/guides/guides-by-use-case/driving-product-innovation)** |
+
+---
+
+## How Different Industries Use the Continuous Loop
+
+Mixpanel powers the OADA loop across every industry—helping teams turn data into confident action.
+
+Whether you’re optimizing user onboarding, increasing checkout conversions, or improving content engagement, the same continuous loop applies.
+
+Below are examples of how teams in different industries use Mixpanel to measure, learn, and grow faster.
+
+
+
+💼 SaaS: Improving Onboarding and Activation
+
+ A SaaS team observes where new users drop off during onboarding, analyzes behavior to uncover friction points, decides which steps to simplify, and acts by testing improvements that drive activation and conversion.
+
+---
+
+**Observe**
+The team tracks the onboarding journey to understand where users get stuck: *Account Created → Tutorial Completed → Key Action Taken*
+
+Using **Funnels** and **Flows** in Mixpanel, they see that a large portion of users drop off before finishing the tutorial.
+
+---
+
+**Analyze**
+They dig deeper with **Insights** and **Session Replay** to find out *why*. Behavioral data shows that most users abandon onboarding when asked to configure a complex setting too early, while replays confirm that this screen causes confusion.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels & Flows** | High drop-off between “Account Created” and “Tutorial Completed” | Many users start onboarding but don’t reach the first key action, indicating friction early in the setup process. |
+| **Insights** | Users who skip advanced configuration steps are more likely to activate | Complex setup tasks cause early abandonment before users reach value. |
+| **Session Replay** | Users pause or exit on the configuration screen | The configuration step is confusing, creating hesitation and contributing to drop-offs. |
+
+
+---
+
+**Decide**
+The team compares **cohorts** of users who complete onboarding in their first session versus those who don’t. They find that users who finish onboarding right away are twice as likely to activate within a week. They decide to simplify the initial setup and move optional configuration to a later stage.
+
+---
+
+**Act**
+They run an **Experiment** testing a new guided flow that delays complex setup until after the user experiences initial value. After the winning version significantly improves activation, they roll it out to all users using **Feature Flags**.
+
+---
+
+**✨ Result:** Activation improves 15%, and time to value shortens.
+
+
+
+
+
+
+
+🛍️ eCommerce: Increasing Checkout Conversion
+
+ An eCommerce team observes how shoppers move through checkout, analyzes drop-offs to find mobile pain points, decides what changes to prioritize, and acts by launching experiments that lift conversion rates.
+
+---
+
+**Observe**
+The team maps the path to purchase to pinpoint where customers abandon the flow: *Product Viewed → Added to Cart → Checkout Started → Purchase Completed*
+
+Using **Funnels** in Mixpanel, they see that a large portion of users start checkout but never complete payment.
+
+---
+
+**Analyze**
+They investigate further with **Insights**, segmenting results by device and geography. Mobile users show significantly lower completion rates.
+
+Through **Session Replay** and **Heatmaps**, they observe users zooming in on payment fields, misclicking form inputs, and abandoning the process after multiple failed attempts.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels** | 40% drop-off at payment step | Many users abandon checkout before completing payment, indicating friction at this stage. |
+| **Session Replay & Heatmaps** | Users zoom and misclick on payment fields | The form is difficult to complete on mobile, leading to frustration and abandonment. |
+| **Insights** | Longer time on payment page before exit | Users struggle to enter details correctly, suggesting unclear field labels or validation errors. |
+
+
+---
+
+**Decide**
+The team compares **cohorts** of mobile users who completed checkout versus those who didn’t. They find that reducing the number of required fields correlates with higher conversion. They decide to simplify the payment form and surface popular payment options earlier in the flow.
+
+---
+
+**Act**
+They create an **Experiment** testing the simplified checkout form with mobile users. The results show a significant lift in mobile conversions, so the team rolls out the new design to all users using **Feature Flags**.
+
+---
+
+**✨ Result:** Checkout completion increases by 20%, and mobile users complete purchases faster with fewer errors.
+
+
+
+
+
+
+
+🎬 Media & Entertainment: Boosting Viewer Retention
+
+ A streaming platform observes how audiences engage with content, analyzes where viewers disengage, decides how to personalize recommendations, and acts by improving the experience to keep viewers watching longer.
+
+---
+
+**Observe**
+The team maps the typical viewing journey to understand where viewers lose interest: *Episode Started → Episode Completed → Next Episode Started*
+
+Using **Funnels**, they see a major drop-off after Episode 2 in new series. **Flows** confirm that most users who stop watching don’t move on to similar shows or genres.
+
+---
+
+**Analyze**
+They dig deeper using **Insights**, **Retention**, and **Session Replay** to uncover why engagement drops off. They discover that episodes with weaker completion rates also have fewer post-watch interactions, like “Add to Watch List” or “Rate Content.”
+Session Replays and Heatmaps reveal that end-of-episode recommendations are often irrelevant or missed entirely.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels & Flows** | Sharp viewer drop-off after Episode 2 | Audiences disengage early in new series, suggesting issues with content pacing or recommendations. |
+| **Insights & Retention** | Fewer return visits after initial viewing session | Users aren’t motivated to continue watching after early episodes, indicating weak re-engagement drivers. |
+| **Session Replay & Heatmaps** | Limited interaction with “Next Episode” or “Recommended for You” sections | End-of-episode recommendations aren’t personalized or visually prominent enough to encourage continued viewing. |
+
+
+---
+
+**Decide**
+The team compares **cohorts** of viewers who continue past Episode 2 versus those who stop. They find that engagement with “Next Episode” or “Rate Content” interactions strongly predicts long-term retention. They decide to improve personalization and prompt users to rate episodes before recommending what to watch next.
+
+---
+
+**Act**
+They run an **Experiment** testing a new “Rate this Episode” prompt and smarter “Up Next” recommendations. Once they see a significant increase in episode-to-episode continuation, they roll out the new experience globally using **Feature Flags**.
+
+---
+
+**✨ Result:** Repeat viewership increases 25%, and average session length grows as more viewers continue past early episodes.
+
+
+
+
+
+
+
+💰 Fintech: Increasing Feature Adoption and Retention
+
+ A fintech product team observes how customers engage with budgeting tools, analyzes where they drop off, decides which steps to simplify, and acts by testing new prompts that improve adoption and retention.
+
+---
+
+**Observe**
+The team maps the customer journey to identify where engagement drops: *Account Linked → Budget Created → Spending Reviewed → Budget Adjusted*
+
+Using **Funnels** and **Flows** in Mixpanel, they see that many users link their bank account but never complete a first budget. This gap represents a key opportunity to improve activation within the product.
+
+---
+
+**Analyze**
+They dig deeper using **Insights**, **Retention**, and **Session Replay** to understand why users don’t finish setup. Behavioral data reveals that users linking smaller financial institutions often encounter errors. Retention reports show that users who successfully create a budget are twice as likely to return within 30 days.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels & Flows** | Many users stop after linking a bank account | Setup friction prevents users from creating their first budget and experiencing value. |
+| **Insights** | Lower completion rates among users linking smaller institutions | Connection errors and inconsistent authentication flows block progress. |
+| **Retention** | Users who complete budget setup return 2× more often within 30 days | Early success with the budgeting feature predicts long-term engagement. |
+| **Session Replay** | Users repeatedly attempt to link accounts before exiting | Frustration during setup causes drop-offs before key actions are completed. |
+
+
+---
+
+**Decide**
+The team compares **cohorts** of users who complete their first budget versus those who don’t. They find that users who set up a budget alert immediately after creation retain best over time. They decide to streamline the account linking flow and prompt users to set alerts earlier.
+
+---
+
+**Act**
+They launch an **Experiment** testing a simplified linking process and a new “Set Your First Alert” prompt after budget creation. When the new experience increases completion and retention, they roll it out to all users using **Feature Flags**.
+
+---
+
+**✨ Result:** Budget feature adoption increases 30%, and retention improves 15% as users connect accounts and set alerts more easily.
+
+
+
+
+
+
+
+🎮 Gaming: Driving Player Engagement and In-App Purchases
+
+ A gaming studio observes how players progress through levels, analyzes behavior to uncover friction and missed opportunities, decides what changes will keep players engaged, and acts by testing prompts that increase completion and purchases.
+
+---
+
+**Observe**
+The team tracks key in-game milestones to identify where players churn: *Level Started → Level Completed → In-App Purchase Made*
+
+Using **Funnels** and **Flows** in Mixpanel, they find that a large portion of users drop off after failing Level 3 multiple times. This stage becomes their primary focus for improvement.
+
+---
+
+**Analyze**
+They use **Insights**, **Session Replay**, and **Heatmaps** to understand what’s causing frustration. Behavioral data shows that players who use “Power-Ups” early in gameplay are far more likely to progress, while replays reveal many players overlook the in-game hint icon.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels & Flows** | High player drop-off after Level 3 | Repeated failures at a single level cause early churn and reduced engagement. |
+| **Insights** | Power-Up users complete twice as many levels as others | Early exposure to helpful tools increases progression and satisfaction. |
+| **Session Replay & Heatmaps** | Players ignore or miss the on-screen “Hint” icon | Key features are visually understated or poorly timed, leading to missed opportunities. |
+
+
+---
+
+**Decide**
+The team compares **cohorts** of players who used Power-Ups before Level 3 with those who didn’t. They confirm that introducing Power-Ups earlier increases both level completion and purchase likelihood. They decide to highlight Power-Ups at the start of Level 3 with a short tutorial prompt.
+
+---
+
+**Act**
+They run an **Experiment** testing the new Power-Up prompt, measuring its effect on both level completion and in-app purchases. After the new version shows a significant improvement in completion and a rise in purchases, the team deploys it to all players using **Feature Flags**.
+
+---
+
+**✨ Result:** Players advance further through the game, spend more time in-app, and make more purchases, driving sustained engagement.
+
+
+
+
+No matter your industry, the OADA loop helps you turn insights into action—and Mixpanel gives you the tools to complete that loop faster with every iteration.
+
+---
+
+## Keep Learning
+Keep building your Mixpanel expertise with these resources designed to help you learn, connect, and put insights into action.
+
+| Resource | Description | Link |
+|-----------|--------------|------|
+| **Community** | Connect with other Mixpanel users, share ideas, and learn how peers are tackling similar challenges. | [Open →](https://community.mixpanel.com/) |
+| **Developer Docs** | Build and extend Mixpanel with advanced implementation guides. | [Open →](https://developer.mixpanel.com/reference/overview) |
+| **Docs** | Explore product capabilities, setup guides, and detailed feature references. | [Open →](https://docs.mixpanel.com/) |
+| **Events** | Join live sessions and webinars to explore new features, use cases, and expert-led best practices. | [Open →](https://mixpanel.com/events) |
+| **Guides** | Apply Mixpanel best practices to real-world workflows and use cases. | 📍 You are here |
+| **Mixpanel University** | Follow guided learning paths to validate your skills and earn certifications. | [Open →](https://mixpanel.com/university/) |
+
+Wherever you are in your Mixpanel journey, these resources will help you continue learning, stay connected, and keep improving.
+
+## Archive
+Saving this cool code, but not sure we need it for this page.
+
+### Boxes
+
+
+
+📈
+
+Analysis
+
+Explore behavior with Insights, Funnels, Flows, and Retention to spot drop-offs, drivers, and return patterns.
+
+Why it matters: Turns raw events into answers fast.
+
+Everything in Mixpanel starts with events—the building blocks of your data model.
+
+
+ An event represents something a user does (like Signed Up, Viewed a Product,
+ or Completed Purchase). Each event can include properties that add context, such as
+ the user’s plan type, location, or device.
+
+
+ Together, these events and properties form a flexible data model that mirrors how people actually use your
+ product. Once instrumented, you can analyze this data instantly—without writing SQL or waiting on an analyst.
+
+
+
+
+ 
+
+ Example of how events and properties structure user behavior data in Mixpanel.
+
\ No newline at end of file
diff --git a/public/Data_Model_with_Group_Analytics.png b/public/Data_Model_with_Group_Analytics.png
new file mode 100644
index 0000000000..c0e541f8df
Binary files /dev/null and b/public/Data_Model_with_Group_Analytics.png differ
diff --git a/public/oada-graphic.png b/public/oada-graphic.png
new file mode 100644
index 0000000000..ce1234994a
Binary files /dev/null and b/public/oada-graphic.png differ
diff --git a/public/oada-loop-detail.png b/public/oada-loop-detail.png
new file mode 100644
index 0000000000..b929f99c9d
Binary files /dev/null and b/public/oada-loop-detail.png differ
diff --git a/public/oada-loop-simple-tall.png b/public/oada-loop-simple-tall.png
new file mode 100644
index 0000000000..fba8de86c2
Binary files /dev/null and b/public/oada-loop-simple-tall.png differ
diff --git a/public/oada-loop-simple-wide-fcf9fa.png b/public/oada-loop-simple-wide-fcf9fa.png
new file mode 100644
index 0000000000..3445374b3e
Binary files /dev/null and b/public/oada-loop-simple-wide-fcf9fa.png differ
diff --git a/public/oada-loop-simple-wide.png b/public/oada-loop-simple-wide.png
new file mode 100644
index 0000000000..6a92a81ac5
Binary files /dev/null and b/public/oada-loop-simple-wide.png differ
diff --git a/public/oada-loop-simple.png b/public/oada-loop-simple.png
new file mode 100644
index 0000000000..2a46aece2d
Binary files /dev/null and b/public/oada-loop-simple.png differ