+ ),
+ },
+ "guides-by-use-case": "Guides by Use Case",
+ "guides-by-workflow": "Guides by Workflow",
+ "guides-by-topic": "Guides by Topic",
+ "best-practices-and-playbooks": {
+ type: "separator",
+ title: (
+
+ PLAYBOOKS
+
+ ),
+ },
+ "benchmarks": "Benchmarks",
+ "strategic-playbooks": "Strategic Playbooks"
+}
diff --git a/pages/guides/benchmarks.mdx b/pages/guides/benchmarks.mdx
new file mode 100644
index 0000000000..07c895bd0c
--- /dev/null
+++ b/pages/guides/benchmarks.mdx
@@ -0,0 +1,27 @@
+import { Cards } from 'nextra/components'
+
+# Benchmarks & Analytics Maturity
+Understand performance benchmarks and assess your organization’s analytics maturity. Whether you’re setting goals, comparing performance against peers, or looking for ways to advance your analytics practice, these resources will help guide your next steps.
+
+## Benchmarks
+See product performance baselines across conversion, retention, and engagement. Use these as north stars to calibrate goals and identify outliers.
+
+
+
+
+
+## Benchmarks by Industry
+Drill into industry-specific norms to set context-rich targets.
+
+
+
+
+
+
+
+## Analytics Maturity
+Assess where your org sits on the analytics maturity curve and chart the next moves to unlock impact.
+
+
+
+
\ No newline at end of file
diff --git a/pages/guides/guides-by-topic/_meta.ts b/pages/guides/guides-by-topic/_meta.ts
new file mode 100644
index 0000000000..d26e366012
--- /dev/null
+++ b/pages/guides/guides-by-topic/_meta.ts
@@ -0,0 +1,5 @@
+export default {
+ "continuous-innovation": "Continuous Innovation",
+ "features": "Features",
+ "core-reports": "Core Reports"
+}
\ No newline at end of file
diff --git a/pages/guides/guides-by-topic/continuous-innovation.mdx b/pages/guides/guides-by-topic/continuous-innovation.mdx
new file mode 100644
index 0000000000..c5d89d81f6
--- /dev/null
+++ b/pages/guides/guides-by-topic/continuous-innovation.mdx
@@ -0,0 +1,240 @@
+import { Callout, Steps } from 'nextra/components'
+
+# Mixpanel and the Continuous Innovation Loop
+The **Observe → Analyze → Decide → Act (OADA)** loop is the framework behind how great teams use Mixpanel to continuously innovate.
+
+
+
+This loop represents the cycle teams follow to turn data into action by first observing what users do, then analyzing why it happens, deciding what to do next, and finally acting on those insights to drive better outcomes.
+
+Each phase represents a key moment in the product decision cycle—and Mixpanel provides the tools to complete that loop faster with every iteration.
+
+
+**Why it matters:** Teams that move through the loop quickly learn what’s working, align faster, and deliver more impactful product improvements.
+
+---
+
+## Why Continuous Innovation Matters
+
+Building great products isn’t just about speed—it’s about learning continuously.
+
+Markets shift, user expectations evolve, and what worked last quarter might not work tomorrow. The teams that win are the ones that turn data into a *habit of improvement* by observing what users do, analyzing why it happens, deciding what to do next, and acting with confidence.
+
+Teams that move through the OADA loop quickly:
+- Learn what’s working and what’s not
+- Align on priorities faster
+- Deliver more impactful product improvements with each iteration
+
+
+**Pro tip**: Make closing the loop part of every sprint—observe, analyze, decide, act.
+
+
+Digital innovation isn’t a one-time project. It’s a continuous cycle of learning and improvement.
+
+---
+
+## How Different Industries Use the Continuous Loop
+
+Mixpanel powers the OADA loop across every industry—helping teams turn data into confident action.
+
+Whether you’re optimizing user onboarding, increasing checkout conversions, or improving content engagement, the same continuous loop applies.
+
+Below are examples of how teams in different industries use Mixpanel to measure, learn, and grow faster.
+
+
+💼 SaaS: Improving Onboarding and Activation
+
+ A SaaS team observes where new users drop off during onboarding, analyzes key behaviors to uncover friction points, decides which improvements will reduce time-to-value, and acts by testing guided experiences that drive activation and retention.
+
+---
+
+**Observe**
+The team monitors early-user activity using **Session Replay**, **Heatmaps**, and **Autocapture** to see how new users interact with the onboarding flow: *Account Created → Tutorial Completed → Key Action Taken*. They set up **Alerts** to track sudden drops in completion.
+
+---
+
+**Analyze**
+Using **Funnels**, **Flows**, and **Cohorts**, the team identifies where users are stalling and compares completion across personas. Behavioral trends show that users who skip the advanced configuration step activate faster.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels & Flows** | 45% drop-off between “Account Created” and “Tutorial Completed” | Users struggle with early onboarding complexity. |
+| **Cohorts** | Different activation rates by persona | Simpler onboarding correlates with higher early success. |
+| **Session Replay** | Confusion at advanced setup screens | Users hesitate when asked to complete optional steps too soon. |
+
+---
+
+**Decide**
+The team uses **Metric Trees** to understand which metrics—like “First Value Reached”—drive long-term retention. They decide to move optional setup later in the journey and emphasize the first “aha” moment sooner.
+
+---
+
+**Act**
+They test this change through **Experiments**, rolling out a new guided flow to a subset of users. When the experiment shows faster activation and higher retention, they launch it to all users using **Feature Flags**.
+
+---
+
+**✨ Result:** Activation improves 15%, and new users reach value faster with fewer drop-offs.
+
+
+
+
+
+
+🛍️ eCommerce: Increasing Checkout Conversion
+
+ An eCommerce team observes shopper behavior throughout checkout, analyzes patterns to uncover mobile friction, decides which optimizations will improve conversion, and acts by testing streamlined checkout designs.
+
+---
+
+**Observe**
+Using **Session Replay**, **Heatmaps**, and **Autocapture**, the team tracks the path from *Product Viewed → Added to Cart → Checkout Started → Purchase Completed* to see where users abandon the flow.
+
+---
+
+**Analyze**
+Through **Funnels**, **Cohorts**, and **Retention**, the team identifies a 40% drop-off at payment on mobile devices.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels** | 40% mobile drop-off at payment | Checkout fields are too dense for mobile screens. |
+| **Session Replay & Heatmaps** | Users zoom and misclick on payment fields | UI not optimized for mobile input. |
+
+---
+
+**Decide**
+The team leverages **Metric Trees** to link checkout completion to revenue impact. They decide to simplify payment forms and surface the most-used payment options first.
+
+---
+
+**Act**
+They deploy a **Feature Flag** to release the redesigned checkout to 50% of traffic and use **Experiments** to confirm conversion improvements before rolling it out universally.
+
+---
+
+**✨ Result:** Checkout completion rises 20%, and mobile shoppers complete purchases faster with fewer errors.
+
+
+
+
+
+
+🎬 Media & Entertainment: Boosting Viewer Retention
+
+ A streaming platform observes viewer engagement across content, analyzes which experiences retain audiences, decides how to personalize recommendations, and acts by iterating on what drives continued watching.
+
+---
+
+**Observe**
+The team uses **Session Replay**, **Autocapture**, and **Alerts** to track *Episode Started → Episode Completed → Next Episode Started* events, identifying drop-offs by series and genre.
+
+---
+
+**Analyze**
+They turn to **Retention** and **Cohorts** to learn which viewers come back. **Funnels & Flows** show most viewers stop after Episode 2, and qualitative data confirms weak recommendations at that point.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Retention & Cohorts** | Only 30% return for Episode 3 | Low content continuity beyond early episodes. |
+| **Funnels & Flows** | Drop-off after Episode 2 | Weak recommendations between episodes. |
+
+---
+
+**Decide**
+Using **Metric Trees**, the team connects “Episode Completion Rate” to “Viewer Retention.” They decide to insert a personalized “Up Next” prompt and ratings flow to strengthen recommendations.
+
+---
+
+**Act**
+They run an **Experiment** on the “Up Next” prompt. When results show that viewers who see the new recommendations have longer sessions and higher continuation rates, they expand the rollout to all viewers using **Feature Flags**.
+
+
+---
+
+**✨ Result:** Viewer retention improves 25%, with stronger engagement across new series launches.
+
+
+
+
+
+
+💰 Fintech: Increasing Feature Adoption and Retention
+
+ A fintech product team observes user engagement with budgeting tools, analyzes setup friction, decides how to improve adoption, and acts by optimizing flows that drive retention.
+
+---
+
+**Observe**
+They track customer actions with **Session Replay** and **Autocapture**, focusing on *Account Linked → Budget Created → Spending Reviewed → Budget Adjusted*. **Alerts** notify them of sudden drops in budget creation.
+
+---
+
+**Analyze**
+Using **Funnels**, **Cohorts**, and **Retention**, they find that users linking smaller financial institutions often fail to complete setup.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels & Cohorts** | Users stop after linking a bank account | Authentication errors block progress. |
+| **Retention Reports** | Users who finish setup return 2× more often | Early success predicts long-term retention. |
+
+---
+
+**Decide**
+The team uses **Metric Trees** to link budgeting feature adoption to retention KPIs. They decide to improve error handling and prompt users to set alerts immediately after creating a budget.
+
+---
+
+**Act**
+They run an **Experiment** on a new “Set Alert” flow. After positive results, they expand rollout to all customers using **Feature Flags**.
+
+---
+
+**✨ Result:** Budget feature adoption increases 30%, and retention rises 15% as users set up alerts sooner.
+
+
+
+
+
+
+🎮 Gaming: Driving Player Engagement and In-App Purchases
+
+ A gaming studio observes player behavior across levels, analyzes progression data to uncover friction, decides what changes will improve engagement, and acts by launching prompts that drive completion and purchases.
+
+---
+
+**Observe**
+The team uses **Session Replay**, **Heatmaps**, and **Autocapture** to track *Level Started → Level Completed → In-App Purchase Made*, identifying the points where players churn.
+
+---
+
+**Analyze**
+They turn to **Funnels** and **Cohorts**, combined with player feedback, to understand why drop-offs occur. Players who use “Power-Ups” progress faster, and replays show many ignore in-game hint icons.
+
+| Tool | Observation | Insight |
+|------|--------------|----------|
+| **Funnels & Cohorts** | High player drop-off after Level 3 | Repeated failures at one level lead to early churn. |
+| **Session Replay & Heatmaps** | Players miss hint icons | Poor visual placement of key gameplay aids. |
+
+---
+
+**Decide**
+Using **Metric Trees**, the team links early-level completions to retention and in-app purchases. They decide to introduce Power-Up tutorials earlier to help players progress.
+
+---
+
+**Act**
+They run an **Experiment** on a new “Use Power-Up” tutorial and measure results. When they see improved completion and monetization metrics, they expand it to all players using **Feature Flags**.
+
+---
+
+**✨ Result:** Level completion improves 35%, and in-app purchases increase 20%, driving sustained engagement.
+
+
+
+No matter your industry, the OADA loop helps you turn insights into action—and Mixpanel gives you the tools to complete that loop faster with every iteration.
+
+---
+
+## Learn More
+
+Want to understand the strategy behind continuous innovation? Check out our blog on [How Digital Continuous Innovation Drives Sustainable Enterprise Growth](https://mixpanel.com/blog/digital-continuous-innovation/) to see how leading enterprises use the OADA framework to connect data, decisions, and action—and build a culture of sustainable growth.
\ No newline at end of file
diff --git a/pages/guides/guides-by-topic/core-reports.mdx b/pages/guides/guides-by-topic/core-reports.mdx
new file mode 100644
index 0000000000..d950bcd8c0
--- /dev/null
+++ b/pages/guides/guides-by-topic/core-reports.mdx
@@ -0,0 +1,30 @@
+import { Cards } from 'nextra/components'
+
+# Core Reports
+
+
+
+
+
+
+
+
+
diff --git a/pages/guides/guides-by-topic/core-reports/_meta.ts b/pages/guides/guides-by-topic/core-reports/_meta.ts
new file mode 100644
index 0000000000..efaa21f53e
--- /dev/null
+++ b/pages/guides/guides-by-topic/core-reports/_meta.ts
@@ -0,0 +1,8 @@
+export default {
+ "create-boards": "Create Boards",
+ "discover-insights": "Discover Insights",
+ "analyze-conversions": "Analyze Conversions",
+ "build-user-flows": "Build User Flows",
+ "track-user-retention": "Track User Retention",
+ "define-cohorts": "Define Cohorts",
+}
diff --git a/pages/guides/strategic-playbooks/onboarding-playbook/launch/analyze-conversions.mdx b/pages/guides/guides-by-topic/core-reports/analyze-conversions.mdx
similarity index 100%
rename from pages/guides/strategic-playbooks/onboarding-playbook/launch/analyze-conversions.mdx
rename to pages/guides/guides-by-topic/core-reports/analyze-conversions.mdx
diff --git a/pages/guides/strategic-playbooks/onboarding-playbook/launch/build-user-flows.mdx b/pages/guides/guides-by-topic/core-reports/build-user-flows.mdx
similarity index 100%
rename from pages/guides/strategic-playbooks/onboarding-playbook/launch/build-user-flows.mdx
rename to pages/guides/guides-by-topic/core-reports/build-user-flows.mdx
diff --git a/pages/guides/strategic-playbooks/onboarding-playbook/launch/create-boards.mdx b/pages/guides/guides-by-topic/core-reports/create-boards.mdx
similarity index 100%
rename from pages/guides/strategic-playbooks/onboarding-playbook/launch/create-boards.mdx
rename to pages/guides/guides-by-topic/core-reports/create-boards.mdx
diff --git a/pages/guides/strategic-playbooks/onboarding-playbook/launch/define-cohorts.mdx b/pages/guides/guides-by-topic/core-reports/define-cohorts.mdx
similarity index 100%
rename from pages/guides/strategic-playbooks/onboarding-playbook/launch/define-cohorts.mdx
rename to pages/guides/guides-by-topic/core-reports/define-cohorts.mdx
diff --git a/pages/guides/strategic-playbooks/onboarding-playbook/launch/discover-insights.mdx b/pages/guides/guides-by-topic/core-reports/discover-insights.mdx
similarity index 100%
rename from pages/guides/strategic-playbooks/onboarding-playbook/launch/discover-insights.mdx
rename to pages/guides/guides-by-topic/core-reports/discover-insights.mdx
diff --git a/pages/guides/strategic-playbooks/onboarding-playbook/launch/track-user-retention.mdx b/pages/guides/guides-by-topic/core-reports/track-user-retention.mdx
similarity index 100%
rename from pages/guides/strategic-playbooks/onboarding-playbook/launch/track-user-retention.mdx
rename to pages/guides/guides-by-topic/core-reports/track-user-retention.mdx
diff --git a/pages/guides/guides-by-topic/features.mdx b/pages/guides/guides-by-topic/features.mdx
new file mode 100644
index 0000000000..b0174c8950
--- /dev/null
+++ b/pages/guides/guides-by-topic/features.mdx
@@ -0,0 +1,34 @@
+import { Cards } from 'nextra/components'
+
+# Features
+
+## Experiments and Feature Flags
+
+
+
+
+
+
+## Revenue Analytics
+
+
+
+
+
+## Session Replay
+
+
+
+
\ No newline at end of file
diff --git a/pages/guides/guides-by-use-case.mdx b/pages/guides/guides-by-use-case.mdx
new file mode 100644
index 0000000000..84a048c812
--- /dev/null
+++ b/pages/guides/guides-by-use-case.mdx
@@ -0,0 +1,34 @@
+import { Cards } from 'nextra/components'
+
+# Guides by Use Case
+
+## Engage Your Users
+
+
+
+
+
+
+## Grow Your Usership
+
+
+
+
+
+## Empower Your Team
+
+
+
+
diff --git a/pages/guides/guides-by-use-case/_meta.ts b/pages/guides/guides-by-use-case/_meta.ts
index 9b514422d3..3d9eb7bf69 100644
--- a/pages/guides/guides-by-use-case/_meta.ts
+++ b/pages/guides/guides-by-use-case/_meta.ts
@@ -1,5 +1,5 @@
export default {
- "build-a-tracking-strategy": "How to Build a Tracking Strategy",
- "ensuring-data-quality": "Ensuring Data Quality",
- "driving-product-innovation": "Driving Product Innovation"
+ "engage-your-users": "Engage Your Users",
+ "grow-your-usership": "Grow Your Usership",
+ "empower-your-team": "Empower Your Team",
}
\ No newline at end of file
diff --git a/pages/guides/guides-by-use-case/empower-your-team/_meta.ts b/pages/guides/guides-by-use-case/empower-your-team/_meta.ts
new file mode 100644
index 0000000000..26ab9591b8
--- /dev/null
+++ b/pages/guides/guides-by-use-case/empower-your-team/_meta.ts
@@ -0,0 +1,3 @@
+export default {
+ "see-replays": "See Replays"
+}
\ No newline at end of file
diff --git a/pages/guides/guides-by-use-case/empower-your-team/see-replays.mdx b/pages/guides/guides-by-use-case/empower-your-team/see-replays.mdx
new file mode 100644
index 0000000000..b854e3ccba
--- /dev/null
+++ b/pages/guides/guides-by-use-case/empower-your-team/see-replays.mdx
@@ -0,0 +1,250 @@
+import { Callout, Steps } from 'nextra/components'
+
+# Turn Clicks into Clarity: Best Practices for Heatmaps & Session Replay
+
+Discover what users really experience—and turn those insights into action. This guide shows you how to use [Heatmaps](/docs/session-replay/heatmaps) and [Session Replay](/docs/session-replay) together to quickly spot friction, understand behavior, and drive better product decisions.
+
+---
+
+## What Are Heatmaps and Session Replay?
+
+Heatmaps and Session Replay combine the “what” and the “why” of user behavior.
+
+- Heatmaps show where users click and focus attention.
+
+
+
+
+ {/* Left: video with a min width so it wraps when narrow */}
+
+
+
+
+
+
+ {/* Right: text with a flexible basis */}
+
+
+
+
+- Session Replay shows how they move through your product in real time.
+
+
+
+
+ {/* Left: video with a min width so it wraps when narrow */}
+
+
+
+
+
+
+ {/* Right: text with a flexible basis */}
+
+
+
+
+Together, they help you understand not just *what* happened, but *why*.
+
+---
+
+## 1. Start with a Purpose
+
+Begin with a clear goal instead of watching random replays. Ask:
+
+- What behavior am I trying to understand?
+- Which conversion, drop-off, or UX flow do I want to validate?
+
+✅ **Do** define a specific question like:
+
+- “Where are users dropping off in signup?”
+- “Are people finding the new CTA?”
+- “Do mobile users engage with the feature the same way as desktop users?”
+
+❌ **Do not** assume watching any session will lead to insight. Without a clear question, you will end up chasing edge cases.
+
+👉 **Do this next:** Define what “success” looks like before opening your first replay.
+
+## 2. Use Heatmaps to Find the Pattern
+
+Heatmaps are your first stop when you want to **see interaction trends at scale.**
+
+**Steps to take:**
+
+
+
+### Choose a meaningful page or flow, like signup, pricing, or onboarding.
+
+### Filter by device or cohort to compare how segments behave.
+
+### Scan for anomalies, such as ignored CTAs or clicks on non-interactive elements.
+
+
+
+
+ **Pro tip:** Use *Click Maps* for precision and *Traditional Heatmaps* for broader engagement patterns. Click Maps provide precise insights on dynamic pages with modals or dropdowns.
+
+
+Learn more about [Click Maps and Traditional Heatmaps](/docs/session-replay/heatmaps#overview).
+
+---
+
+## 3. Use Session Replay to Understand the “Why”
+
+Once you have spotted a pattern, Session Replay lets you zoom in on the granular details of individual user sessions.
+
+**Steps to take:**
+
+
+
+### Filter replays by key event or cohort.
+
+For example, you can filter to “users who dropped after viewing pricing.”
+
+### Watch 6–8 replays.
+
+This allows you to identify friction points and common paths your users take.
+
+### Look for behavioral feedback.
+
+This includes excessive scrolling, hesitation, backtracking, or repeated form errors that highlight friction.
+
+### Document your findings.
+
+Take notes on where users struggle or deviate from expected behavior.
+
+
+
+
+ **Pro tip:** Start from within Mixpanel reports or user profiles to stay anchored to data—not anecdotes. You can click “View Replays” directly from an Event, Funnel, or User Profile to see related sessions immediately.
+
+
+
+ **Pitfall:** One replay ≠ one insight. Confirm patterns across multiple sessions before drawing conclusions.
+
+
+---
+
+## 4. Protect User Privacy and Data Integrity
+
+Session Replay gives full visibility into user behavior—but it also introduces risk if sensitive data appears on-screen.
+
+**Best practices for safe implementation:**
+
+- Use Mixpanel's privacy controls to mask or block sensitive UI screens, like checkout or profile information.
+- Test across devices, especially for mobile SDKs and embedded web views.
+- Review your company's privacy notice and legal compliance before roll-out.
+
+Learn more about [Session Replay Privacy Controls](/docs/session-replay/session-replay-privacy-controls).
+
+---
+
+## 5. Translate Insights into Product Decisions
+
+Data becomes powerful only when shared. Once you identify friction, decide what change to make—and how you will measure its impact.
+
+**Steps to take:**
+
+
+
+### Summarize your insight.
+
+Turn observations into clear takeaways. For example, “Users abandon signup when validation errors are not visible.”
+
+### Share replay clips to align stakeholders on the problem.
+
+Short, focused clips help everyone see the issue firsthand, creating a shared understanding and faster alignment on next steps.
+
+### Instrument or adjust tracking if needed.
+
+Ensure what you are tracking captures key behaviors so you can validate results and spot new opportunities.
+
+
+
+
+ **Pro tip:** Create a recurring “Replay Review” with PMs and designers. It builds empathy and keeps teams aligned on real user behavior.
+
+
+**Strengthen your workflow with related features:**
+
+- [Annotations](/docs/features/annotations): Add notes to your reports to document findings and experiments.
+- [Alerts](/docs/features/alerts): Set notifications for spikes or drops in engagement or friction.
+- [Experiments](/docs/experiments): Validate your hypotheses by testing product changes and measuring lift.
+
+---
+
+## Key Takeaways
+
+- Start with a specific question or metric to guide your analysis.
+- Use Heatmaps for **patterns** and Session Replay for **context**.
+- Review **6–8** sessions per segment to find consistent trends.
+- Safeguard user privacy during implementation and QA.
+- Turn your findings into shared, actionable insights.
+
+📚 **Go deeper:** Explore the Mixpanel Docs on [Heatmaps](/docs/session-replay/heatmaps) and [Session Replay](/docs/session-replay).
diff --git a/pages/guides/guides-by-use-case/engage-your-users/_meta.ts b/pages/guides/guides-by-use-case/engage-your-users/_meta.ts
new file mode 100644
index 0000000000..f280b807ae
--- /dev/null
+++ b/pages/guides/guides-by-use-case/engage-your-users/_meta.ts
@@ -0,0 +1,4 @@
+export default {
+ "drive-product-innovation": "Drive Product Innovation",
+ "ship-features": "Ship Features"
+}
\ No newline at end of file
diff --git a/pages/guides/guides-by-use-case/driving-product-innovation.mdx b/pages/guides/guides-by-use-case/engage-your-users/drive-product-innovation.mdx
similarity index 70%
rename from pages/guides/guides-by-use-case/driving-product-innovation.mdx
rename to pages/guides/guides-by-use-case/engage-your-users/drive-product-innovation.mdx
index 559c66e527..5405d79f86 100644
--- a/pages/guides/guides-by-use-case/driving-product-innovation.mdx
+++ b/pages/guides/guides-by-use-case/engage-your-users/drive-product-innovation.mdx
@@ -1,6 +1,6 @@
import { Callout, Steps } from 'nextra/components'
-# Driving Product Innovation with Mixpanel Experiments
+# Drive Product Innovation with Mixpanel Experiments
Experimentation is how modern product teams make decisions with confidence. Instead of guessing, you test changes with real users, measure the impact, and move forward knowing what works. Mixpanel makes it possible to plan, run, and analyze experiments in one place.
@@ -16,9 +16,59 @@ Making product changes without data is risky. Experimentation helps you:
Leading companies like [Step](https://mixpanel.com/customers/how-step-boosted-direct-deposits-by-14-with-mixpanels-experimentation/) use Mixpanel to build a culture of experimentation, moving quickly without losing customer trust.
-
- **Do this next**: Identify one upcoming product change you are unsure about and commit to testing it before launch.
-
+👉 **Do this next:** Identify one upcoming product change you are unsure about and commit to testing it before launch.
+
+### See Mixpanel Experiments in Action
+
+Before you dive into best practices, take a quick look at how experimentation works inside Mixpanel.
+
+
+
+
+ {/* Left: video with a min width so it wraps when narrow */}
+
+
+
+
+
+
+ {/* Right: text with a flexible basis */}
+
+
+
## Setting Up Experiments the Right Way
@@ -48,9 +98,8 @@ Do not test multiple changes at once. Focus on a single variable so you know wha
Testing too many things at once makes it impossible to know which change worked.
-
-**Do this next**: Draft your hypothesis and list your primary and guardrail metrics. Review with your team before launch.
-
+👉 **Do this next:** Draft your hypothesis and list your primary and guardrail metrics. Review with your team before launch.
+
## Choosing the Right Test Model
@@ -97,9 +146,7 @@ If your primary metric improves and the results are significant, you can trust i
Do not treat inconclusive results as failures; they provide valuable clues, even if they are not yet a win.
-
-**Do this next**: Share experiment results in a [Mixpanel Board](https://mixpanel.com/blog/boards-collaborate-cards-mixpanel-feature-update/) and add notes on what the data means to provide additional context to your numbers.
-
+👉 **Do this next:** Share experiment results in a [Mixpanel Board](https://mixpanel.com/blog/boards-collaborate-cards-mixpanel-feature-update/) and add notes on what the data means to provide additional context to your numbers.
## Acting on Experiment Insights
@@ -114,9 +161,7 @@ The most impactful teams do not stop at analysis; they act.
**Pro tip**:Always explain why you made your decision, not just what you decided. This builds institutional knowledge.
-
-**Do this next**: Add a note in your Experiment Report explaining the reasoning behind your decision.
-
+👉 **Do this next:** Add a note in your Experiment Report explaining the reasoning behind your decision.
## Avoid Common Pitfalls
@@ -129,9 +174,7 @@ Stay alert to these common mistakes:
Avoid declaring a test a success at the first sign of movement; early spikes often disappear as more data arrives.
-
-**Do this next**: Review your last experiment—did you hit any of these pitfalls?
-
+👉 **Do this next:** Review your last experiment—did you hit any of these pitfalls?
## Scaling Experimentation as a Habit
@@ -146,9 +189,7 @@ Experimentation works best when it is part of your culture.
**Pro tip**: Share at least one experiment outcome (good or bad) in every team meeting—it normalizes learning.
-
-**Do this next**: Create a recurring agenda item in your weekly standup for experiment learnings. Keep it lightweight; 2–3 minutes max.
-
+👉 **Do this next:** Create a recurring agenda item in your weekly standup for experiment learnings. Keep it lightweight; 2–3 minutes max.
## Key Takeaways
@@ -159,4 +200,4 @@ To get started:
2. **Share learnings widely.** Make results visible to other teams so everyone benefits.
3. **Treat experimentation as a repeatable process**, not a one-off.
-Learn more about the [Experiments Report](/docs/reports/experiments).
\ No newline at end of file
+Learn more about the [Experiments Report](/docs/reports/experiments).
diff --git a/pages/guides/guides-by-use-case/engage-your-users/ship-features.mdx b/pages/guides/guides-by-use-case/engage-your-users/ship-features.mdx
new file mode 100644
index 0000000000..6dc9d2c33b
--- /dev/null
+++ b/pages/guides/guides-by-use-case/engage-your-users/ship-features.mdx
@@ -0,0 +1,142 @@
+import { Callout, Steps } from 'nextra/components'
+
+# How to Safely Ship New Features with Flags
+
+Feature Flags give your team the power to launch confidently, roll back instantly, and measure success in real time. Mixpanel integrates feature flagging directly into your analytics—so every rollout is both controlled and measurable.
+
+## What Are Feature Flags?
+
+Feature Flags decouple deployment (when code ships) from release (when users see it). This means engineers can deploy at any time, while product teams control when—and to whom—new functionality appears.
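+
+To make the deploy-versus-release split concrete, here is a minimal, hypothetical sketch: both code paths ship together, and a flag value evaluated at run time controls which one users see. The `isOnboardingV2Enabled` parameter and both render functions are illustrative stand-ins, not a specific Mixpanel API.
+
+```ts
+// Both variants are deployed together; the flag decides which one a given
+// user actually sees at run time.
+function renderLegacyOnboarding(): string {
+  return "legacy onboarding";
+}
+
+function renderOnboardingV2(): string {
+  return "new guided onboarding";
+}
+
+// `isOnboardingV2Enabled` stands in for whatever value your flag SDK
+// evaluates for the current user.
+function renderOnboarding(isOnboardingV2Enabled: boolean): string {
+  return isOnboardingV2Enabled ? renderOnboardingV2() : renderLegacyOnboarding();
+}
+
+console.log(renderOnboarding(false)); // "legacy onboarding" until the flag is enabled
+```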
+
+Mixpanel's native feature flags integrate directly with your data, so you can track adoption, performance, and impact without additional setup.
+
+---
+
+## 1. Prepare for Rollout
+
+Feature flagging starts with setup. Your engineering team must enable flag evaluation within your app before flags can be controlled in Mixpanel.
+
+Steps to take:
+
+
+
+### Implement the Mixpanel SDK
+
+Make sure to initialize the library with the config option `flags: true` (a minimal sketch follows the SDK links below).
+
+- [JavaScript SDK flags setup →](/docs/tracking-methods/sdks/javascript/javascript-flags)
+- [iOS / Swift SDK flags setup →](/docs/tracking-methods/sdks/swift/swift-flags)
+- [Android SDK flags setup →](/docs/tracking-methods/sdks/android/android-flags)
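+
+As a minimal sketch of the web setup (assuming the `mixpanel-browser` package and a placeholder project token), enabling the option at initialization looks like this:
+
+```ts
+import mixpanel from "mixpanel-browser";
+
+// Initialize the SDK with flag evaluation enabled via the `flags: true`
+// config option described above. Replace the placeholder with your token.
+mixpanel.init("YOUR_PROJECT_TOKEN", {
+  flags: true,
+});
+```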
+
+
+### Define fallback variants in code to prevent errors if the flag fails to resolve.
+
+### Verify flag evaluation using QA Testers before releasing to real users.
+
+
+
+- ✅ **Do** confirm the SDK setup and flag evaluation flow with Engineering before creating a feature flag in Mixpanel.
+- ❌ **Do not** create flags in the UI before the SDK is integrated—the flag won't control behavior in your product until the SDK is configured.
+
+
+ **Pitfall:** Skipping SDK setup causes flags to appear “inactive” even if configured correctly in Mixpanel.
+
+
+---
+
+## 2. Plan a Safe Rollout
+
+With SDK setup complete, decide how to roll out the feature. Use data-driven rollout strategies to manage risk and learn quickly.
+
+Common rollout types:
+- **Phased Rollout**: Start with 5–10% of users, expanding gradually.
+- **Cohort-Based Targeting**: Show the feature to specific segments (e.g. power users or new signups).
+- **Platform Segmentation**: Launch to a single platform (like iOS) before web.
+- **Feature Gate**: Keep a kill switch ready for instant rollback.
+
+
+ **Pro tip:** Combine a phased rollout with [alerts](/docs/features/alerts) on key metrics—like conversion or retention—to catch regressions early.
+
+
+---
+
+## 3. Configure Your Flag in Mixpanel
+
+Once your SDK is live, create and manage feature flags directly in the Mixpanel UI.
+
+Steps to take:
+
+
+
+### Create a new flag.
+
+e.g. `onboarding_v2`
+
+### Add a variant and define rollout percentages.
+
+### Target users by cohort or run-time properties.
+
+e.g. `platform = android`
+
+### Use QA Testers to assign test variants and confirm logic.
+
+### Monitor exposure and adoption in real time, adjusting rollout weights as you scale.
+
+
+
+
+ **Pro tip:** Use consistent flag and variant names across systems to keep your analytics clean.
+
+
+### How it Works
+
+When your app loads, the Mixpanel SDK automatically fetches active flags and evaluates which variant each user should see. Once a variant is displayed, Mixpanel logs an exposure event (`$experiment_started`), allowing you to track adoption and impact immediately in your reports—no manual tagging needed.
+
+
+
+[📖 Full Feature Flags documentation →](/docs/featureflags)
+
+---
+
+## 4. Monitor and Learn
+
+After launch, use Mixpanel analytics to determine whether to expand, iterate, or revert.
+
+**Tools to Help**
+- [Insights](/docs/reports/insights): Measure adoption and engagement by variant.
+- [Funnels](/docs/reports/funnels/funnels-overview): Track conversion rates between control and variant users.
+- [Boards](/docs/boards): Create a shared rollout dashboard with annotations and alerts.
+- [Session Replay](/docs/session-replay): Watch user interactions to diagnose UX issues.
+
+
+ **Pro tip:** Use [borrowed properties](/docs/features/custom-properties#borrowed-properties) or [cohorts](/docs/users/cohorts) to track downstream behavior for users exposed to each variant.
+
+
+ 👉 **Do this next:** Validate exposure counts and cohort membership before scaling.
+
+ ---
+
+## 5. Govern and Clean Up
+
+Feature flags are powerful, but unmanaged flags become “flag debt.” Maintain hygiene by:
+- Assigning an **owner** for each flag.
+- **Auditing** monthly or quarterly.
+- Keeping flags scoped to one use case.
+- **Documenting** rollout intent and success criteria.
+- **Sunsetting** flags once a feature reaches 100% rollout.
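+
+One lightweight way to keep this hygiene visible is a small flag registry checked into your repo. This is purely a team-convention sketch, not a Mixpanel feature; the flag name and fields are illustrative:
+
+```ts
+// Hypothetical flag registry: one entry per active flag, reviewed during
+// your regular flag audit.
+interface FlagRecord {
+  owner: string;          // who is accountable for the flag
+  intent: string;         // why it exists and what success looks like
+  rolloutPercent: number; // current rollout weight
+  sunsetBy: string;       // date by which the flag should be removed
+}
+
+const flagRegistry: Record<string, FlagRecord> = {
+  onboarding_v2: {
+    owner: "growth-team",
+    intent: "Guided onboarding test; success = higher activation",
+    rolloutPercent: 100,
+    sunsetBy: "2025-06-30",
+  },
+};
+
+// Flags already at 100% rollout are candidates for cleanup in the next sprint.
+const cleanupCandidates = Object.entries(flagRegistry)
+  .filter(([, record]) => record.rolloutPercent === 100)
+  .map(([name]) => name);
+
+console.log(cleanupCandidates); // ["onboarding_v2"]
+```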
+
+
+**Pro tip:** Add flag cleanup reviews to your sprint rituals to stay organized and efficient.
+
+
+---
+
+## Key Takeaways
+
+- Prepare your setup by enabling flag support in your SDK so Mixpanel can control feature visibility.
+- Plan safe rollouts using phased or cohort-based release strategies to manage risk.
+- Configure and control flags directly in Mixpanel—no redeploy required.
+- Monitor adoption and performance with built-in analytics to decide when to expand or revert.
+- Govern and clean up flags regularly to prevent confusion and keep your implementation scalable.
+
+📚 **Go deeper:** [Feature Flags overview in Mixpanel Docs](/docs/featureflags)
diff --git a/pages/guides/guides-by-use-case/grow-your-usership/_meta.ts b/pages/guides/guides-by-use-case/grow-your-usership/_meta.ts
new file mode 100644
index 0000000000..9a88eec25e
--- /dev/null
+++ b/pages/guides/guides-by-use-case/grow-your-usership/_meta.ts
@@ -0,0 +1,3 @@
+export default {
+ "grow-revenue": "Grow Revenue"
+}
\ No newline at end of file
diff --git a/pages/guides/guides-by-use-case/grow-your-usership/grow-revenue.mdx b/pages/guides/guides-by-use-case/grow-your-usership/grow-revenue.mdx
new file mode 100644
index 0000000000..2447ca1148
--- /dev/null
+++ b/pages/guides/guides-by-use-case/grow-your-usership/grow-revenue.mdx
@@ -0,0 +1,148 @@
+import { Callout, Steps } from 'nextra/components'
+
+# Understand and Grow Revenue with Mixpanel
+
+[Revenue analytics](/guides/strategic-playbooks/onboarding-playbook/launch/revenue-analytics) connects product usage to business results—helping you see which actions drive paying customers, where money comes from, and how to increase it.
+
+---
+
+## Before You Begin
+
+Revenue analytics in Mixpanel requires Warehouse Connectors. To analyze revenue, you will need to sync your billing or transactional data (for example, from your payment platform or internal finance systems) into Mixpanel using [Warehouse Connectors](/docs/tracking-methods/warehouse-connectors).
+
+After your warehouse is connected, set up the data structure that fits your business model:
+- **Transactional revenue** (e.g. one-time purchases, orders): Use [Mirror](/docs/tracking-methods/warehouse-connectors#mirror) to bring purchase events from your warehouse into Mixpanel. Mirror automatically syncs new transactions on a scheduled cadence.
+- **Subscription-based revenue** (e.g. recurring plans, renewals): Use [Profile History](/docs/tracking-methods/warehouse-connectors#user-profiles) to track subscription changes, upgrades, and cancellations over time.
+
+Once your data source is connected, you can link behavioral data (from Mixpanel events) with business data (from Mirror, Profile History, and Warehouse Connectors) to understand how user actions drive revenue outcomes.
+
+📖 [Set up Warehouse Connectors →](/docs/tracking-methods/warehouse-connectors#step-1-connect-a-warehouse)
+
+
+ **Pro tip:** Make sure your revenue tables include shared identifiers such as `user_id` or `account_id` so Mixpanel can link purchases back to user actions.
+
+
+---
+
+## What Is Revenue Analytics?
+
+Revenue analytics lets you connect product behavior to business impact. In Mixpanel, this means tracking events that represent value exchange (signups, upgrades, purchases, renewals) and tying them to monetary outcomes.
+
+When set up correctly, you can:
+- See which actions or features correlate with higher spend
+- Determine which in-trial actions lead to higher customer lifetime value after conversion
+- Understand retention patterns among paying customers
+- Align product and finance teams on shared metrics
+
+By following the steps in this guide, you will learn how to track, verify, and act on your revenue data to fuel growth.
+
+---
+
+### 1. Capture the Right Revenue Events
+
+Once your warehouse connection is live, focus on the key moments that represent a value exchange—either when money is transferred or when a commitment to future revenue is established.
+
+- ✅ **Do** focus on meaningful milestones like `subscription_created`, `upgrade_completed`, or `payment_success`.
+- ❌ **Do not** log every possible interaction as a revenue event.
+
+Each event should include:
+- `amount` (numeric)
+- `currency` (ISO code)
+- `plan_type` (or equivalent)
+- `user_id` or `account_id`
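+
+For illustration, here is a hedged sketch of the shape such an event might take once it reaches Mixpanel; the event name and values are examples, and the optional multi-currency fields follow the pro tip below:
+
+```ts
+// Illustrative shape of a single revenue event synced from the warehouse.
+interface RevenueEvent {
+  event: string;            // e.g. "payment_success"
+  amount: number;           // numeric revenue amount
+  currency: string;         // ISO code, e.g. "USD"
+  plan_type: string;        // or an equivalent plan identifier
+  account_id: string;       // shared identifier that links back to user actions
+  base_currency?: string;   // optional: see the multi-currency tip below
+  base_ccy_amount?: number;
+}
+
+const example: RevenueEvent = {
+  event: "payment_success",
+  amount: 49,
+  currency: "USD",
+  plan_type: "pro_monthly",
+  account_id: "acct_0123",
+};
+```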
+
+
+ **Pro tip:** For multi-currency transactions, include `base_currency` and `base_ccy_amount` so Mixpanel can convert all revenue to a single base currency for consistent reporting.
+
+
+
+ **Pitfall:** Missing any of these properties will prevent Mixpanel from rolling up revenue totals accurately.
+
+
+---
+
+### 2. Verify and Define Your Revenue Data
+
+Before you begin analysis, confirm that Mixpanel is reading your data correctly.
+
+Steps to take:
+
+
+
+### Tag your revenue events in Lexicon.
+
+Tagging your revenue events makes them discoverable and easy to reference across reports.
+
+### Check event integrity.
+
+Look for duplicate event names, missing properties, or inconsistent casing.
+
+### Compare totals against your source-of-truth.
+
+Check your totals against your finance system or data warehouse to confirm that imported revenue matches what you expect.
+
+
+
+Note: Warehouse Connector data updates on a scheduled sync (not in real time), so totals will reflect the most recent completed sync rather than live revenue changes.
+
+
+ **Pro tip:** Ask a finance or data engineer to confirm that your daily imports are up to date and correctly formatted.
+
+
+📖 [Learn more about data validation →](/docs/tracking-best-practices/debugging)
+
+---
+
+### 3. Analyze What Drives Revenue
+
+Once your data is verified, you can start exploring what behaviors drive your business outcomes. There are many ways to analyze revenue in Mixpanel—each highlighting a different aspect of user value, conversion, or retention.
+
+One simple but powerful example is **Average Revenue Per User (ARPU).**
+
+**How it is calculated:**
+
+ $\text{Average Revenue Per User (ARPU)} = \frac{\text{Sum of Revenue Over Time}}{\text{Number of Unique Users}}$
+
+
+Tracking ARPU helps you understand how much revenue each active user generates during a given period. It is a great starting point for comparing user segments, evaluating pricing changes, or measuring growth trends over time.
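+
+For example, with purely illustrative numbers: a segment that generates 12,500 in revenue across 2,500 unique users in a month has an ARPU of $\frac{12{,}500}{2{,}500} = 5$ for that month.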
+
+**Tools to Help**
+
+- [Insights](/docs/reports/insights): Visualize ARPU trends and spot spikes or dips after product or pricing changes.
+- [Funnels](/docs/reports/funnels/funnels-overview): See how user behaviors along the purchase path affect downstream revenue.
+- [Cohorts](/docs/users/cohorts): Compare high-value cohorts (e.g. annual plan users) with lower-value ones to identify drivers of growth.
+
+
+**Example:** Explore how ARPU looks in practice in our [E-commerce demo project](https://mixpanel.com/s/300OIR). This sample dashboard uses an **Insights** report to calculate revenue per user over time, helping teams visualize revenue trends in real Mixpanel data.
+
+ARPU is just one approach to revenue analysis.
+
+- For transactional revenue, you can explore metrics like **Revenue per Purchase Event** or **Revenue by Feature Usage** to uncover which actions or products generate the most value.
+- For subscription-based revenue, try tracking **Recurring Revenue Change** or **Churned Revenue** to understand how upgrades, downgrades, and cancellations impact your overall recurring revenue growth.
+
+---
+
+### 4. Connect Insights to Action
+
+Revenue analytics is only valuable if it drives decisions. Once you have identified what moves the needle, turn insights into action across teams.
+
+- Share [Boards](/docs/boards) with product and growth teams to align on revenue drivers.
+- Set up [Alerts](/docs/features/alerts) to notify you of significant revenue drops or spikes.
+- Add [Annotations](/docs/features/annotations) to track launches, pricing changes, or campaigns that may explain revenue shifts.
+
+
+ **Pro tip:** When you notice a revenue dip or spike, dig into which user actions changed around the same time—and share those findings with product or marketing so they can act on them quickly. Set up a quarterly review to ensure data stays current and your insights remain actionable.
+
+
+---
+
+## Key Takeaways
+
+- Revenue analytics in Mixpanel requires Warehouse Connectors to import financial data.
+- Track only the key events that represent real value exchange.
+- Always verify totals and event integrity before drawing conclusions.
+- Use Funnels, Insights, and Cohorts to discover what behaviors drive conversions.
+- Refresh and share dashboards regularly to align business and product teams.
+
+
+📚 **Go deeper:** [Mixpanel Docs on Revenue Analytics](/guides/strategic-playbooks/onboarding-playbook/launch/revenue-analytics)
diff --git a/pages/guides/guides-by-workflow/_meta.ts b/pages/guides/guides-by-workflow/_meta.ts
new file mode 100644
index 0000000000..a15f909a74
--- /dev/null
+++ b/pages/guides/guides-by-workflow/_meta.ts
@@ -0,0 +1,4 @@
+export default {
+ "build-tracking-strategy": "Build Tracking Strategy",
+ "ensure-data-quality": "Ensure Data Quality"
+}
\ No newline at end of file
diff --git a/pages/guides/guides-by-use-case/build-a-tracking-strategy.mdx b/pages/guides/guides-by-workflow/build-tracking-strategy.mdx
similarity index 100%
rename from pages/guides/guides-by-use-case/build-a-tracking-strategy.mdx
rename to pages/guides/guides-by-workflow/build-tracking-strategy.mdx
diff --git a/pages/guides/guides-by-use-case/ensuring-data-quality.mdx b/pages/guides/guides-by-workflow/ensure-data-quality.mdx
similarity index 100%
rename from pages/guides/guides-by-use-case/ensuring-data-quality.mdx
rename to pages/guides/guides-by-workflow/ensure-data-quality.mdx
diff --git a/pages/guides/mixpanel-introduction.mdx b/pages/guides/mixpanel-introduction.mdx
new file mode 100644
index 0000000000..1dc86f09e0
--- /dev/null
+++ b/pages/guides/mixpanel-introduction.mdx
@@ -0,0 +1,122 @@
+import { Callout } from 'nextra/components'
+
+# Mixpanel Introduction
+
+Mixpanel is a **digital analytics platform** that helps teams continuously improve their products by turning data into action. At its core, Mixpanel supports a **continuous innovation loop**—helping you observe what users do, analyze why it happens, decide what to do next, and act on those insights.
+
+
+ **Why it matters:** Teams use Mixpanel to learn faster, align decisions across functions, and measure the impact of every product change.
+
+
+---
+
+## See Mixpanel in Action
+
+
+
+
+ {/* Left: video with a min width so it wraps when narrow */}
+
+
+
+
+
+
+ {/* Right: text with a flexible basis */}
+
+
+
+
+---
+
+## Mixpanel Features That Power Continuous Innovation
+
+
+
+
+Mixpanel is built around a simple but powerful framework for continuous improvement: **Observe → Analyze → Decide → Act (OADA)**.
+
+Each stage in the OADA framework connects directly to Mixpanel’s tools, helping you move from data to observation to action—all in one platform.
+
+
+
+
+| 👀 **Observe** | 📊 **Analyze** | 💡 **Decide** | 🚀 **Act** |
+|---|---|---|---|
+| See what’s happening in your product with [Session Replay](/docs/session-replay), [Heatmaps](/docs/session-replay/heatmaps), [Autocapture](/docs/tracking-methods/autocapture), and [Alerts](/docs/features/alerts). | Explore [Insights](/docs/reports/insights), [Funnels](/docs/reports/funnels), [Flows](/docs/reports/flows), [Retention](/docs/reports/retention), and [Cohorts](/docs/users/cohorts) to find what moves your metrics. | Align on what to change next with [Metric Trees](/docs/metric_tree), [Boards](/docs/boards), [Annotations](/docs/features/annotations), and shared insights. | Measure impact with [Experiments](/docs/experiments) and ship improvements with [Feature Flags](/docs/featureflags). |
+
+All of this is powered by Mixpanel’s modern data foundation—bringing together AI-assisted analysis through features like [MCP Server](/docs/features/mcp), robust [data governance](/docs/data-governance) for accuracy and trust, and built-in collaboration tools that help teams move from insight to action faster.
+
+**Learn more:** [Go deeper on the OADA Loop →](/guides/guides-by-topic/continuous-innovation)
+
+---
+
+## Mixpanel Data Model
+
+Everything in Mixpanel starts with **events**––the building blocks of your data model.
+
+An event represents something a **user** does (like *Signed Up*, *Viewed a Product*, or *Completed Purchase*). Each event can include **properties** that add context, such as the user’s plan type, location, or device. Optionally, you can analyze data at the group level using [Group Analytics](/docs/data-structure/group-analytics).
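+
+As a quick, hypothetical sketch (the event name comes from the example above; property names and values are illustrative, and `mixpanel.track` is the standard SDK call for sending an event with properties):
+
+```ts
+import mixpanel from "mixpanel-browser";
+
+mixpanel.init("YOUR_PROJECT_TOKEN");
+
+// One event ("Signed Up") with properties that add context to the action.
+mixpanel.track("Signed Up", {
+  "Plan Type": "Free",
+  "Location": "San Francisco",
+  "Device": "mobile",
+});
+```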
+
+
+
+Together, these events and properties form a flexible data model that mirrors how people actually use your product. Once instrumented, you can analyze this data instantly—without writing SQL or waiting on an analyst.
+
+**Learn more:** [Dive deeper into how Mixpanel structures data →](/docs/data-structure/concepts)
+
+---
+
+## Keep Learning
+
+Keep building your Mixpanel expertise with these resources designed to help you learn, connect, and put insights into action.
+
+| Resource | Purpose | Link |
+|-----------|--------------|------|
+| **Community** | Connect with other Mixpanel users, share ideas, and learn how peers are tackling similar challenges. | [Open →](https://community.mixpanel.com/) |
+| **Developer Docs** | Build and extend Mixpanel with SDKs, APIs, and advanced implementation guides. | [Open →](https://developer.mixpanel.com/reference/overview) |
+| **Docs** | Explore product capabilities, setup guides, and detailed feature references. | [Open →](https://docs.mixpanel.com/) |
+| **Events** | Join live sessions and webinars to explore new features, use cases, and expert-led best practices. | [Open →](https://mixpanel.com/events) |
+| **Guides** | Apply Mixpanel best practices to real-world workflows and use cases. | 📍 You are here |
+
+Wherever you are in your Mixpanel journey, these resources will help you keep learning, stay connected, and keep improving.
diff --git a/pages/guides/self-guided-tours.mdx b/pages/guides/self-guided-tours.mdx
new file mode 100644
index 0000000000..6344474de4
--- /dev/null
+++ b/pages/guides/self-guided-tours.mdx
@@ -0,0 +1,119 @@
+import SelfGuidedTours from '../../components/SelfGuidedTours'
+
+# Self-Guided Product Tours
+
+Discover Mixpanel’s core features through quick, self-guided tours.
+Select any card below to explore interactive walkthroughs of key workflows.
+
+{/*
+ Cards API (examples):
+
+ {
+ badge: 'PRODUCT OVERVIEWS', // Pill label shown above the title
+ title: 'Mixpanel Experiments', // Main title
+ blurb: 'Setup a Mixpanel Experiment', // Short supporting text (optional)
+ img: '/navattic/launch-an-experiment.png', //Image src (public/… 314x139 pixels)
+ navatticOpen: 'cmfkxwfa5000004lc8408f5wi', // Navattic demo id (popup)
+ navatticTitle: 'Launch an Experiment', // Optional: popup title
+ // href: 'https://example.com' // (Alternative) card links out if provided
+ }
+
+ Notes:
+ - If `navatticOpen` is present, the card triggers a Navattic popup.
+ - If `img` is omitted, a dark placeholder fills the image area.
+ - If `href` is provided (and `navatticOpen` is not), the card becomes a link.
+ - Images should live under /public/navattic/ for easy referencing.
+*/}
+
+
+
+{/* Section CTA — content lives in MDX so it’s easy to change or A/B test */}
+
+
+
diff --git a/pages/guides/strategic-playbooks/onboarding-playbook/launch/_meta.ts b/pages/guides/strategic-playbooks/onboarding-playbook/launch/_meta.ts
index 6d4a964c19..4e0130f879 100644
--- a/pages/guides/strategic-playbooks/onboarding-playbook/launch/_meta.ts
+++ b/pages/guides/strategic-playbooks/onboarding-playbook/launch/_meta.ts
@@ -1,9 +1,5 @@
export default {
- "create-boards": "Create Boards",
- "discover-insights": "Discover Insights",
- "analyze-conversions": "Analyze Conversions",
- "build-user-flows": "Build User Flows",
- "track-user-retention": "Track User Retention",
- "define-cohorts": "Define Cohorts",
- "revenue-analytics": "Revenue Analytics"
-}
+ "roll-out": "Roll Out",
+ "train-users": "Train Users",
+ "drive-adoption": "Drive Adoption",
+}
\ No newline at end of file
diff --git a/pages/guides/strategic-playbooks/onboarding-playbook/launch/drive-adoption.mdx b/pages/guides/strategic-playbooks/onboarding-playbook/launch/drive-adoption.mdx
new file mode 100644
index 0000000000..cf37466f9f
--- /dev/null
+++ b/pages/guides/strategic-playbooks/onboarding-playbook/launch/drive-adoption.mdx
@@ -0,0 +1,93 @@
+import { Callout } from 'nextra/components'
+
+# Activate Your Analytics Culture: A Guide to Driving Mixpanel Adoption
+
+Rolling out Mixpanel is just the beginning. True success happens when your teams use data to make faster, smarter decisions every day.
+
+This guide walks you through how to **execute an adoption plan** that builds lasting habits, empowers champions, and turns analytics into action.
+
+---
+
+## What Is an Adoption Plan?
+
+An adoption plan ensures your Mixpanel rollout becomes part of how your organization *thinks and works*, not just another tool to log into.
+
+Without one, teams often stall after setup—reports go unused, dashboards get stale, and decisions revert to intuition. With a strong plan, Mixpanel becomes a daily part of your company’s operating rhythm.
+
+---
+
+## Select Your Adoption Approach
+
+Organizations typically scale Mixpanel adoption in one of two ways:
+
+### Team-by-Team Approach
+
+If you’re just getting started, it usually makes sense to start small.
+
+- Start with one or two high-impact teams—often Product, Growth, or Engineering.
+- Focus on clear use cases that demonstrate value quickly, like activation or retention analysis.
+- Use early wins to refine your training, data setup, and communication plan.
+ - ✅ **Best when**: You are building confidence and proving Mixpanel’s value before scaling.
+ - ⚠️ **Watch out for**: Expanding too fast before teams have strong habits or success stories to share.
+
+### Multi-Team Approach
+
+If your organization is ready for broader rollout, expand adoption across several teams in parallel.
+
+- Ensure your tracking plan, taxonomy, and enablement materials are consistent.
+- Host joint enablement sessions to align teams on shared metrics and best practices.
+- Encourage collaboration across teams to compare insights and share dashboards.
+ - ✅ **Best when**: Mixpanel foundations are solid and multiple teams are data-curious.
+ - ⚠️ **Watch out for**: Misalignment—without a shared data language, multi-team adoption can fragment quickly.
+
+
+ **Pro tip:** Most organizations start team-by-team to learn what works, then graduate to a multi-team rollout once data structures and champions are established.
+
+
+---
+
+## Keep Momentum and Build a Data Habit
+
+Adoption isn't a one-time event—it's a practice. The goal is to weave Mixpanel into everyday decision-making.
+
+**Here is how to sustain it:**
+
+- **Celebrate data wins.** Highlight teams that made impactful decisions using Mixpanel.
+- **Automate reminders.** Use [alerts](/docs/features/alerts) to deliver insights automatically.
+- **Document learnings.** Add [annotations](/docs/features/annotations) to track what happened when metrics changed.
+- **Review usage trends.** Periodically review Mixpanel usage to see which reports or dashboards are most valuable.
+
+
+ **Pitfall:** Without continuous reinforcement, adoption can fade quickly after launch.
+
+
+👉 **Do this next:** Schedule regular check-ins with champions and leadership to review wins and adjust enablement priorities.
+
+---
+
+## Scale What Works
+
+Once your pilot teams are thriving, scale adoption intentionally—not all at once. Use data and feedback from early adopters to refine your approach.
+
+**Steps to take:**
+
+- **Document Lessons Learned**: Capture what worked and what didn't during early rollout.
+- **Build Internal Templates**: Standardize dashboards or funnels that new teams can copy.
+- **Onboard New Teams**: Host shorter, team-specific intros that highlight relevant Mixpanel use cases.
+- **Measure Progress**: Track organization-wide adoption metrics—e.g. % of active Mixpanel users, reports viewed, or alerts configured.
+
+
+ **Pro tip:** Make Mixpanel part of your onboarding for every new employee—one login session and one “first insight” is all it takes to build the habit.
+
+
+---
+
+## Key Takeaways
+
+Adopting Mixpanel is about more than setup—it’s about creating lasting data habits that shape how teams work.
+
+- It's often a good idea to start small, prove value, then scale to more teams.
+- Celebrate wins and make data part of daily work.
+- Standardize success and track adoption over time.
+
+👉 **Do this next:** Ready to put this into practice? Download the [Mixpanel Adoption Plan](https://docs.google.com/spreadsheets/d/12FJDLLTpOgw-5B4_EoA9coUkp6eM1kKAwr2mAgiNrds/edit?gid=1651252994#gid=1651252994) to outline, track, and measure your adoption journey.
diff --git a/pages/guides/strategic-playbooks/onboarding-playbook/launch/roll-out.mdx b/pages/guides/strategic-playbooks/onboarding-playbook/launch/roll-out.mdx
new file mode 100644
index 0000000000..8c286e854d
--- /dev/null
+++ b/pages/guides/strategic-playbooks/onboarding-playbook/launch/roll-out.mdx
@@ -0,0 +1,115 @@
+import { Callout } from 'nextra/components'
+
+# Roll Out Mixpanel with Confidence
+
+Move from implementation to adoption with a clear plan for deploying, rolling out, and rallying your teams.
+
+---
+
+## Deploy to Production
+Your Mixpanel setup is complete—now it's time to take it live. Deploying to production means you are connecting your tested implementation to your real, live user data.
+
+### Reiterate Benefits
+
+When you go live with Mixpanel, you enable your teams to:
+
+- See **real user behavior** as it happens.
+- Track **key activation and retention metrics**.
+- Test **hypotheses quickly** with data-backed evidence.
+- Build a **shared understanding** of product performance across teams.
+
+Before rollout, remind stakeholders that this phase turns your tracking plan into tangible value. It's where insight begins.
+
+### Deployment Checklist
+
+For deployment, confirm that you’ve:
+
+- [Completed QA](/guides/guides-by-workflow/ensure-data-quality) in your staging project.
+- Prepared your [production project](/docs/orgs-and-projects/managing-projects#creating-projects).
+- [Enabled tracking](/docs/quickstart/capture-events/track-events) in your production project.
+- Triggered historical backfilling (if necessary).
+- Monitored live data for any potential anomalies.
+- Implemented your [data governance strategy](/docs/data-governance).
+- [Granted production access](/docs/orgs-and-projects/roles-and-permissions#invite-users) to the correct users.
+
+For a detailed, step-by-step version of this list, check out the full [Mixpanel Production Deployment Checklist](https://docs.google.com/spreadsheets/d/12FJDLLTpOgw-5B4_EoA9coUkp6eM1kKAwr2mAgiNrds/edit?gid=751571839#gid=751571839). It covers everything from validation in staging to go-live monitoring.
+
+
+ **Pro tip:** Keep a small monitoring window after deployment (24–48 hours) to catch any anomalies before promoting org-wide usage.
+
+
+---
+
+## Roll Out Mixpanel
+
+Once your production deployment is verified, the focus shifts to planning for adoption—getting the right people using Mixpanel effectively.
+
+### Plan Your Adoption
+
+Start with a short plan that outlines:
+
+- **Key success metrics:** Decide what adoption looks like—e.g. weekly active Mixpanel users, reports viewed, or dashboard subscriptions.
+- **Your launch goal:** For example, “Every PM and designer logs into Mixpanel and saves a report within 1 week.”
+- **Timeline:** Who needs access first, what happens on launch day, and how you'll follow up.
+
+
+ **Pro tip:** Use the Mixpanel Usage report in Mixpanel Settings to monitor which teammates are actively using Mixpanel.
+
+
+---
+
+### Identify Stakeholders and Champions
+
+A strong rollout is driven by the right people:
+
+| Role | Responsibility |
+|--------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| **Executive Sponsor** | Drives Mixpanel adoption by securing resources, removing roadblocks, and ensuring alignment with company goals. |
+| **Team Leads**     | Lead Mixpanel adoption within their team by setting KPIs, providing training and resources, and reinforcing data-driven decisions through regular use of Mixpanel.   |
+| **Mixpanel Champions** | Enthusiastic Mixpanel advocates who inspire peer adoption, answer questions, provide training, and share best practices. |
+
+Encourage each team to nominate a *Mixpanel Champion* who can model usage and surface success stories.
+
+👉 **Do this next:** Establish a cadence—monthly data reviews or a shared “insight of the week”—to keep Mixpanel relevant.
+
+---
+
+## Communicate the Launch
+
+Treat your rollout like a product launch. Build excitement, show value, and invite participation.
+
+**Before Launch**
+
+- Share a message from your sponsor or product lead explaining why Mixpanel is being introduced.
+- Preview what teams will gain (“We’ll be able to answer questions like where users drop off in signup”).
+- Point to a “Start Here” dashboard and short walkthrough.
+
+**Launch Day**
+
+- Host a short demo or all-hands to introduce Mixpanel.
+- Encourage each team to explore one key report.
+- Celebrate first discoveries in Slack or during standups.
+
+**After Launch**
+
+- Share early insights (“We discovered our activation rate doubled after signup simplification”).
+- Ask for feedback and use it to refine your dashboards or events.
+- Reinforce success in monthly reviews.
+
+
+ **Pro tip:** Save your launch message and templates—you can reuse them for new teams or product lines later.
+
+
+👉 **Do this next:** Download the [Mixpanel Adoption Execution Checklist](https://docs.google.com/spreadsheets/d/12FJDLLTpOgw-5B4_EoA9coUkp6eM1kKAwr2mAgiNrds/edit?gid=2005429816#gid=2005429816) and use it to help guide your rollout beyond launch. It covers how to communicate progress, celebrate team successes, and keep momentum strong through ongoing education and evangelism.
+
+---
+
+## Key Takeaways
+
+- When you deploy Mixpanel to production, you bring your implementation to life. When you roll it out thoughtfully, you build a data-driven culture.
+- Verify your setup in production before rolling out broadly.
+- Align on adoption goals and track them from day one.
+- Appoint champions to sustain usage across teams.
+- Communicate clearly and celebrate early wins.
+
+👉 **Do this next:** Make a copy of the [Mixpanel Production Deployment Checklist](https://docs.google.com/spreadsheets/d/12FJDLLTpOgw-5B4_EoA9coUkp6eM1kKAwr2mAgiNrds/edit?gid=751571839#gid=751571839) and use it to validate your setup before rollout. It’s the fastest way to confirm your setup is complete and Mixpanel is ready for production.
\ No newline at end of file
diff --git a/pages/guides/strategic-playbooks/onboarding-playbook/launch/train-users.mdx b/pages/guides/strategic-playbooks/onboarding-playbook/launch/train-users.mdx
new file mode 100644
index 0000000000..3e15139a8e
--- /dev/null
+++ b/pages/guides/strategic-playbooks/onboarding-playbook/launch/train-users.mdx
@@ -0,0 +1,78 @@
+import { Callout, Steps } from 'nextra/components'
+
+# Train Your Team to Use Mixpanel Effectively
+
+Learn how to onboard, train, and scale Mixpanel knowledge across your organization using built-in Mixpanel tools and free public resources.
+
+---
+
+## What is This Guide About?
+
+Whether you are just starting out or expanding Mixpanel usage to new teams, effective training ensures consistent data literacy, faster adoption, and better ROI. This guide helps you set up a repeatable internal enablement process.
+
+---
+
+## 1. Start with the Fundamentals
+
+Establish a shared understanding of Mixpanel’s purpose, value, and core workflows.
+
+- ✅ **Do** provide a unified introduction to Mixpanel’s “why” and “how.”
+- ❌ **Do not** assume each team member will figure it out on their own.
+
+**Steps to take:**
+
+1. Share Mixpanel's overview resources like the [Mixpanel Introduction](/guides/mixpanel-introduction).
+
+2. Encourage teams to explore the [Docs home page](/docs/what-is-mixpanel) for guided discovery.
+
+3. Highlight the [Mixpanel Community](https://community.mixpanel.com/) for Q&A and inspiration from peers.
+
+4. Review training videos on [Mixpanel's Core Reports](/guides/guides-by-topic/core-reports).
+
+
+
+ **Pitfall:** Skipping this step leads to inconsistent understanding of Mixpanel concepts across departments.
+
+
+---
+
+## 2. Tailor Training by Role
+
+Different teams use Mixpanel differently: PMs ask questions, engineers instrument data, and analysts interpret results.
+
+**Steps to take:**
+
+1. Identify each team's Mixpanel use cases.
+2. Pair each with relevant public resources (Docs, Community posts, blogs). Here are a few to get you started:
+ - **PMs & Analysts:** [Insights](/docs/reports/insights), [Funnels](/docs/reports/funnels), [Retention](/docs/reports/retention), [Cohorts](/docs/users/cohorts), [Experiments](/docs/experiments).
+ - **Engineers:** [SDK setup](/docs/quickstart/install-mixpanel), [Event tracking](/docs/quickstart/capture-events), [Feature Flags](/docs/featureflags) (see the short instrumentation sketch after this list for a hands-on first exercise).
+
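+A good first exercise for engineers is to instrument one meaningful event end to end. Below is a minimal sketch using the `mixpanel-browser` SDK; the event name, properties, and `YOUR_PROJECT_TOKEN` placeholder are illustrative, not a prescribed schema:
+
+```typescript
+import mixpanel from "mixpanel-browser";
+
+// Initialize once, early in app startup, with your project token.
+mixpanel.init("YOUR_PROJECT_TOKEN");
+
+// Tie activity to a known user so events appear on their profile.
+mixpanel.identify("user_123");
+
+// Track one well-named event with a few descriptive properties.
+mixpanel.track("Signed Up", {
+  plan: "free",
+  referral_source: "newsletter",
+});
+```
+
+Once the event shows up in Mixpanel, have the engineer build a simple Insights report on it; that instrument-verify-analyze loop is the workflow the rest of the training builds on.
+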
+
+ **Pro tip:** Create internal “office hours” where each team shares how they are using Mixpanel and what they have learned.
+
+
+---
+
+## 3. Reinforce and Scale
+
+Make Mixpanel learning continuous, not one-and-done.
+
+**Tools to Help**
+
+- [Mixpanel Community](https://community.mixpanel.com/): Stay connected with updates, webinars, and expert threads.
+- [Mixpanel Blog](https://mixpanel.com/blog/): Discover real-world use cases and product deep dives.
+- [Mixpanel Changelog](/changelogs): Keep teams up to date on new features.
+
+Ongoing engagement keeps Mixpanel top of mind and helps your team continuously improve.
+
+---
+
+## Key Takeaways
+
+- Anchor everyone in the why behind Mixpanel.
+- Customize learning by role.
+- Use Docs and Community for ongoing learning.
+- Make training a repeatable part of onboarding and growth.
+
+
+🌐 **Stay connected**: Join the [Mixpanel Community](https://community.mixpanel.com/) to share how you are driving adoption and learn from other champions rolling Mixpanel out across their organizations.
\ No newline at end of file
diff --git a/pages/guides/what-is-mixpanel.mdx b/pages/guides/what-is-mixpanel.mdx
deleted file mode 100644
index 61ef0b338d..0000000000
--- a/pages/guides/what-is-mixpanel.mdx
+++ /dev/null
@@ -1,75 +0,0 @@
-import ExtendedButton from "/components/ExtendedButton/ExtendedButton";
-
-# What is Mixpanel?
-
-Mixpanel will help you better understand your customers and answer questions about your product.
-It enables you to track how users engage with your product and analyze this data with interactive reports
-that let you query and visualize the results with just a few clicks.
-
-Mixpanel is built on three key concepts: [**Events**](#events), [**Users**](#users), and [**Properties**](#properties).
-
-
-
-
-
-
-
-
-## Concepts
-
-Before you get started, you should know three Mixpanel concepts:
-
- - **Events** are actions that happen in your product
- - **Users** are the people who use your product
- - **Properties** are the attributes of your users and events
-
-### Events
-
-An event is a data point that represents an interaction between a user and your product. Events can be a wide range of interactions.
-
-Imagine you run a cafe where customers can purchase a coffee via an app. Each purchase is an event that can be tracked in Mixpanel.
-
-
-
-### Users
-
-On the other side of an event is a user — the specific individual who completed an interaction with your product.
-
-Each user has a unique identifier that you can use to track their activity. This identifier can be an email address, a username, or a unique ID. Mixpanel uses a unique ID to identify users.
-
-
-
-### Properties
-
-You can track additional information about **users** and **events**. These details are called **properties**.
-
-An **Event Property** describes an event. For a coffee purchase, the event would be _Purchased Item_ and the event properties could be _type_ (in this case a Coffee) and _price_ (in this case $2.50).
-
-
-
-A **User Property** describes a User. This could be their name, email, age, etc.
-
-
-
-Properties allow you to create groups of users (aka [cohorts](/docs/users/cohorts)) and also enable you to filter for certain events or users. These
-powerful features make it easy to identify trends and new customer insights.
\ No newline at end of file
diff --git a/pages/overrides.scss b/pages/overrides.scss
index 05eada3fd2..56b1cc4ba2 100644
--- a/pages/overrides.scss
+++ b/pages/overrides.scss
@@ -696,6 +696,13 @@ article {
// opacity: 0.4;
// }
}
+
+ // Modify Steps
+ .nextra-steps h2:before, .nextra-steps h3:before, .nextra-steps h4:before {
+ background-color: colors.$purple20;
+ color: colors.$purple140;
+ }
+
//####Light Theme Code Syntax and Box####
code,
kbd,
@@ -1440,6 +1447,13 @@ article {
}
}
+ // Modify steps
+ .nextra-steps h2:before, .nextra-steps h3:before, .nextra-steps h4:before {
+ background-color: colors.$purple140;
+ color: colors.$purple20;
+ border-color: colors.$purple140;
+ }
+
//ID management banner style
.idManagementBanner {
.nextra-callout {
diff --git a/public/Data_Model_with_Group_Analytics.png b/public/Data_Model_with_Group_Analytics.png
new file mode 100644
index 0000000000..6dccea66c4
Binary files /dev/null and b/public/Data_Model_with_Group_Analytics.png differ
diff --git a/public/Experiments_and_FF.png b/public/Experiments_and_FF.png
new file mode 100644
index 0000000000..44d2e7b369
Binary files /dev/null and b/public/Experiments_and_FF.png differ
diff --git a/public/navattic/01_OverviewTour_314x139.png b/public/navattic/01_OverviewTour_314x139.png
new file mode 100644
index 0000000000..2104ebd1d3
Binary files /dev/null and b/public/navattic/01_OverviewTour_314x139.png differ
diff --git a/public/navattic/02_Insights_314x139.png b/public/navattic/02_Insights_314x139.png
new file mode 100644
index 0000000000..a3dd180ad8
Binary files /dev/null and b/public/navattic/02_Insights_314x139.png differ
diff --git a/public/navattic/03_Funnels_314x139.png b/public/navattic/03_Funnels_314x139.png
new file mode 100644
index 0000000000..86d4977681
Binary files /dev/null and b/public/navattic/03_Funnels_314x139.png differ
diff --git a/public/navattic/04_Autocapture_314x139.png b/public/navattic/04_Autocapture_314x139.png
new file mode 100644
index 0000000000..f066c72042
Binary files /dev/null and b/public/navattic/04_Autocapture_314x139.png differ
diff --git a/public/navattic/05_SessionReplay_314x139.png b/public/navattic/05_SessionReplay_314x139.png
new file mode 100644
index 0000000000..9570236093
Binary files /dev/null and b/public/navattic/05_SessionReplay_314x139.png differ
diff --git a/public/navattic/06_Experiments_314x139.png b/public/navattic/06_Experiments_314x139.png
new file mode 100644
index 0000000000..7baefb66d1
Binary files /dev/null and b/public/navattic/06_Experiments_314x139.png differ
diff --git a/public/oada-loop-detail.png b/public/oada-loop-detail.png
new file mode 100644
index 0000000000..d2aaaf24e4
Binary files /dev/null and b/public/oada-loop-detail.png differ
diff --git a/public/oada-loop-simple-wide-fcf9fa.png b/public/oada-loop-simple-wide-fcf9fa.png
new file mode 100644
index 0000000000..9e488230c2
Binary files /dev/null and b/public/oada-loop-simple-wide-fcf9fa.png differ
diff --git a/redirects/local.txt b/redirects/local.txt
index 723dd98181..6673e988bc 100644
--- a/redirects/local.txt
+++ b/redirects/local.txt
@@ -293,4 +293,15 @@
/guides/launch/track-user-retention /guides/strategic-playbooks/onboarding-playbook/launch/track-user-retention
/guides/launch/define-cohorts /guides/strategic-playbooks/onboarding-playbook/launch/define-cohorts
/guides/launch/revenue-analytics /guides/strategic-playbooks/onboarding-playbook/launch/revenue-analytics
-/guides/beyond-onboarding /guides/strategic-playbooks/onboarding-playbook/beyond-onboarding
\ No newline at end of file
+/guides/beyond-onboarding /guides/strategic-playbooks/onboarding-playbook/beyond-onboarding
+/guides/what-is-mixpanel /guides/mixpanel-introduction
+/guides/guides-by-use-case/driving-product-innovation /guides/guides-by-use-case/engage-your-users/drive-product-innovation
+/guides/guides-by-use-case/ensuring-data-quality /guides/guides-by-workflow/ensure-data-quality
+/guides/guides-by-use-case/build-a-tracking-strategy /guides/guides-by-workflow/build-tracking-strategy
+/guides/strategic-playbooks/onboarding-playbook/launch/create-boards /guides/guides-by-topic/core-reports/create-boards
+/guides/strategic-playbooks/onboarding-playbook/launch/discover-insights /guides/guides-by-topic/core-reports/discover-insights
+/guides/strategic-playbooks/onboarding-playbook/launch/analyze-conversions /guides/guides-by-topic/core-reports/analyze-conversions
+/guides/strategic-playbooks/onboarding-playbook/launch/build-user-flows /guides/guides-by-topic/core-reports/build-user-flows
+/guides/strategic-playbooks/onboarding-playbook/launch/define-cohorts /guides/guides-by-topic/core-reports/define-cohorts
+/guides/strategic-playbooks/onboarding-playbook/launch/track-user-retention /guides/guides-by-topic/core-reports/track-user-retention
+/guides/strategic-playbooks/onboarding-playbook/launch/revenue-analytics /docs/features/revenue-analytics
\ No newline at end of file