Module 2: Measuring Product-Market Fit

Learn the quantitative and qualitative methods for measuring PMF, from the Sean Ellis survey to retention analysis.

Module Tasks

Run the Sean Ellis survey

Deploy the 'How would you feel if you could no longer use [product]?' survey to your active users.

Time: 2 hours | Tools: Typeform/SurveyMonkey, email tool | Deliverable: PMF score (% very disappointed)

Analyze your retention curves

Calculate and visualize your retention by cohort to see if users are sticking around.

Time: 3 hours | Tools: Analytics platform, spreadsheet | Deliverable: Retention curve analysis

Set up NPS tracking

Implement Net Promoter Score surveys and establish a baseline.

Time: 2 hours | Tools: NPS tool, in-app surveys | Deliverable: NPS baseline score

Define and track engagement metrics

Identify your core engagement metrics and set up tracking dashboards.

Time: 3 hours | Tools: Analytics, dashboard tool | Deliverable: Engagement dashboard

Gather qualitative PMF data

Collect user testimonials, reviews, and feedback that provide qualitative PMF evidence.

Time: 2 hours | Tools: Review sites, support tickets | Deliverable: Qualitative PMF evidence doc

The Sean Ellis PMF Survey

Main Question:

"How would you feel if you could no longer use [Product]?"

  • Very disappointed: This is your target. 40%+ = PMF
  • Somewhat disappointed: These users like you but don't need you
  • Not disappointed: These users aren't getting value

Follow-up Questions:

  • What is the primary benefit you receive from [Product]?
  • What type of person do you think would benefit most from [Product]?
  • How can we improve [Product] for you?
  • What would you likely use as an alternative if [Product] were no longer available?

How to Interpret Results:

  • 0-20%: No PMF. Major pivot or changes needed; go back to customer discovery.
  • 20-40%: Approaching PMF. Getting warmer; focus on your 'very disappointed' users.
  • 40-60%: PMF achieved. Ready to scale; double down on what's working.
  • 60%+: Strong PMF. Significant market pull; scale aggressively.
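Scoring the survey is a simple percentage calculation. A minimal sketch, using a hypothetical sample of responses (the `pmf_score` helper and the sample data are illustrative, not part of any survey tool's API):

```python
from collections import Counter

def pmf_score(responses):
    """Return the % of respondents answering 'very disappointed'."""
    if not responses:
        return 0.0
    counts = Counter(r.strip().lower() for r in responses)
    return 100.0 * counts["very disappointed"] / len(responses)

# Hypothetical sample of 10 survey responses
sample = (["very disappointed"] * 5
          + ["somewhat disappointed"] * 3
          + ["not disappointed"] * 2)
score = pmf_score(sample)
print(f"PMF score: {score:.0f}%")  # prints "PMF score: 50%" -> above the 40% threshold
```

Only the "very disappointed" bucket counts toward the score; the other two answers matter for the follow-up analysis, not the headline number.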

Key Retention Metrics

Day 1 Retention

% of users who return the day after signup

Benchmark: 40-60% | Measures initial value delivery

Day 7 Retention

% of users who return a week after signup

Benchmark: 20-30% | Measures early habit formation

Day 30 Retention

% of users who return a month after signup

Benchmark: 10-20% | Measures sustained value

Weekly Active Users (WAU)

Users active at least once per week

Benchmark: Varies by product | Measures regular engagement

Monthly Active Users (MAU)

Users active at least once per month

Benchmark: Varies by product | Measures overall reach

DAU/MAU Ratio

Daily actives divided by monthly actives

Benchmark: 20%+ is good, 50%+ is exceptional | Measures stickiness
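The DAU/MAU ratio falls out directly from an activity log. A minimal sketch, assuming events arrive as `(user_id, date)` pairs and a trailing 30-day window for MAU (both are assumptions; real analytics platforms define the window their own way):

```python
from datetime import date, timedelta

def stickiness(events, as_of):
    """DAU/MAU ratio: users active on `as_of` divided by
    distinct users active in the trailing 30 days."""
    month_start = as_of - timedelta(days=29)
    dau = {u for u, d in events if d == as_of}
    mau = {u for u, d in events if month_start <= d <= as_of}
    return len(dau) / len(mau) if mau else 0.0

# Hypothetical activity log
today = date(2024, 3, 30)
events = [("a", today), ("b", today),
          ("c", today - timedelta(days=5)),
          ("d", today - timedelta(days=20)),
          ("e", today - timedelta(days=40))]  # outside the 30-day window
print(f"DAU/MAU: {stickiness(events, today):.0%}")  # prints "DAU/MAU: 50%"
```

Counting distinct users (sets, not raw event counts) is what keeps a single power user from inflating the ratio.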

Understanding Cohort Analysis

What it is: Cohort analysis groups users by when they signed up and tracks their behavior over time.

Why it matters: It reveals whether your product is getting better at retaining users and identifies what works for different user groups.

How to Read Cohort Charts:

  • Flattening curves: Users who stay past a certain point tend to keep staying; you've found a 'sticky' threshold.
  • Declining curves: Users continue to leave over time; an ongoing value-delivery problem.
  • Improving cohorts: Newer users retain better than older ones; your product is improving.
  • Worse cohorts: Newer users retain worse; something is broken or the market is changing.
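The underlying Day-N calculation is straightforward. A minimal sketch with hypothetical data, using the simplifying assumption that a user counts as retained only if they are active exactly N days after signup (many tools use "on day N or later" instead):

```python
from datetime import date, timedelta

def day_n_retention(signups, activity, n):
    """% of users active exactly n days after their signup date.

    signups:  {user_id: signup_date}
    activity: set of (user_id, active_date) pairs
    """
    if not signups:
        return 0.0
    returned = sum(1 for u, d in signups.items()
                   if (u, d + timedelta(days=n)) in activity)
    return 100.0 * returned / len(signups)

# Hypothetical cohort data
signups = {"a": date(2024, 1, 1), "b": date(2024, 1, 1), "c": date(2024, 1, 8)}
activity = {("a", date(2024, 1, 2)),   # a returns on day 1
            ("a", date(2024, 1, 8)),   # a returns on day 7
            ("c", date(2024, 1, 9))}   # c returns on day 1
for n in (1, 7):
    print(f"Day {n} retention: {day_n_retention(signups, activity, n):.0f}%")
```

Run this per signup cohort (e.g. one `signups` dict per week) and you have the rows of a cohort chart.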

Engagement Metrics Framework

1. Core Action

The single action that delivers primary value.

  • Examples: Sending a message (Slack), making a trade (Robinhood), publishing content (Medium)
  • Key question: What's the one thing users must do to get value?

2. Activation

The milestone that predicts long-term retention.

  • Examples: Completing onboarding, first successful use case, inviting team members
  • Key question: What action correlates with users who stay?

3. Engagement

Regular usage that indicates ongoing value.

  • Examples: Weekly logins, features used per session, time in product
  • Key question: What behavior shows users are getting regular value?

4. Expansion

Deepening usage and broader adoption.

  • Examples: Using advanced features, adding team members, upgrading plans
  • Key question: What shows users are getting more value over time?

Leading Indicators of PMF

These metrics predict PMF before you achieve it. Track them weekly.

Time to first value

How quickly users experience the core benefit

Target: As fast as possible - ideally minutes
Warning: If it takes days/weeks, you'll lose users before they see value

Activation rate

% of signups who complete key activation milestone

Target: 60%+ for most products
Warning: Low activation = onboarding or positioning problem

Usage frequency

How often users engage with core features

Target: Depends on product type
Warning: Declining frequency predicts churn

Feature adoption

% of users using key features

Target: 80%+ for core features
Warning: Low adoption = poor UX or wrong features
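The activation-rate indicator reduces to set arithmetic over user IDs. A minimal sketch with hypothetical data (the function name and the definition of "activated" are assumptions; plug in whatever milestone you defined in the framework above):

```python
def activation_rate(signup_ids, activated_ids):
    """% of signups that completed the activation milestone.

    Both arguments are collections of user IDs; intersecting them
    ignores 'activated' IDs that never appear in the signup list.
    """
    signups = set(signup_ids)
    if not signups:
        return 0.0
    return 100.0 * len(signups & set(activated_ids)) / len(signups)

# Hypothetical week of signups
signups = ["u1", "u2", "u3", "u4", "u5"]
activated = ["u1", "u2", "u4"]
rate = activation_rate(signups, activated)
print(f"Activation rate: {rate:.0f}%")  # prints "Activation rate: 60%" -> at the target
```

Computed weekly, this gives exactly the trend line the "track them weekly" advice calls for.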

Qualitative PMF Signals

Signal | Weight | How to Track
Unprompted testimonials | High | Users post praise on social without being asked
Word-of-mouth referrals | High | Track 'How did you hear about us?' responses
Customer support tone | Medium | Analyze support ticket sentiment over time
Feature request patterns | Medium | Users asking for MORE (expansion) vs. asking for BASICS (core problems)
Review site ratings | Medium | Track G2, Capterra, Product Hunt ratings over time
Competitive mentions | Medium | Users comparing you favorably to alternatives

Key Takeaways

  • 1. The Sean Ellis survey (40% "very disappointed") is the gold standard for measuring PMF.
  • 2. Retention is the ultimate measure of PMF: users who stay are users who got value.
  • 3. Cohort analysis reveals trends that aggregate metrics hide.
  • 4. Track leading indicators (activation, engagement) to predict PMF before you achieve it.
  • 5. Combine quantitative data with qualitative signals for a complete picture.