
Jobs, Features, and Insights: The JTBD Value Cycle for Product Analytics


Most SaaS companies drown in product KPI dashboards but miss the insights that predict success in marketing, sales, and product. In this article I explain our philosophy of focusing on JTBD instead of features, using our own JTBD Value Cycle.


Michel Kant

21 February 2025

Reading time: 30 min


Key Takeaways

The JTBD Value Cycle transforms product analytics: By following a framework that connects user needs to feature development and tracking, you make product decisions that are more focused and effective.

Better insights through purposeful tracking: Moving beyond "track everything" to strategic event tracking based on specific questions leads to actionable insights and clearer understanding of user success.

Product-led qualification drives better results: By focusing on actual product usage instead of marketing activities, you can identify truly qualified leads, resulting in higher conversion rates (20% vs 3%) and shorter sales cycles.

Clear categorization of event tracking: Strategic organization of tracking into categories (Core Adoption, Engagement Depth, Team Collaboration, etc.) helps predict user success and guide product decisions.

More effective cross-team alignment: This framework is particularly valuable for product, marketing, and sales teams as it provides a shared understanding of user value and helps identify the right moments for engagement.


If you're working in or with B2B (SaaS), you've probably heard this before: "We need to track everything!" But I've learned that what matters isn't tracking more - it's tracking smarter.

In this blog, I'll share how I've implemented an approach that brings outside-in thinking to product adoption, marketing, and development. At its heart is what we internally call The JTBD Value Cycle:

🔄 Jobs To Be Done (JTBD) → Feature Identification → Feature Development → Questions → Events Tracked → Insights → Better JTBD Understanding

This isn't just another analytics framework. It's a way to connect what users actually want to achieve (their jobs to be done) with what we build, what we track, and ultimately, how we deliver value. Whether you're a product manager, marketer, or analytics professional, this approach will change how you work. For me as Chief Product, it's the core of our way of working; as Chief Marketer, it drives how we do marketing; and as Chief Sales, it tells me who to reach out to.

This post is in English for now, since it was written from our internal documentation.


1. Introduction: The JTBD value cycle

"We're tracking everything!" is a phrase often heard in product meetings. But are we tracking the right things? More importantly, are we tracking things that actually matter to our users and our business? The difference lies in understanding what users are trying to achieve (Jobs To Be Done) versus how they're using our solution (Features). This distinction isn't just academic – it's the key to meaningful product analytics.

When a marketing agency uses our platform, they're not just "creating a dashboard" or "integrating Google Search Console." They're trying to "demonstrate SEO value to clients" or "identify growth opportunities across their client portfolio." Understanding this difference transforms how we deliver, what features we develop, and how we track them.

A. Understanding Jobs To Be Done (JTBD) vs Features

JTBD (Outside-in thinking) represents what users want to accomplish. It's the outcome they're seeking, independent of any specific solution. Features, on the other hand, represent our specific solutions to help users accomplish these jobs. This distinction is crucial because it helps us avoid the common trap of solution-first thinking.

Let's look at a concrete example:

JTBD:

  • "I want to quickly show my clients how their SEO is performing"

  • Context-independent

  • Focused on the outcome

  • Could be solved in multiple ways

  • Discovered through user research and conversations

Possible features:

  • GSC Integration

  • Custom dashboard creation

  • Automated reporting

  • White-labeling options

  • All represent different ways to solve the same job

  • Developed based on our understanding of the job

As you can see, one JTBD might require multiple features to be fully addressed, while a single feature might contribute to multiple jobs. This relationship isn't always one-to-one, and understanding this helps us build better solutions and track more meaningful metrics.
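To make this relationship concrete, here's a minimal sketch in Python of how the many-to-many mapping could be modeled; the job and feature names are illustrative, not our actual taxonomy:

```python
# Minimal sketch of the many-to-many relationship between jobs and features.
# Job and feature names are illustrative, not our actual taxonomy.
JOBS_TO_FEATURES = {
    "show_clients_seo_performance": [
        "gsc_integration",
        "custom_dashboards",
        "automated_reporting",
        "white_labeling",
    ],
    "identify_growth_opportunities": [
        "gsc_integration",  # the same feature serves multiple jobs
        "advanced_filtering",
    ],
}

def features_for_job(job: str) -> list[str]:
    """All features that contribute to a given job."""
    return JOBS_TO_FEATURES.get(job, [])

def jobs_for_feature(feature: str) -> list[str]:
    """All jobs a single feature contributes to."""
    return [job for job, feats in JOBS_TO_FEATURES.items() if feature in feats]

print(jobs_for_feature("gsc_integration"))
# ['show_clients_seo_performance', 'identify_growth_opportunities']
```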

B. The Complete Value Cycle

At the heart of product analytics lies a continuous cycle:

🔄 Jobs To Be Done (JTBD) → Feature Identification → Feature Development → Questions → Events Tracked → Insights → Better JTBD Understanding

This cycle shows us how user needs translate into actual solutions, what we need to measure, and how we measure success. Let's break down each component:

  1. Jobs To Be Done (JTBD)

  • Identified through customer conversations

  • Validated through market research

  • Prioritized based on user and business value

  • Example: "I need to prove my agency's SEO value to clients"

  2. Feature Identification

  • Exploring possible solutions to the job

  • Understanding technical constraints

  • Evaluating implementation complexity

  • Example: Could be solved through automated reporting, real-time dashboards, or white-labeled client access

  3. Feature Development

  • Building specific solutions

  • Implementing tracking points

  • Creating user workflows

  • Example: Developing the white-label dashboard feature

  4. Questions We Need to Answer

  • About adoption and usage:

    • How long does it take for an agency to add their first client after signing up?

    • How many clients does an average agency add in their first week/month?

    • What percentage of invited collaborators accept their invitations within 48 hours?

  • About feature effectiveness:

    • How frequently do users access Google Search Console data after integration?

    • What percentage of users make changes to their white-labeling settings after the initial setup?

    • What are the most commonly used filter combinations?

  • About user behavior and satisfaction:

    • Is there a relationship between command bar usage and overall user satisfaction?

    • How does the integration of Google Ads data affect the frequency of dashboard access?

    • How long does it take for users to click the magic link after requesting it?

  5. Events Tracked

  • Events are tracked specifically to answer our questions

  • For adoption and usage questions:

    • Track timestamp of first client addition

    • Track client addition events with timestamps

    • Track invitation sends and accepts with timestamps

  • For feature effectiveness questions:

    • Track GSC data access frequency

    • Track white-label setting modifications

    • Track filter selections and combinations

  • For user behavior questions:

    • Track command bar interactions

    • Track dashboard access patterns after Google Ads integration

    • Track magic link request and click events

Each event is purposefully tracked to provide data that answers specific questions, rather than tracking events just because we can.
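One way to enforce this discipline is to refuse to emit any event that isn't registered against a question. A minimal sketch in Python; the `track` function, the registry, and the event names are hypothetical, not a specific analytics SDK:

```python
from datetime import datetime, timezone

# Registry of allowed events, each tied to the question it answers.
# Event names and questions are illustrative.
EVENT_REGISTRY = {
    "client_added": "How long until an agency adds their first client?",
    "invite_sent": "What % of invited collaborators accept within 48 hours?",
    "invite_accepted": "What % of invited collaborators accept within 48 hours?",
    "gsc_data_viewed": "How frequently is GSC data accessed after integration?",
}

def track(account_id: str, event: str, properties: dict | None = None) -> dict:
    """Emit an event only if it is registered against a question."""
    if event not in EVENT_REGISTRY:
        raise ValueError(f"'{event}' answers no known question; don't track it.")
    return {
        "account_id": account_id,
        "event": event,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "properties": properties or {},
    }  # In a real setup this payload would go to your analytics pipeline.

track("acct_42", "client_added", {"client_name": "Example Corp"})
```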

  6. Insights

  • Questions get answered through event data analysis

  • Example questions answered:

    • "We see 80% of successful agencies add their second client within 2 weeks"

    • "Agencies that use the command bar add 3x more clients"

    • "White-label settings are modified within 24 hours of client addition"

  • These answers provide insights into:

    • If jobs are being completed

    • How effectively features solve jobs

    • Where users might be struggling

    • What differentiates successful users

  7. Better JTBD Understanding & Value Creation

A. Different JTBD provide value at different stages of the user's journey:

Early Value JTBD: "Users can add Google Search Console data to enrich client analytics"

  • Immediate value demonstration

  • Quick wins for agencies with clients

  • Validates basic product usefulness

  • Foundation for deeper engagement

Progressive Value JTBD:

"Users can maintain a logbook to track important events and notes for each client"

  • Becomes more valuable as client base grows

  • Helps manage increasing complexity

  • Supports team collaboration

  • Shows deeper platform integration into workflows

Maturity Indicator JTBD:

"Users can utilize a command bar for quick navigation throughout the application"

  • Improves daily workflow efficiency

  • Reduces time spent on routine tasks

  • Indicates advanced platform knowledge

  • Suggests strong product adoption

B. Customer Journey Understanding

  • How value perception evolves over time

  • Which combinations of jobs create strongest value

  • When users are ready for more advanced features

  • Where additional support might be needed

C. Product Development Insights

Understanding usage patterns helps prioritize:

  • Which features to develop next

  • Which JTBD need better solutions

  • Where users might need more support

Timeline expectations:

  • Which jobs should provide immediate value

  • Which jobs naturally evolve with usage

  • When to introduce advanced features

D. Business Growth Opportunities

  • Identify accounts getting strong value

  • Recognize optimal timing for sales conversations

  • Spot expansion opportunities within accounts

  • Guide success team interventions

E. New Question Generation

Usage patterns raise new questions:

  • Why do some users adopt certain jobs faster?

  • What combinations of jobs indicate higher success?

  • Where are users finding unexpected value?

These questions lead to:

  • New feature ideas

  • Better onboarding flows

  • More effective success metrics

C. Breaking Away from "Analytics for Analytics' Sake"

"Let's track everything, we might need it later!"

This common approach to analytics leads to what we call 'data debt': an overwhelming amount of data that provides little to no actionable insight. Let's look at a typical scenario:

A product team implements tracking on:

  • Every button click

  • All page views

  • Each form interaction

  • Every dropdown selection

  • All filter combinations

Six months later, they have:

  • Dashboards no one looks at

  • Data no one understands

  • Metrics no one uses

  • Questions that still can't be answered

The problem isn't lack of data—it's lack of purpose.

The Cost of Tracking Everything

  1. Data Overwhelm

    • Teams spend more time managing data than using it

    • Important signals get lost in the noise

    • Analysts spend more time cleaning data than analyzing it

  2. Resource Waste

    • Storage costs for unused data

    • Processing power for unneeded analytics

    • Engineering time implementing unused tracking

    • Maintenance burden for unused events

  3. Lack of Actionable Insights

    • Too many metrics, not enough meaning

    • No clear connection to business decisions

    • Cannot separate signal from noise

    • No clear path from data to action

The Power of Purposeful Tracking

Instead, start with questions:

"How long does it take for an agency to add their first client?"

  • Track: Account creation timestamp

  • Track: First client addition timestamp

  • Result: Clear time-to-value metric

"Do agencies who use the command bar add more clients?"

  • Track: Command bar usage

  • Track: Client addition rate

  • Result: Feature impact on business growth

"Which features indicate long-term success?"

  • Track: Specific feature usage

  • Track: Retention and expansion

  • Result: Clear success indicators
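To illustrate how two purposefully tracked timestamps answer the first of these questions, here's a minimal sketch with invented sample data:

```python
from datetime import datetime
from statistics import median

# Invented sample data: per account, the two timestamps we chose to track.
accounts = [
    {"created": datetime(2025, 1, 6, 9, 0), "first_client": datetime(2025, 1, 6, 14, 30)},
    {"created": datetime(2025, 1, 7, 11, 0), "first_client": datetime(2025, 1, 9, 10, 0)},
    {"created": datetime(2025, 1, 8, 8, 0), "first_client": None},  # never added one
]

# Time-to-value in hours, for accounts that reached the milestone.
ttv_hours = [
    (a["first_client"] - a["created"]).total_seconds() / 3600
    for a in accounts
    if a["first_client"] is not None
]

print(f"Median time to first client: {median(ttv_hours):.1f} hours")
print(f"Milestone completion rate: {len(ttv_hours) / len(accounts):.0%}")
```

Two events, one clear answer: both the time-to-value metric and the share of accounts that never get there.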

Why This Matters

Every event tracked should:

  1. Answer a specific question

  2. Inform business or user value

  3. Drive product decisions

This approach:

  • Makes data actionable

  • Connects tracking to outcomes

  • Ensures resource efficiency

  • Creates clear paths to insights

Remember: The goal isn't to have the most data, but to have the right data to make better decisions.

D. The Shift: Marketing-Led to Product-Led

Traditional marketing-led thinking goes something like this:

  • User downloads an ebook about SEO analytics

  • They attend a webinar about agency growth

  • They fill out a form showing interest

  • Marketing qualifies them as a lead

  • Sales reaches out

But what does this actually tell us about the user's likelihood to succeed with our product? Very little.

The Problem with Marketing-Led Qualification

Traditional Approach:

  • Measures interest, not intent

  • Based on marketing interactions:

    • Downloads of content

    • Form fills

    • Website visits

    • Email opens

  • Relies heavily on assumptions:

    • "They downloaded three ebooks, they must be interested!"

    • "They attended our webinar, they're ready to buy!"

  • Results in:

    • Premature sales outreach

    • Lower conversion rates

    • Misaligned expectations

    • Resource waste on unqualified leads

The Power of Product-Led Qualification

Product-led approach looks at actual usage patterns:

  • Agency adds their first client within 24 hours

  • They integrate Google Search Console for that client

  • They invite team members to collaborate

  • They create custom branded reports

This tells us:

  1. They're actively implementing the solution

  2. They're getting real value

  3. They're expanding usage within their organization

  4. They're investing in the platform

Real Value Indicators

Product-led metrics that matter:

  • Time to first value

    • How quickly do they add their first client?

    • How soon do they create their first report?

  • Depth of adoption

    • Which key features are they using?

    • How many team members are active?

  • Expansion patterns

    • Are they adding more clients?

    • Are they using more advanced features?

The Impact on Success

This shift changes everything:

  • Sales conversations become value-driven

    • "I see you've added 5 clients already"

    • "Your team is using our white-label features effectively"

  • Customer success becomes proactive

    • Identifying adoption patterns

    • Predicting potential challenges

  • Product development becomes focused

    • Understanding which features drive success

    • Identifying where users need help

The result? Higher conversion rates, better customer satisfaction, and sustainable growth based on actual value delivery rather than marketing interactions.

E. PQA & PQL: The New Qualification Framework

When we understand product usage, we can identify two types of qualification signals: Product Qualified Accounts (PQA) and Product Qualified Leads (PQL). While related, they serve different purposes and signal different types of opportunities.

Product Qualified Accounts (PQA)

What is a PQA?

  • An organization showing meaningful product adoption

  • Multiple users actively engaging

  • Clear patterns of value realization

Signs of a Product Qualified Account:

  1. Usage Breadth

    • Multiple team members active

    • Different roles engaging (managers, executors)

    • Various features being utilized

  2. Usage Depth

    • Regular client additions

    • Consistent data integration

    • Custom dashboard creation

    • White-label implementation

  3. Value Realization Patterns

    • Frequent client report generation

    • Regular data analysis

    • Team collaboration

    • Feature exploration

Product Qualified Leads (PQL)

What is a PQL?

  • Individual users demonstrating product mastery

  • Champions within their organization

  • Users realizing significant value

Identifying PQLs through behavior:

  1. Feature Mastery

    • Uses advanced features

    • Creates custom workflows

    • Helps team members

    • Explores new capabilities

  2. Engagement Patterns

    • Regular login behavior

    • Consistent feature usage

    • Proactive support engagement

    • Feature feedback provision

  3. Value Indicators

    • High client management activity

    • Effective use of automation

    • Integration utilization

    • Team expansion initiatives

Why This Framework Matters

For Sales:

  • Know exactly when to engage

  • Have value-based conversations

  • Focus on accounts showing success

  • Identify expansion opportunities

For Customer Success:

  • Predict account health

  • Identify training needs

  • Spot champions early

  • Guide product adoption

For Product:

  • Understand successful user journeys

  • Identify critical features

  • Optimize onboarding flows

  • Guide feature development

The Qualification Process

  1. Monitor Usage Signals

    • Track key feature adoption

    • Measure engagement depth

    • Watch team expansion

  2. Identify Patterns

    • Usage frequency

    • Feature adoption sequence

    • Team collaboration

  3. Trigger Appropriate Actions

    • Sales outreach

    • Success team engagement

    • Educational content

    • Feature recommendations

This framework transforms how we think about qualification - from guessing based on marketing signals to knowing based on actual value realization.
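To show what such rules might look like in practice, here's a minimal sketch of a PQA check; the signal names and thresholds are assumptions for illustration, not our production criteria:

```python
from dataclasses import dataclass

@dataclass
class AccountUsage:
    active_users: int
    clients_added: int
    integrations_connected: int
    reports_generated: int
    white_label_enabled: bool

def is_pqa(usage: AccountUsage) -> bool:
    """Product Qualified Account: breadth + depth + value realization.
    Thresholds are illustrative, not production values."""
    breadth = usage.active_users >= 2
    depth = usage.clients_added >= 3 and usage.integrations_connected >= 1
    value = usage.reports_generated >= 5 or usage.white_label_enabled
    return breadth and depth and value

usage = AccountUsage(active_users=4, clients_added=6,
                     integrations_connected=2, reports_generated=12,
                     white_label_enabled=True)
print(is_pqa(usage))  # True -> trigger sales outreach or success engagement
```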


2. The Evolution of Lead Qualification

Remember the days of counting PDF downloads and webinar attendees? Many companies still qualify leads this way. They track who downloads the "Ultimate Guide to SEO" or attends the "Scale Your Agency" webinar. While these actions show interest, they tell us nothing about whether someone will actually succeed with our product.

A. The Traditional Approach: Hope-Based Qualification

Marketing teams typically qualify leads based on:

  1. Content Consumption

    • Number of blog posts read

    • Ebooks downloaded

    • Webinars attended

    • Time on site

  2. Demographic Fit

    • Company size

    • Industry

    • Job title

    • Budget signals

  3. Engagement Signals

    • Email opens

    • Form completions

    • Social media interactions

    • Website revisits

The fundamental problem? None of these actions predict success. They're hope-based metrics - we hope someone who downloads three ebooks is more likely to become a customer. We hope a marketing manager who attended two webinars is ready to buy.

B. Why Traditional Methods Fall Short

Let's look at a typical scenario:

Marketing: "This lead is hot! They downloaded our agency scaling guide and attended our SEO webinar!"

Sales: "Great! I'll reach out immediately!"

[Two weeks later...]

Lead: "Sorry, we're not interested at all…"

The disconnect happens because:

  1. Content consumption doesn't equal product fit

  2. Interest doesn't equal intent

  3. Understanding doesn't equal implementation

  4. Marketing activities don't predict success


C. The Product-Led Revolution: Actions Speak Louder Than Downloads

Enter product-led qualification. Instead of hoping someone who downloaded an ebook becomes a customer, we look at what they actually do in the product:

Example:

  • Traditional signals: Downloaded 5 SEO guides, attended 3 webinars

  • Product signals: Added 2 clients, integrated GSC, created branded reports

Which agency would you bet on?

D. What We Actually Track Now

  1. Active Product Usage

    • First value milestone: Adding that first client

    • Integration adoption: Connecting data sources

    • Feature exploration: Moving beyond basics

    • Regular engagement: Making it part of their workflow

  2. Team Collaboration

    • Adding team members

    • Acceptance rates of invites

    • Cross-team feature usage

    • Collaboration patterns

  3. Value Realization

    • Client additions over time

    • Report generation frequency

    • Data source integrations

    • White-label implementation

  4. Advanced Adoption

    • Custom dashboard creation

    • Command bar usage

    • Advanced filtering

    • API implementation


E. Making The Transition

Moving from traditional to product-led qualification isn't just about changing metrics. It requires:

  1. Mindset Shift

    • From "Are they interested?" to "Are they successful?"

    • From "Did they download?" to "Did they implement?"

    • From "How many forms?" to "How many clients?"

  2. Process Changes

    • Sales timing based on usage, not marketing scores

    • Success team engagement driven by adoption patterns

    • Marketing focused on activation, not just acquisition

  3. Tool Evolution

    • Product analytics replacing (or complementing) marketing automation

    • Usage data feeding into CRM

    • Real-time qualification based on actual use

F. Measuring The Impact

The proof is in the numbers:

Traditional Qualification:

  • 3% of MQLs become customers

  • 45 days average sales cycle

  • High early churn rate

  • Lots of "tire kickers"

Product-Led Qualification:

  • 20% of PQLs become customers

  • 12 days average sales cycle

  • Lower churn rate

  • Conversations based on actual value

G. The Future is Product-Led

The beauty of product-led qualification? It's based on reality, not hope. When someone succeeds with your product:

  • They're more likely to buy

  • They're easier to sell to

  • They stick around longer

  • They become advocates

After all, would you rather talk to someone who downloaded your ebook or someone who's already getting value from your product?


3. Strategic Event Tracking Categories

Event tracking without categorization is like having a library without a classification system - you have the books, but good luck finding what you need. Strategic categorization helps us:

  • Organize our tracking efforts

  • Understand user progression

  • Identify patterns in behavior

  • Make data actionable

  • Connect events to outcomes

A. Core Product Adoption

Think of this as your "are they getting started successfully?" category.

What we track, for example:

  • First client addition

  • Initial GSC integration

  • First dashboard creation

  • First report generation

Why it matters: These events tell us if users are getting past the crucial first steps. It's like watching someone learn to ride a bike - are they pushing off, finding balance, and turning those first pedals?

  • Old way: Track every click in the onboarding

  • Strategic way: Track completion of first value actions

  • Old insight: "User clicked through 7 onboarding screens"

  • New insight: "User added their first client and integrated GSC within 24 hours"

When to use this category:

  • Measuring onboarding effectiveness

  • Identifying early friction points

  • Predicting future success

  • Timing initial success team outreach
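Here's a sketch of how that "new insight" could be derived from milestone events; the event names and the 24-hour window are illustrative:

```python
from datetime import datetime, timedelta

# Illustrative event log for one account: (event_name, timestamp).
events = [
    ("account_created", datetime(2025, 1, 6, 9, 0)),
    ("client_added", datetime(2025, 1, 6, 11, 15)),
    ("gsc_integrated", datetime(2025, 1, 6, 16, 40)),
]

def reached_first_value(events, window=timedelta(hours=24)) -> bool:
    """True if the account added a client AND integrated GSC
    within `window` of signing up (assumes one occurrence per event)."""
    times = {name: ts for name, ts in events}
    start = times.get("account_created")
    if start is None:
        return False
    milestones = ("client_added", "gsc_integrated")
    return all(m in times and times[m] - start <= window for m in milestones)

print(reached_first_value(events))  # True -> "first value within 24 hours"
```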

B. User Engagement Depth

This category answers the question: "Are users making our product part of their routine?"

What we track:

  • Login frequency and patterns

  • Feature usage consistency

  • Time spent on key activities

  • Return visit behavior

Why it matters: Frequency and depth of engagement tell us if we're becoming part of a user's workflow. The difference between a tourist and a local is that a local knows their way around and has regular habits - we want our users to become locals.

  • Old way: Track total time spent in app

  • Strategic way: Track meaningful engagement patterns

  • Old insight: "User spent 45 minutes in the app"

  • New insight: "User checks client dashboards every Monday morning and generates reports every first of the month"

Pattern Recognition:

  • Daily active patterns

    • Morning dashboard checks

    • Regular data reviews

    • Consistent reporting cycles

  • Weekly routines

    • Client review sessions

    • Team collaboration peaks

    • Report generation timing

  • Monthly habits

    • Performance analysis

    • Client presentations

    • Strategy adjustments

When to use this category:

  • Understanding user workflows

  • Identifying power users

  • Predicting churn risks

  • Optimizing feature placement
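As a sketch of how such routines can be surfaced from raw timestamps (invented data, illustrative threshold):

```python
from collections import Counter
from datetime import datetime

# Invented dashboard-view timestamps for one user.
views = [
    datetime(2025, 1, 6, 8, 55),    # Monday
    datetime(2025, 1, 13, 9, 5),    # Monday
    datetime(2025, 1, 20, 8, 47),   # Monday
    datetime(2025, 1, 22, 15, 30),  # Wednesday
]

# Which weekday dominates? A strong peak suggests a routine, not ad-hoc use.
weekday_counts = Counter(ts.strftime("%A") for ts in views)
day, count = weekday_counts.most_common(1)[0]
if count / len(views) >= 0.6:  # illustrative cutoff for "routine"
    print(f"Routine detected: {count}/{len(views)} visits on {day}")
```

That's the difference between "45 minutes in the app" and "checks dashboards every Monday morning."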

C. Team Collaboration Signals

This category reveals how your product spreads within an organization - the viral coefficient of your B2B product, if you will.

What we track:

  • Team member invitations

  • Invitation acceptance rates

  • Cross-team feature usage

  • Role-based activities

  • Collaboration touchpoints


Why it matters: The more your product becomes part of a team's workflow, the stickier it becomes. It's like watching a network effect in action - each new team member increases the product's value for everyone.

Real Example:

  • Old way: Count total users per account

  • Strategic way: Track meaningful team interactions

  • Old insight: "Account has 5 users"

  • New insight: "Agency owner invites all client managers within 2 weeks, who then each create custom dashboards for their clients"

Collaboration Patterns to Watch:

  • Invitation flows

    • Who invites whom

    • Time to accept invitations

    • Role assignments

  • Usage patterns

    • Shared dashboard views

    • Comment activities

    • Report sharing

  • Team expansion

    • Department spread

    • Role diversity

    • Usage depth per role

When to use this category:

  • Predicting account stability

  • Identifying expansion opportunities

  • Understanding team adoption

  • Guiding team onboarding improvements
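A sketch of the invitation-acceptance signal; the 48-hour window mirrors the question above, but the data shape is invented:

```python
from datetime import datetime, timedelta

# Invented invitation events: invite id -> (sent_at, accepted_at or None).
invites = {
    "inv_1": (datetime(2025, 1, 6, 10, 0), datetime(2025, 1, 6, 12, 30)),
    "inv_2": (datetime(2025, 1, 6, 10, 5), datetime(2025, 1, 9, 9, 0)),
    "inv_3": (datetime(2025, 1, 7, 14, 0), None),  # never accepted
}

window = timedelta(hours=48)
accepted_fast = sum(
    1 for sent, accepted in invites.values()
    if accepted is not None and accepted - sent <= window
)
print(f"Accepted within 48h: {accepted_fast / len(invites):.0%}")
```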


D. Value Realization Indicators

This is where we measure if users are achieving what they set out to do - are they getting the value they came for?

What we track:

  • Client growth rate

  • Report sharing frequency

  • Data integration depth

  • Custom dashboard creation

  • White-label implementation

Why it matters: Users buy products to get a job done. These indicators tell us if they're successfully doing those jobs and getting the value they expected.

  • Old way: Track feature usage

  • Strategic way: Track success milestones

  • Old insight: "User created 5 dashboards"

  • New insight: "Agency is creating branded dashboards for each client within 3 days of client addition"

E. Account Growth Markers

This category helps us identify accounts that are primed for expansion and long-term success.

What we track:

  • Client portfolio growth

  • Feature adoption expansion

  • User seat additions

  • Integration breadth

  • Usage volume trends

Why it matters: Growth markers are leading indicators of account health and expansion potential. They help us identify accounts ready for upgrading or needing attention before issues arise.

  • Old way: Monitor subscription level

  • Strategic way: Track growth indicators

  • Old insight: "Account is on basic plan"

  • New insight: "Agency has maxed out client limit three months in a row and is using advanced features"

F. Feature Maturity Usage

This category tells us how sophisticated users are becoming with our product.

What we track:

  • Advanced feature adoption

  • Command bar usage

  • Custom workflow creation

  • API implementation

  • Complex filtering patterns

Why it matters: Mature feature usage indicates product mastery and usually correlates with higher retention and advocacy rates. It's the difference between someone who can drive a car and someone who can parallel park blindfolded.

  • Old way: Count total features used

  • Strategic way: Track sophistication of usage

  • Old insight: "User accessed 12 features"

  • New insight: "Agency uses command bar for navigation, has set up custom report automations, and leverages API for client integrations"
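To contrast counting features with scoring sophistication, here's a minimal sketch; the feature names and weights are invented:

```python
# Weight advanced capabilities more heavily than basic ones (weights invented).
MATURITY_WEIGHTS = {
    "dashboard_viewed": 1,
    "custom_filter_saved": 2,
    "command_bar_used": 3,
    "report_automation_created": 4,
    "api_key_used": 5,
}

def maturity_score(used_features: set[str]) -> int:
    """Score sophistication of usage instead of counting raw feature touches."""
    return sum(MATURITY_WEIGHTS.get(f, 0) for f in used_features)

basic_user = {"dashboard_viewed"}
power_user = {"dashboard_viewed", "command_bar_used", "api_key_used"}
print(maturity_score(basic_user), maturity_score(power_user))  # 1 9
```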

Value of Strategic Categorization

By organizing our tracking this way, we:

  1. Create clear progress indicators

    • From basic adoption to advanced usage

    • From individual use to team collaboration

    • From simple features to complex workflows

  2. Enable targeted actions

    • Know when to trigger sales conversations

    • Identify accounts needing support

    • Spot expansion opportunities

    • Guide product development

  3. Predict outcomes better

    • Conversion likelihood

    • Churn risks

    • Expansion potential

    • Feature success

Remember: The goal isn't just to collect data in neat categories - it's to create a framework that turns user actions into predictable business outcomes. Each category serves as a lens through which we can understand different aspects of user success and product value.


4. Key Performance Questions: Asking the Right Things to Track the Right Things

Questions drive insights. But not all questions are created equal. Let's look at how to formulate questions that actually help us understand user success and product value.

A. Job Completion Questions

Instead of asking: "How many users created a dashboard?" Ask: "How effectively are users showing SEO value to their clients?"

Better questions about jobs:

  • What percentage of agencies complete their first client setup within 24 hours?

  • How long does it take from GSC integration to first client presentation?

  • Do agencies that create branded reports add more clients?

  • Which jobs consistently lead to team expansion?

  • What's the typical sequence of jobs for successful agencies?

B. Feature Effectiveness Questions

Instead of asking: "How many times was this button clicked?" Ask: "How is this feature helping users complete their jobs?"

Questions that matter:

  • What percentage of invited collaborators accept their invitations within 48 hours?

  • How frequently do users access Google Search Console data after integration?

  • Is there a relationship between command bar usage and overall user satisfaction?

  • What percentage of users make changes to their white-labeling settings after the initial setup?

  • How does the integration of Google Ads data affect the frequency of dashboard access?

C. Growth and Expansion Questions

Instead of asking: "How many users do we have?" Ask: "How are users growing with our product?"

Questions about growth:

  • How many clients does an average agency add in their first week/month?

  • What's the typical timeline from first client to tenth client?

  • Which features predict account expansion?

  • How does team size correlate with feature adoption?

  • What usage patterns indicate readiness for expansion?

D. User Journey Questions

Instead of asking: "What's our activation rate?" Ask: "How are users progressing through their journey?"

Journey-focused questions:

  • How long does it take for users to click the magic link after requesting it?

  • What's the typical path from first login to first client report?

  • Where do successful users spend most of their time?

  • What features do power users discover that others miss?

  • Which early behaviors predict long-term success?

E. Value Realization Questions

Instead of asking: "Are users active?" Ask: "Are users achieving their goals?"

Value-focused questions:

  • How quickly do agencies demonstrate value to their first client?

  • What percentage of agencies create custom branded reports?

  • How often do clients engage with shared reports?

  • Which integrations lead to highest client retention?

  • What combinations of features indicate strong value delivery?

The Art of Good Questions

When forming questions, consider:

  1. Action vs. Outcome

    • Poor: "Did they use the feature?"

    • Better: "Did the feature help them succeed?"

  2. Usage vs. Value

    • Poor: "How many dashboards created?"

    • Better: "How effectively are they managing clients?"

  3. Activity vs. Progress

    • Poor: "How many logins this week?"

    • Better: "Are they progressing toward their goals?"

Remember: Good questions lead to actionable insights. Every question should help us understand either:

  • How users are achieving their goals

  • Where they're struggling

  • What predicts their success

  • How we can help them succeed faster

The answers to these questions don't just inform product decisions - they shape our entire understanding of user success.


5. Conclusion: Making Analytics Work for Everyone

When we started this journey, we talked about moving beyond vanity metrics. But this framework does more than that - it creates a shared language between users, product, and business value.

The Power of Connected Thinking

The JTBD Value Cycle:

🔄 Jobs To Be Done (JTBD) → Feature Identification → Feature Development → Questions → Events Tracked → Insights → Better JTBD Understanding

This isn't just a process - it's a way of thinking that:

  • Starts with user needs (not feature ideas)

  • Tracks with purpose (not just because we can)

  • Measures what matters (not what's easy)

  • Drives actual insights (not just numbers)

What We've Learned

  1. Outside-In Thinking Works

    • Understanding jobs before features

    • Tracking completion over clicks

    • Measuring success, not just usage

  2. Strategic Categories Matter

    • Core Product Adoption

    • User Engagement Depth

    • Team Collaboration Signals

    • Value Realization Indicators

    • Account Growth Markers

    • Feature Maturity Usage

  3. Questions Drive Insights

    • From "what happened?" to "why it matters"

    • From tracking events to understanding journeys

    • From data points to user stories

Making It Work in Your Organization

Start Small:

  • Pick one important job

  • Identify the features that support it

  • Form clear questions about success

  • Track events that answer those questions

  • Learn and iterate

Build Momentum:

  • Share insights across teams

  • Connect data to decisions

  • Celebrate user success stories

  • Evolve your understanding

The Future is Job-Focused

As products become more complex and users more sophisticated, this approach becomes even more crucial. Because at the end of the day:

  • Users don't care about features; they care about getting jobs done

  • Teams don't need more data; they need better insights

  • Products don't need more tracking; they need purposeful measurement

Final Thoughts

Remember: The goal isn't to create perfect analytics. It's to understand our users better and help them succeed. When we do that well, everything else - adoption, retention, growth - follows naturally.

  1. Start with jobs, not features.

  2. Ask better questions, not more questions.

  3. Track with purpose, not just because you can.

Sources & Inspiration

  • Elena Verna & Austin Hay's insights on Product-Led Growth infrastructure and team alignment (LinkedIn post)

  • A couple of years of searching for the best approach to managing product

  • Loads of inspirational talks with Claude 😃
