Opportunity Solution Trees: Discovery That Maps to Roadmap Reality
Outcomes → Opportunities → Solutions → Experiments (And How It All Becomes OKRs)
This is one of RoadmapOne’s articles on Objective Prioritisation frameworks.
Teresa Torres’ Opportunity Solution Tree (OST) solves a problem every product team faces: the urge to jump straight from “we need to increase retention” to “let’s build feature X.” The OST introduces intermediate layers that force you to understand the problem before committing to solutions.
The framework is powerful for discovery. But it’s not a prioritisation framework—it doesn’t tell you which opportunity to pursue first. What it does is generate a structured backlog that maps directly to OKRs and RoadmapOne’s planning model.
Opportunity Solution Trees structure discovery: Outcome (what you’re trying to achieve) → Opportunities (customer problems that could move the outcome) → Solutions (ideas that might address opportunities) → Experiments (ways to test solutions). The mapping to RoadmapOne is direct: Outcomes are Objectives, validated Opportunities and Solutions become Key Results, and Experiments are the way we learn. Use OST for discovery, then feed the outputs into RICE/BRICE prioritisation and capacity planning.
The Four Layers
Outcomes: What You’re Obsessed With
Outcomes sit at the top of the tree. They’re measurable business or customer results—not features, not outputs, but changes in the world you’re trying to create.
- “Reduce churn from 8% to 5%”
- “Increase activation rate from 40% to 60%”
- “Grow monthly active users by 30%”
In RoadmapOne, Outcomes are Objectives. They’re what you’re obsessed with achieving—the things that matter to the CEO, the CFO, and the board.
Good outcomes are:
- Measurable — specific metrics with targets
- Customer or business focused — not “launch feature X”
- Achievable but ambitious — stretch without being fantasy
- Time-bound — clear evaluation windows
Opportunities: Customer Problems Worth Solving
Beneath each Outcome, you map Opportunities—customer problems, pain points, or needs that, if addressed, would move the Outcome metric.
For “Reduce churn from 8% to 5%”, opportunities might include:
- Customers struggle to understand their usage data
- Power users can’t collaborate with teammates
- Customers forget the value because they only engage monthly
- Customers hit limits but don’t know upgrade options exist
Each opportunity is a hypothesis: “If we solve this problem, churn will decrease.” You don’t know yet which opportunities are real or impactful—that comes through research and experimentation.
Solutions: Ideas That Might Work
For each opportunity, you brainstorm potential solutions—ways you might address the customer problem.
For “Customers struggle to understand their usage data”:
- Redesigned analytics dashboard
- Weekly email digest summarising key metrics
- In-app guidance explaining data visualisations
- One-click export to familiar spreadsheet format
Solutions are not commitments. They’re ideas competing for validation. You might generate 5-10 solutions per opportunity, knowing most won’t survive contact with customer feedback.
Experiments: How We Learn
The bottom layer is experiments—small, time-boxed tests that validate whether solutions actually address opportunities.
For “Weekly email digest”:
- Send a mockup to 10 customers and interview for feedback
- Build a scrappy MVP and measure open rates
- A/B test digest vs no digest with 5% of users
- Run a fake door test to measure interest
Experiments generate evidence. They move your confidence level from “we think this might work” to “we have data showing this works” or “we’ve learned this doesn’t work.”
Mapping OST to RoadmapOne
The translation is direct:
| OST Layer | RoadmapOne Equivalent | Where It Lives |
|---|---|---|
| Outcomes | Objectives | Roadmap grid, OKRs |
| Opportunities | Input to Key Results | Discovery backlog |
| Solutions | Key Results (once validated) | Roadmap grid, team backlogs |
| Experiments | Discovery capacity allocation | Sprint planning, JIRA |
Outcomes become Objectives. The measurable result you’re pursuing is your Objective in RoadmapOne. “Reduce churn from 8% to 5%” sits at the top of your OKR hierarchy.
Validated Solutions become Key Results. Once you’ve run experiments and validated that a solution addresses a real opportunity, it graduates to a Key Result. “Launch weekly email digest” becomes a committed deliverable that teams will build.
Experiments consume discovery capacity. In capacity-based planning, you allocate sprints for discovery work. Those sprints fund the experiments that validate (or invalidate) solutions.
OST and the Experiments Mindset
One thing I love about OST is the explicit “Experiments” layer. We’re always trying things we think will work—but thinking isn’t knowing.
The experiments layer forces intellectual honesty:
- We have an opportunity hypothesis (customers struggle with X)
- We have a solution hypothesis (feature Y will address X)
- We have experiment evidence (tested with N customers, results showed Z)
That chain of reasoning is what separates good product management from feature factories. You’re not building because someone asked or because it’s on a competitor’s roadmap. You’re building because experiments validated that this solution addresses a real opportunity that moves an outcome you care about.
In RoadmapOne, the Confidence dimension in BRICE tracks this. A Key Result starts at low confidence (we think it’ll work). As experiments validate the opportunity and solution, confidence increases. The prioritisation score updates accordingly.
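As a hedged sketch of that mechanic: the exact BRICE formula is RoadmapOne's, so here I assume the common RICE shape, (Reach × Impact × Confidence) / Effort, extended with a Business-value multiplier. The point is just how a rising confidence value moves the score:

```python
def brice_score(business: float, reach: float, impact: float,
                confidence: float, effort: float) -> float:
    """Assumed BRICE shape: (B x R x I x C) / E.
    Confidence is a 0-1 multiplier that rises as experiments validate."""
    return (business * reach * impact * confidence) / effort

# Same Key Result, before and after experiments validate the solution:
before = brice_score(business=3, reach=500, impact=2, confidence=0.2, effort=4)
after = brice_score(business=3, reach=500, impact=2, confidence=0.8, effort=4)
print(before, after)  # score scales linearly as confidence moves 0.2 -> 0.8
```

Nothing else about the Key Result changed; only the evidence did. That is why experiments are worth their capacity cost: they move the number that decides sequencing.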
OST vs GIST: Overlapping Hierarchies
GIST (Goals → Ideas → Steps → Tasks) looks similar to OST. Both create layered hierarchies from strategic intent to execution. What’s the difference?
| Layer | OST | GIST |
|---|---|---|
| Top | Outcome | Goal |
| Second | Opportunity | Idea |
| Third | Solution | Step |
| Bottom | Experiment | Task |
The key difference is emphasis:
OST focuses on the Opportunity layer—ensuring you understand customer problems before proposing solutions. The explicit distinction between Opportunity and Solution prevents jumping to features too quickly.
GIST focuses on confidence tracking—the “confidence meter” that updates as you validate ideas through steps. GIST emphasises the journey from hypothesis to proven solution.
If your team’s weakness is jumping to solutions without understanding problems, OST is the better mental model. If your team’s weakness is committing to ideas without validation, GIST’s confidence tracking is more useful.
Either way, the outputs map to the same place: validated solutions become Key Results in RoadmapOne, pursued against Objectives.
When OST Works
During Product Discovery
OST shines when you’re exploring a problem space. The tree structure forces discipline:
- Start with the Outcome (what are we trying to achieve?)
- Map Opportunities (what customer problems might move the needle?)
- Generate Solutions (what could we build?)
- Design Experiments (how do we test our assumptions?)
This prevents the classic trap of starting with a solution (“let’s build a dashboard”) and working backward to justify it.
For Outcome-Focused Teams
If you’re trying to shift from feature factories to outcome-driven product management, OST provides scaffolding. The tree makes the connection between solutions and outcomes explicit. Every solution must trace to an opportunity that traces to an outcome.
When Stakeholders Push Features
When a stakeholder demands “build X,” OST gives you a framework to respond:
- “What outcome would X achieve?”
- “What opportunity (customer problem) does X address?”
- “What experiments could we run to validate X before committing engineering resources?”
The tree creates space for productive conversation rather than feature-request ping-pong.
Where OST Falls Down
It’s Not Prioritisation
OST helps you generate and structure a backlog. It doesn’t help you decide which opportunity to pursue first.
A tree with 5 opportunities and 20 solutions doesn’t tell you whether to start with Opportunity A/Solution 3 or Opportunity D/Solution 1. For that, you need RICE, BRICE, or another prioritisation framework.
OST is the input; prioritisation is the filter; the roadmap is the output.
Trees Can Become Forests
I’ve seen teams create beautiful OSTs with 40 opportunities and 120 solutions—comprehensive mappings of everything that could possibly move the outcome.
That’s not a tree. It’s a forest. And it’s not actionable.
The tree should represent your current focus, not every possible path. Prune aggressively. A tree with 3-4 opportunities and 8-10 total solutions is more useful than an exhaustive taxonomy.
Discovery Becomes an End, Not a Means
The worst failure mode is treating the OST as the deliverable. Teams spend months building elaborate trees, researching every opportunity, testing every solution—and never commit to building anything.
At some point, discovery has to produce outputs that feed planning. The OST helps you explore intelligently, but the goal is validated solutions that become Key Results, not ever-expanding trees.
Practical Implementation
Start with one Outcome. Don’t try to map your entire product. Pick the highest-priority outcome and build its tree.
Timebox opportunity exploration. Spend 1-2 weeks mapping opportunities through customer research. Resist the urge to be exhaustive—you can always add opportunities later.
Generate solutions fast, then filter. Brainstorm quickly without judgment, then prune to the 2-3 solutions worth testing per opportunity.
Design cheap experiments. The goal is learning with minimum investment. Prototypes, mockups, fake door tests, customer interviews—not full builds.
Graduate validated solutions to Key Results. When experiments provide confidence, move the solution into your roadmap as a committed Key Result under the relevant Objective.
Prioritise Key Results with RICE/BRICE. The tree generated the candidates; now score them to decide sequencing and capacity allocation.
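The last two steps can be sketched together: score the graduated Key Results, then pack the highest scorers into the delivery capacity you have. The candidate names, scores, and capacity figures below are hypothetical:

```python
# Hypothetical graduated Key Results: (name, score, effort_in_sprints)
candidates = [
    ("Weekly email digest", 600, 2),
    ("Redesigned analytics dashboard", 450, 6),
    ("In-app upgrade prompts", 300, 1),
]

capacity = 4  # delivery sprints available this quarter
plan, used = [], 0
# Greedy pick: highest score first, skipping anything that no longer fits
for name, score, effort in sorted(candidates, key=lambda c: -c[1]):
    if used + effort <= capacity:
        plan.append(name)
        used += effort
print(plan)
```

A greedy pick like this is a simplification of real capacity planning, but it captures the shape of the step: discovery produced the candidates, scoring ordered them, and capacity decided how far down the list you get.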
The Bottom Line
Opportunity Solution Trees structure discovery: Outcomes at the top, Opportunities beneath, Solutions below those, Experiments at the bottom. The hierarchy forces you to understand problems before committing to solutions.
The mapping to RoadmapOne is direct. Outcomes are Objectives. Validated Solutions are Key Results. Experiments consume discovery capacity allocated to squads and sprints.
But OST doesn’t tell you which opportunity to pursue or which solution to build first. It generates a structured backlog. Use RICE, BRICE, or another framework to prioritise that backlog.
Discovery generates candidates. Prioritisation filters them. Capacity allocation funds them. OST handles discovery. RoadmapOne handles the rest.
References
- Teresa Torres, Continuous Discovery Habits (2021) — The definitive OST guide
- GIST Framework — Related discovery hierarchy with confidence tracking
- OKRs for Product Teams — Connecting outcomes to key results
- BRICE Prioritisation — Prioritising validated solutions
- Capacity-Based Planning — Allocating discovery capacity
- Objective Prioritisation Frameworks — Complete guide to all frameworks