One decision.
Made well.

When you have one clear call to make and the cost of getting it wrong is real, a Decision Sprint is where you start.


→ Book a free Right Fit call

The Founder's Desk Decision Sprint · Case Study 01
Decision Sprint · E-commerce / Skincare


The Single Lever

A skincare brand was spending more on paid media and getting less in return. Traffic down. Conversion down. No consensus on what to fix first. One Decision Sprint replaced weeks of debate with one focused 30-day bet and guardrails to keep it honest.

Service
Decision Sprint
Industry
E-commerce / Skincare
Timeline
30-day plan
Signals at engagement start
Paid spend (Meta + TikTok) ↑ Rising
Traffic quality ↓ Falling
Conversion rate ↓ Falling
Team alignment ✕ None
The real problem
"Not the ads — the decision beneath them."

The Situation

Spending more.
Getting less.

The brand had been increasing ad spend across Meta and TikTok for months. Instead of growth, they got the opposite. Fewer visitors, fewer purchases, tighter margins. The team had theories. Their agency had recommendations. Everyone wanted to try something different.

The real problem wasn't the ads. It was the decision underneath them. Every stakeholder had a different theory, a different priority, and a different idea of what to test next. The result was inaction dressed as strategy.

"We're spending more but earning less — and we don't know which lever to pull."

The founder's problem statement

The Actual Question

Not: What is wrong with our ads?
But: What is driving the underperformance?

The presenting problem was the ad performance. The actual decision was different: how do we identify the real constraint, choose one focused 30-day bet, and stop wasting spend on guesswork?

That reframe mattered. Answering the wrong question — optimising the ads without resolving the strategic tension underneath — would have produced another round of inconclusive tests and continued the pattern.

What We Explored Together

Five directions.
One choice.

Every viable path was put on the table — including the ones being avoided. The tension wasn't a lack of good ideas. It was knowing which idea to back — and which to leave alone.

01
Push harder on paid creative
More volume, more testing — accelerate out of the slump through sheer experimentation.
02
Fix conversion first
Chosen path
Clarity, trust, friction removal — address what happens after the click before touching spend again.
03
Reset the message
Align landing experience to one sharp hook. Rebuild the story before scaling the audience.
04
Shift to retention
Protect margin while acquisition recovers — build LTV before re-investing in acquisition.
05
Consolidate channels
Reduce noise and actually learn from results — one channel before two.

The Trade-offs That Mattered

The tensions,
named honestly.

Speed vs Stability

Fast experimentation risks destabilising too many variables at once. We chose controlled testing over speed.

Tight focus vs Try everything

The urge to run every test in parallel was the pattern causing the problem. One bet, properly set up.

Short-term fix vs Message reset

A message reset would take longer but solve more. Conversion fix first — message work as phase two.

What Was Delivered

Four documents.
One direction.

Decision Brief
One page. The decision, the five options considered, the trade-offs made explicit, and a clear recommendation with the reasoning behind it. Not a list — a call, with evidence.
30-Day Experiment Charter
Two to three tests only, with named owners, weekly checkpoints, and explicit pass/fail thresholds. Spend couldn't drift without evidence triggering a review.
Risk & Unknowns Register
What needed to be validated first to avoid burning more budget. The assumptions that, if wrong, would change the recommendation — mapped before the first test launched.
Alignment Script
A ready-to-send brief for the team and the agency. Everyone operating from the same page, the same guardrails, and the same definition of what a successful 30 days looks like.

What Changed

The team stopped debating
and started moving.

1
Priority chosen
One lever, not five. The others were explicitly parked.
3
Tests max
Hard limit. Pass/fail thresholds built in before day one.
30
Day horizon
Time-bounded bet with a defined review point and clear criteria.
1
Brief sent
Team and agency aligned. One document, one direction, one cadence.

Decisions became measurable: improve X by Y, or stop. Spend couldn't drift without evidence. Execution moved faster because everyone had the same brief and a simple weekly cadence — not a new round of opinions.

Signals tracked post-engagement (all tracked weekly)
Traffic quality
Conversion rate
CAC vs contribution margin
Wasted spend from unfocused testing

If this sounds familiar

If you're spending more and getting less — a Decision Sprint gets you to one clear bet.

The problem is rarely the budget. It's almost always focus. One session. One brief. One direction your team can actually execute against.

Get started today.