The Lean Startup: Minimum Viable... results!

Episode 92 - 05 Jul 2017

We've reached a very exciting stage in our Lean Startup journey:

It's time to reveal the RESULTS of our Minimum Viable Product - our Minimum Viable Experiment.

  • Will we be cleared for take-off - cleared to PERSEVERE?
  • Or will we be forced to regroup and try something else - to PIVOT?

Watch as "Agile Done Right", "Real Life Agile" and "Agile That Pays" go head to head!

And remember to grab a copy of the Lean Startup Cheat Sheet


We pulled on to the “Lean Startup” runway, and pressed the starter button on our Minimum Viable EXPERIMENT.


We’ll look at the results. Will we be CLEAR to PERSEVERE? Or will this PILOT need to PIVOT?


Welcome back to a rather exciting stage in our Lean Startup journey.

Today I’ll reveal the results from our Minimum Viable Product / Minimum Viable Experiment.

To recap very quickly: this whole thing started with a plan to create a 5-day mini-course.

And to do so using principles from The Lean Startup - the book by Eric Ries.

We looked at the Build Measure Learn model, and its hidden counter-clockwise flow:

  • Assumption, Metric, Experiment

After some waffling I came up with this assumption:

  • “The 5 Day Mini-course will result in more email opt-ins”

Which implied this metric:

  • “Email opt-ins”

But what would be a good experiment? What might we use as our Minimum Viable Product?

We looked to Zappos, Dropbox and Buffer for inspiration.

Long story short: Buffer’s “fake door” approach seemed the most applicable, even though it would FAIL to provide the METRIC we were looking for: a direct measurement of email sign-ups.

It would, however, give us a direct measurement of INTEREST and DESIRE - necessary precursors to the ACTION of email sign-ups.

And in the absence of an alternative, we pressed on.

Buffer's MVP

Buffer’s MVP delivers the old one-two by (a) promising something awesome, then (b) apologising that “it’s not quite ready”.

I stole it, painted it yellow and dubbed it the Mundane MVP.

All that remained was a mechanism to get people to the Landing Page. A Facebook advert would seem to fit the bill.

Set things up correctly, and we get two very useful metrics:

  • The proportion of people who see the ad and go on to click through
  • The proportion of people who reach the Landing Page and click to "stay informed"

I know what you’re thinking: why run one experiment when you could run three? Excellent point.

You were good enough to provide 100+ possible names for the course, from which we selected a final three.

(Actually four. One didn’t make it past Facebook’s profanity filter.)

Three experiments - differing only in the name of the course.

That’s the theory. Wanna see how it looks in practice?

The Real Experiment

Here’s the ad in my Facebook News Feed.

Before I click the button, a warning: viewers of a nervous nature should move away from the screen.


This is the Landing Page. Not sure what I was thinking, putting my mug shot in there.

Anyway, there’s a nice big action button, a few bullet points, and another big button.

Both buttons lead to the “Thank you” Page - which should probably be called the “I’m very sorry page”.

A word of caution

One last thing before I show you the results: when running an experiment, it’s all too easy to look at the results and convince yourself that they confirm... whatever you want them to confirm.

We’re not going to fall into that trap, are we? After all, we have an important decision to make.

We need to decide whether to persevere, or pivot.

So before looking at the results, we should have a feel for what good and bad look like, starting with this metric:

  • the click-through rate from the Ad to the Landing Page.

According to Wordstream, the average across all industries is just 0.9%.

My targeting was rudimentary:

  • Region: USA
  • Age range: 25 - 65+
  • Interests: a short list of Agile-related keywords

Not very sophisticated. So I’ll be surprised to get a result above 1%.

What about the second metric:

  • The click-through rate from the Landing Page to the Thank You page

Well, I’d like to think that anyone who made it this far would be quite likely to click through.

Unless, of course, my ugly mug puts them off.

Perhaps somewhere in the region of 50%?

The results

No more stalling: here are the results.

The adverts for "Agile Done Right", "Real Life Agile" and "Agile That Pays" were displayed:

  • 4868, 1989 and 2564 times

And were clicked:

  • 96 times, 23 times and 32 times.

Giving us click-through rates of:

  • 1.97%, 1.16% and 1.25%
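The click-through rates above are just clicks divided by impressions. A minimal sketch of the arithmetic, using the course names and numbers from the lists above:

```python
# Ad impressions and clicks from the experiment above
impressions = {"Agile Done Right": 4868, "Real Life Agile": 1989, "Agile That Pays": 2564}
clicks = {"Agile Done Right": 96, "Real Life Agile": 23, "Agile That Pays": 32}

# Click-through rate: clicks / impressions, expressed as a percentage
ctr = {name: round(100 * clicks[name] / impressions[name], 2) for name in impressions}

for name, rate in ctr.items():
    print(f"{name}: {rate}%")
```

All three come out above Wordstream's 0.9% cross-industry average.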

A very positive result: all are higher than I was expecting - and "Agile Done Right" did very well indeed.

Landing Page performance

That’s the Ad performance. What about the Landing Page performance?

Visits to the landing pages (for "Agile Done Right", "Real Life Agile" and "Agile That Pays") were:

  • 35, 12 and 10.

Those numbers surprised me: I had expected them to match the ad click numbers above. A spot of Googling revealed that a discrepancy between these numbers is to be expected. But a discrepancy this large?

As for clicks on the orange buttons, they were:

  • 9, 3 and 3.

Giving click-through rates of:

  • 25.7%, 25.0%, and 30.0%
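The same arithmetic applies here: button clicks divided by Landing Page visits. A quick sketch, with the visit and click numbers taken from the lists above:

```python
# Landing Page visits and orange-button clicks from the figures above
visits = {"Agile Done Right": 35, "Real Life Agile": 12, "Agile That Pays": 10}
button_clicks = {"Agile Done Right": 9, "Real Life Agile": 3, "Agile That Pays": 3}

# Landing Page click-through rate: button clicks / visits, as a percentage
landing_ctr = {name: round(100 * button_clicks[name] / visits[name], 1) for name in visits}

for name, rate in landing_ctr.items():
    print(f"{name}: {rate}%")
```

Note how small the sample sizes are here - a single extra click would move any of these rates by several percentage points.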

Ouch. Those rates are way lower than I was expecting.


Where does that leave us?

On the positive side, it looks like there’s demand for an Agile course - especially one entitled “Agile Done Right”. Kudos to James Allen for coming up with the name.

On the negative side, click through numbers on the big orange buttons were way lower than I was expecting.

Was it really my face that made the difference?!?!

What do YOU think?

I’m really interested in your interpretation of these results.

Do they indicate that I should pivot? Or persevere?

Pivot means that we come up with a new idea - and test it with a new Minimum Viable Experiment.

Persevere means that I set to work and create “Agile Done Right. The mini-course” for real.

Let me know your thoughts in the comments below.