Putting your MVP to the test: Measuring your startup’s impact

This section is for you if …

  • you’re starting an impact startup or already deep in the process with your team.
  • you can clearly define your target group, the problem you’re solving, your solution, and its intended impact.
  • you’ve developed a validated prototype.
  • you’ve already gathered some initial evidence of impact at the outcome level.
  • you’ve identified a market and built an initial business model.
  • you’ve developed a Minimum Viable Product (MVP).
  • you’re able to ensure the impact of your solution at the outcome level.

In this section, you’ll learn how to …

  • gather direct feedback on your MVP from your target audience.
  • identify issues in product development.
  • collect data on both output and outcome indicators for impact measurement.
  • assess how scalable your solution really is.

Testing your MVP’s accessibility and acceptance

Is your MVP truly accessible and convincing to your target group? Test it with pilot customers and test users to uncover weak spots and gain insights for improvement. These four steps will guide you through the process.

1. Set clear testing goals

Start by defining what exactly you want to test. Is it a website, an app, a workshop, or a consulting service? Use qualitative tests to gather feedback on the user experience and spot issues, and quantitative tests to collect measurable data like time spent on a task or task success rates. This data can serve as useful benchmarks for your project.
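The quantitative metrics named above – task success rate and time on task – can be computed directly from raw test results. Here is a minimal sketch in Python; the record structure and field names are assumptions for illustration, not part of the playbook:

```python
# Hypothetical sketch: summarizing quantitative usability-test results.
# The "results" records and their field names are invented for the example.

def summarize_tasks(results):
    """Return (task success rate, average time on task in seconds)."""
    successes = sum(1 for r in results if r["success"])
    success_rate = successes / len(results)
    avg_time = sum(r["seconds"] for r in results) / len(results)
    return success_rate, avg_time

results = [
    {"participant": "A", "success": True,  "seconds": 90},
    {"participant": "B", "success": False, "seconds": 210},
    {"participant": "C", "success": True,  "seconds": 120},
]
rate, avg = summarize_tasks(results)
print(f"Success rate: {rate:.0%}, average time on task: {avg:.0f}s")
```

Tracked over repeated test rounds, these two numbers give you the benchmarks the step above mentions.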

2. Recruit participants

Next, recruit the right participants. Ideally, reach out to early adopters who represent your target group (see “From prototype to impact: Building a sustainable business model for your impact startup”). For meaningful results, aim for five to ten test subjects per target group segment.

Target group vs. customers

In the Lean Impact Journey, we distinguish between your target group (when talking about your impact model and the product) and your customers (when talking about your business model).

How you apply these two terms depends on your specific solution. In this playbook, the target group includes both the users of your solution and those who benefit from it.

Depending on your solution, those groups may overlap. If they don’t, make sure to test your MVP with people from both.

3. Create a test scenario and tasks

Now it’s time to create realistic test scenarios that show how your MVP will actually be used, and design tasks that go along with them. Make sure the scenarios are relevant and easy to understand – clear, specific, and measurable. Decide whether you’ll run the tests in a controlled setting, remotely, or in the participants’ everyday environment. And make sure all the tools and materials are ready to go.

4. Run the test

During the test, ask participants to think out loud as they use your MVP. Carefully observe how they interact with it and document your findings so you can make targeted improvements.

User testing

User testing comes from the software sector but works just as well for non-digital innovations. Here are some suggestions for adapting the method to your needs:

  • Talk about participants or the target group instead of “users.”
  • Consider elements or aspects of your solution instead of “functions.”
  • Use physical prototypes, role-playing games, or simulations instead of digital interfaces.
  • Build realistic scenarios that show how and where your solution would be used.

How quickly does your MVP deliver added value?

To check how efficient your solution is, you can calculate the Time To Value (TTV). TTV measures how long it takes for your target audience to experience the first noticeable benefit from your solution. It helps you assess the efficiency of your MVP and spot areas for improvement.

1. Define the value, starting point, and end point

First, clarify what added value means for your target group. That could be reaching a goal, using a key feature, or gaining useful knowledge.

Then define when the clock starts ticking – maybe it’s at signup, registration, purchase, or participation – and when the value becomes noticeable, for example by solving a problem or successfully using your solution.

2. Measure the time and calculate the average

Track the time between the start and end points using tracking tools like Google Analytics or heatmaps for tech-based solutions. For non-digital offers, Excel sheets or participant tracking can do the job.

Example: A participant signs up at 9 am and completes a successful action at 11 am.

TTV = 2 hours.

Add up the TTVs for all participants and divide by the number of participants. Example: A needs 2 hours, B 4 hours, and C 3 hours.

Average TTV = (2 + 4 + 3) / 3 = 3 hours.
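The worked example above can be sketched in a few lines of Python; the timestamps and participant labels are illustrative:

```python
from datetime import datetime

# Minimal sketch of the TTV calculation from the example above.
# The timestamps are invented; any start/end event pair works the same way.

def ttv_hours(start, end):
    """Time To Value in hours between the start and end points."""
    return (end - start).total_seconds() / 3600

ttvs = [
    ttv_hours(datetime(2024, 1, 8, 9, 0), datetime(2024, 1, 8, 11, 0)),  # A: 2 h
    ttv_hours(datetime(2024, 1, 8, 9, 0), datetime(2024, 1, 8, 13, 0)),  # B: 4 h
    ttv_hours(datetime(2024, 1, 8, 9, 0), datetime(2024, 1, 8, 12, 0)),  # C: 3 h
]
average_ttv = sum(ttvs) / len(ttvs)
print(f"Average TTV: {average_ttv} hours")  # (2 + 4 + 3) / 3 = 3.0
```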

3. Segment your data

Look at differences between groups or scenarios to spot opportunities for improvement. For example: How does the TTV differ between new users and existing users?
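Such a segmentation can be sketched as a simple grouping step, assuming TTV values have already been measured per participant (the group labels and numbers below are invented):

```python
from collections import defaultdict

# Illustrative sketch: segmenting measured TTV values (in hours) by
# user group. The data is made up for the example.

measurements = [
    ("new", 4.0), ("new", 5.0),
    ("existing", 2.0), ("existing", 3.0),
]

by_group = defaultdict(list)
for group, ttv in measurements:
    by_group[group].append(ttv)

for group, values in by_group.items():
    print(f"{group}: average TTV = {sum(values) / len(values)} hours")
```

A clear gap between the segments – here, new users taking noticeably longer – points to where onboarding or guidance could be improved.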

Improve with A/B testing

A/B testing is a powerful way to improve your MVP. You create two different versions of your solution and test which one performs better with your target group. To get meaningful results, show each version to a different group and collect specific data – such as preferences, understanding, or participant expectations. These insights help you refine your MVP.

Example questions you can ask:

  • Preference: Which version do you prefer, and why?
  • Understanding: What message does version A/B convey?
  • Usage: How likely are you to take [action]?
  • Emotion: What emotions does version A/B trigger in you?
  • Improvements: What would you change to make version A/B better?
  • Context: In what situation would you prefer version A/B?
  • Comparison: What differences do you notice between the versions?
  • Open-ended: Which version better meets your needs, and why?
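Once the preference answers are in, tallying them is straightforward. A minimal sketch, using invented sample responses:

```python
from collections import Counter

# Hedged sketch: tallying preference answers from an A/B test.
# The responses below are invented sample data.

responses = ["A", "B", "A", "A", "B", "A"]

counts = Counter(responses)
total = len(responses)
for version in ("A", "B"):
    share = counts[version] / total
    print(f"Version {version}: {counts[version]} votes ({share:.0%})")

winner = counts.most_common(1)[0][0]
print(f"Preferred version: {winner}")
```

With only five to ten testers per segment, treat such tallies as directional signals rather than statistically significant results.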

Run pilot measurements to track your impact

How do you know your MVP is making a real difference? Pilot measurements let you test its impact across levels 1 to 6 of the Impact Ladder. Use targeted questions and tracking tools to highlight your success and continue improving your MVP.

1. Set up tracking tools

Choose tools, software, or methods that fit your data collection needs (e.g., surveys, databases, analytics tools). Make sure tracking is built into your existing workflows from the start.

Here are some methods you can use:

  • Quantitative data collection: Count services delivered or products used, measure activity levels or the number of people reached, or conduct structured observations.
  • Qualitative methods: Use semi-structured or open interviews, observations, document reviews, or before-and-after surveys.

Ideally, you’ve already created a data collection plan in the previous chapter “Develop your MVP: Test your solution under real-world conditions.” If not, take a moment to go back and do that now.
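For the quantitative side, even a very simple log is enough to start with. A minimal illustration of counting services delivered and people reached, assuming an in-memory log whose field names are made up for the example:

```python
from datetime import date

# Illustrative sketch of a minimal output-indicator log, e.g. workshops
# delivered and participants reached. All entries are invented.

log = [
    {"date": date(2024, 3, 4),  "activity": "workshop",     "participants": 12},
    {"date": date(2024, 3, 11), "activity": "workshop",     "participants": 9},
    {"date": date(2024, 3, 12), "activity": "consultation", "participants": 1},
]

workshops = sum(1 for e in log if e["activity"] == "workshop")
reached = sum(e["participants"] for e in log)
print(f"Workshops delivered: {workshops}, people reached: {reached}")
```

The same counts can of course live in a spreadsheet; what matters is that they are recorded as part of your regular workflow, not reconstructed afterwards.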

2. Use the Impact Ladder to assess your MVP

To see whether your MVP is creating impact across levels 1–6 of the Impact Ladder, you can ask questions like:

  • Step 1 – Activities take place as planned: Were you able to use all planned elements of the MVP? Did you run into any technical issues or barriers?
  • Step 2 – Target groups are reached: How did you hear about our solution? Do you feel like you’re part of the target group?
  • Step 3 – Target groups accept the offering: How satisfied are you overall with the solution? Would you use it again or recommend it?
  • Step 4 – Target group gains awareness or skills: What new knowledge did you gain by using our solution? Has your perspective on [relevant topic] changed? Do you now feel more motivated or better equipped to apply what you’ve learned?
  • Step 5 – Target group changes behavior: Has our solution led you to make any changes in your daily life or work? What concrete steps have you taken after using it?
  • Step 6 – Target group’s living conditions improve: How has your personal or professional situation improved since using our solution? What specific positive changes can you attribute to it?
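One way to turn such answers into a simple score is to record, per participant, which steps the answers support and report the highest consecutive step reached. A hypothetical sketch (the participant data is invented, and how you decide a step is "confirmed" is up to your survey design):

```python
# Hypothetical sketch: scoring the highest consecutive Impact Ladder
# step (1-6) that each participant's survey answers support.

def highest_step(confirmed_steps):
    """Highest step reached, counting up from step 1 without gaps."""
    level = 0
    while level + 1 in confirmed_steps:
        level += 1
    return level

participants = {
    "A": {1, 2, 3, 4},        # gained awareness or skills
    "B": {1, 2, 3},           # accepted the offering
    "C": {1, 2, 3, 4, 5, 6},  # living conditions improved
}

for name, steps in participants.items():
    print(f"Participant {name}: reached step {highest_step(steps)}")
```

Requiring consecutive steps reflects the ladder logic: outcome-level claims (steps 4–6) are only credible if the output levels (steps 1–3) hold as well.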

Next chapter: Funding strategy

You’ve systematically tested your MVP, gathered feedback, and evaluated its impact. Before you start refining your MVP and theory of change, we recommend developing a funding strategy.

That’s what we’ll tackle next – so you can put both your MVP and your funding approach through a real-world stress test.