Build, Measure, Learn: How to Test Your MVP (Minimum Viable Product)
Abby Ross is Co-Founder and Chief Product Officer at ThinkCERCA, an EdTech company that builds students’ critical thinking skills through writing. Abby recently shared her expertise in testing minimum viable products at the 2017 PE Summit in Chicago.
Most startups have adopted the Lean Startup methodology popularized by Eric Ries. Essentially, the Lean Startup approach gives startups a method for quickly testing their assumptions about what they’re building before they spend time, resources and cash creating something that doesn’t solve their customers’ problems.
At the center of the Lean Startup methodology is the concept of the MVP, or Minimum Viable Product. An MVP is an early-stage version of the final product, but it only has features sufficient for learning and getting feedback from the earliest customers. Startups use the MVP as part of the “Build, Measure, Learn” feedback loop. First, the MVP is built and tested, then data points are gathered and measured, and finally the startup learns enough to iterate on the MVP, until the product has reached a stage where it can confidently go to market.
Abby Ross and her co-founder utilized “Build, Measure, Learn” as they tested the MVP for their company, ThinkCERCA, and they continued to apply the methodology to their company as they built out new features.
Photo credit: Amanda Gentile/ADG Photography
Build: Create your MVP (Minimum Viable Product)
Building an MVP shouldn’t take long, nor should it require tons of resources. Abby launched ThinkCERCA’s MVP and got it in the hands of 30 paying customers (English departments at area schools) in only eight months. “You can build and launch an MVP in one hour,” says Abby. “There should be nothing long about an MVP.” Abby’s suggestion: start with paper and pencil. “Paper and pencil are your best friends. You can do a paper-based prototype in under 15 minutes.” Abby suggests using paper and pencil to map out each page of the website. “If someone doesn’t know what to do [when they see your map], you can rip it up and draw up a new sheet of paper.”
Figure out the most important thing your target customers need; that should be the core of what you build for your MVP. What you decide to build should solve a very specific and compelling problem, while still being something that makes your company competitive:
Are you serving a specific target audience? Narrow your audience down first; you can scale it back up later.
Does it address a pain point for your users? You can’t be everything for everybody—focus on solving a specific problem. “I went in and sat in classrooms and principals’ offices—I just watched what they do everyday, which really helped me develop some empathy. I just really wanted to understand what the true pain point of educators was.”
Is it testable? “Almost anything is testable, but you have to make sure you know what metrics of success you are testing,” says Abby.
Can it be built and launched in a week?
Building an MVP should be a process of continuous improvement, starting from a core, foundational idea. Additions to the MVP should be based on rigorous testing and made iteratively. “It’s a continuous process of iteration,” says Abby, “[and] you’re adding one thing to make [the MVP] better each time.”
Measure: Take a Hard Look at the Data
“Having an MVP mentality [means] you’re going to be testing a lot. You’re going to build something small, you’re going to get some feedback, and then hopefully you’re going to [learn] and grow from that.”
Abby recalls one of her first tests of a ThinkCERCA prototype: she and her co-founder traveled to a conference to put the MVP in front of teachers and get their feedback about the product. After teachers interacted with the product, Abby and her co-founder presented them with a short survey that asked teachers to rate, on a scale of 1 to 5, how likely they were to use the ThinkCERCA software in their classrooms. At the end of that first day of the conference, Abby and her co-founder were completely devastated.
“We looked at the data at the end of the night, and we ended up with a 1.2 [average score]. NOBODY wanted to use this product we had built. It was heartbreaking. We had been so excited about this thing we had built, but what it actually ended up being was exactly what we didn’t want to build.”
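Tallying a quick survey like this takes only a few lines. A minimal sketch (the response values below are hypothetical, chosen so the mean lands at 1.2 like Abby’s):

```python
# Hypothetical 1-to-5 ratings collected from teachers at the conference demo.
responses = [1, 2, 1, 1, 2, 1, 1, 1, 1, 1]

average = sum(responses) / len(responses)
print(f"Average score: {average:.1f}")  # prints "Average score: 1.2"
```

Even a crude average like this is enough to trigger a pivot: on a 1-to-5 scale, anything near the floor says the product misses the mark.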
Learn: Iterate Based on the Evidence
Abby stresses that when you build a feature for your MVP, it’s important to watch the actual human interaction with that new feature and learn from that interaction. The feature may not always be necessary, and you’ll find that out from watching how people experience that feature. “Ask people to rate their interest in what they did, or to rate the feature [you built],” says Abby. “That should give you a really nice understanding of where the value in your product is as you iterate.”
“It’s been proven time and time again that 80 percent of your users utilize 20 percent of your features, and likely they’re all using the same 20 percent,” says Abby. “Make that 20 percent really count.”
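One way to see whether usage really concentrates the way Abby describes is to log which feature each user interaction touches and compute what share of events the top 20 percent of features account for. A sketch with made-up feature names and event counts:

```python
from collections import Counter

# Hypothetical usage log: one feature name per recorded user interaction.
events = (["editor"] * 50 + ["rubric"] * 30 + ["export"] * 10
          + ["themes"] * 5 + ["badges"] * 5)

counts = Counter(events)
total = sum(counts.values())
top_n = max(1, len(counts) // 5)          # the top 20% of features
top_share = sum(c for _, c in counts.most_common(top_n)) / total
print(f"Top {top_n} feature(s) account for {top_share:.0%} of usage")
```

If the top slice dominates, that slice is the 20 percent to "make really count"; features at the tail are candidates to cut rather than polish.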
Abby’s advice comes from her Chicago #PESummit17 workshop, “How to Test Your MVP (Minimum Viable Product).” Listen to the entire workshop (and take note of the resources Abby shares) on episode 57 of our podcast, and subscribe on iTunes, Google Play or SoundCloud.