So in today's startups, founders need to know about something called the minimum viable product, or MVP. Instead of building a complete, finished product, we're actually going to get out of the building and test pieces of the product incrementally and iteratively. And the question is: what are you going to test, why are you going to test it, and when are you going to test it? The idea is that at the very beginning of the company you're trying to answer two very general questions. One is: is the problem I think customers have, or the need they have, actually shared by anybody other than me? That is, while I believe this, can I find other people who have this problem or need? And the way you elicit some of those responses might be something as simple as a website that says, do you have this problem, click here. Or you could simply say in a one-on-one interview, I'm trying to understand whether people share this problem. The next step is trying to understand whether the class of solution you're building, not the specific product but the thing you're building in general, solves the problem you've now validated. Gee, what if we had a website that did X, would that be exciting or interesting to you? Interesting enough that you would use it or pay for it? Or what if we had a physical device that did X or Y? When you're trying to test the solution, mockups, wireframes, or clay prototypes of just the general idea often elicit feedback a lot quicker and better than a set of verbal descriptions.
One of the hardest problems entrepreneurs encounter, particularly technical founders, is getting over the feeling that the MVP has to be bug-free and complete. Actually, it's the antithesis of this. You're not building a product; you're trying to elicit responses from customers. And while somebody might say, well, I need another week to finish the code, the real question is: could we do a cheap hack with a PowerPoint slide, or something else, just to figure out whether somebody would give us some answers? It might come down to needing some functionality that really requires code, but the first question you want to ask yourself is: is there some way to get there without doing all the engineering? Because the product probably will be wrong at this point in time. It's not until you have something repeatable, where people are saying, yes, just finish this feature set and we'll buy it, that you actually want to start committing to real, serious development.
The other key idea to remember is that you're using these MVPs to run experiments. You're not just throwing them out there; when you go out of the building with an MVP, you have a series of hypotheses you're testing. And we make you write down what you expect to happen in each experiment. Not just go out and find out what happens, but what do you think is going to happen? Steve, I think if I show people this feature, 45 percent of them are going to raise their hand and give me a check. Great, now let's go out and test it. What happened? No one liked it. Well, you know what, that wasn't a failure; that actually saved you from six months of development. So what we're actually going to do is enumerate the key characteristics of the test, take that data, and use it to modify our hypotheses.
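The discipline described here, commit to a predicted outcome before the experiment, then compare it with what actually happened, can be sketched as a tiny record-keeping helper. This is a minimal illustration only; the class, field names, the 45 percent figure, and the sample answers are all assumptions made up for the example, not anything from an actual team.

```python
# Illustrative sketch of hypothesis-driven MVP testing: write down the
# expected outcome BEFORE running the experiment, record what customers
# actually did, then compare. All names and numbers are hypothetical.

from dataclasses import dataclass


@dataclass
class Experiment:
    hypothesis: str       # what we believe, stated up front
    expected_rate: float  # pass/fail prediction, e.g. 0.45 = 45% say yes
    prospects: int = 0    # people shown the MVP
    conversions: int = 0  # people who said yes (signed up, paid, etc.)

    def record(self, said_yes: bool) -> None:
        self.prospects += 1
        if said_yes:
            self.conversions += 1

    @property
    def actual_rate(self) -> float:
        return self.conversions / self.prospects if self.prospects else 0.0

    def validated(self) -> bool:
        # Did reality meet the prediction we committed to in advance?
        return self.actual_rate >= self.expected_rate


exp = Experiment("Prospects will pre-order after seeing the feature",
                 expected_rate=0.45)
# One simulated round of customer interviews: 1 yes out of 10.
for answer in [True] + [False] * 9:
    exp.record(answer)

print(f"expected {exp.expected_rate:.0%}, "
      f"got {exp.actual_rate:.0%}, validated={exp.validated()}")
# A failed prediction isn't a failed experiment: it's data that says
# modify the hypothesis before committing months of development.
```

The point isn't the code; it's that the expected number is written down first, so a miss is unambiguous and feeds directly into the next iteration of the hypotheses.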
One of the biggest mistakes entrepreneurs make in creating MVPs is thinking that the MVP has to have any connection to the finished product. It would be nice if the MVPs were dragging you towards incremental completion of the product, but taking left and right turns cheaply and inexpensively is equally good. So again, the biggest trap is falling into the gee, I have to write the code, or it has to match what our key architectural features are going to be later on. What you're trying to do is maximize learning.
One of the good examples that comes to mind for me about an MVP was a team building a robotic lawnmower that naturally turned into a precision agriculture device. One of the key things they needed to prove to me and to themselves, one of their key hypotheses, was whether they could tell the difference between a weed and a plant. And they said, Steve, we've got it, it works great on the bench, look at all of our science and technology. And I said, well, that really doesn't matter; how is it going to work out in a dirty, dusty, noisy agricultural field? And they said, well, maybe a year from now we'll build a prototype and get it out there. And I said, no, no, no, you need to do this in a class. They said, we can't build the robot. I said, no, I don't care. Why don't you put all the machine vision equipment on a cart you pull through the field, and see if at least the machine vision equipment, looking at dirt in an agricultural row, can tell the difference in a live field rather than some simulated environment? And lo and behold, they worked parts of their body off and actually figured out how to do that within a couple of weeks. They were dragging a prototype of the robot through the field, actually trying to look for weeds versus plants. And they discovered, lo and behold, holy cow, we can do this. It cut probably a year off their development cycle, and it gave them, and later on after the class potential investors, confidence that their technology would actually work.