I think there's some degree of nuance that isn't captured by comments like yours. I agree with your comment in principle, but there are interpretations of this method that are too extreme, and that's where most of the bad MVPisms come from (and what I assume this blog post is a response to). As with any process, what most people are doing is not what was intended, and so we have to discuss what people are doing more so than the theory.
Data is only as valuable as the insight, understanding and intuition at the disposal of those gathering and actioning the data. There are organisations that truly embrace this idea of "we know nothing, the data will guide us" and they get lost because they're chasing a bunch of numbers for the sake of numbers on weak experiments that don't marry up to anything of meaning.
You need some amount of difficult-to-define "understanding" and "intuition" to be able to turn any learnings (whether gathered from experience or data) into a product: without it, you'll fail, regardless of process. There are many successful products that started out as someone absolutely committed to a belief in what the market needed; there are many successful products that were developed by following the data from day one; and likewise there are many unsuccessful products following both patterns.
My personal disdain for MVP-ing is exactly because it is so difficult to get right, so easy to get wrong, and leaves little room for corrective action: I'm sure we all have experience with zombie startups, years old, still frantically "MVPing" everything under the sun. Personally, I'd much rather work within an organisation that has a clear vision about the problem they're trying to solve with flexibility around how it is to be solved, even if it ultimately leads to failure.
Building an MVP means getting something in front of customers as soon as possible, in order to test that your product works for customers and to change it if you're wrong. The entire point of an MVP is to give you as much time to change things as possible.
Everything you do prior to putting your product in front of someone might be wrong. The only validation that counts is the customer handing you their money. The longer you spend building before getting something out there, the more expensive any mistakes become. Consequently, making an MVP and building only the absolute minimum viable solution to a problem gives you as much correction time as possible.
>I'd much rather work within an organisation that has a clear vision about the problem they're trying to solve with flexibility around how it is to be solved
We all would. The question is about how the org goes about getting clarity about the problem. If they think they fully understand it without talking to customers or putting something in front of customers in order to give the customer something to talk about, run the hell away. Those discussions will happen as soon as the customer sees something. If you've built a full product that you think is ideal, and then the customer doesn't agree, then you either lose the customer or you have to do a ton of work to change the product. Those are expensive mistakes that kill a startup. Having those conversations as early as possible is how you avoid that cost.
When I say there's little room for corrective action, I am specifically speaking about the choice to pursue the MVP strategy. A person who starts a company and decides to embrace the MVP strategy is stuck on a hamster wheel: if they've spent a year and not validated anything positively, they cannot "just" change strategy, because the organisation has been built atop the idea of small ideas, not big ideas.
My experience is that most businesses that pursue the MVP strategy never leave the MVP stage, regardless of the viability of the thing that originally inspired them, because they believe to MVP properly is to surrender all creativity, vision and intuition to numbers, which is in turn to surrender what's important when building anything.
No matter what strategy you choose, there has to be a meaningful amount of intuition and understanding to be able to make decisions: a purely data driven approach (with arbitrary goals like "someone paid us") is predicated on product development being entirely quantitative. Customers lie not just in what they say but also in what they do: getting a customer to pay is very easy compared to, say, building a sustainable business. People would have paid for faster horses!
People who espouse the success of the MVP strategy in their own businesses are often people with a level of creativity/intuition/understanding that most people do not have. MVP is a powerful tool in the right hands, absolutely, and I don't doubt that you're one such person, but MVP as a strategy for "normal" people has so many foot guns that I would never recommend it over a grand vision strategy. How many people are capable of picking the right customers to talk to? How many people are capable of understanding customer behaviour? How many people are capable of balancing outlier vs. representative behaviour?
I believe that normal people will have a higher chance of success by pursuing a vision than they would by following customer behaviour from the start.
You're expecting normal people to successfully invent the car? I have much more faith in them forming a relationship with a paying customer and offering a small process improvement through an MVP.
An MVP will also help with appropriately timing your product for the market. If the client is asking for a faster horse, you might want to look at innovating on the saddle or the horseshoe first. You might be 1000 years too early with your car engine idea.
The entire idea of MVPs is to leave as much room for corrective action as possible.
It doesn't happen often, but when it happens it's really annoying that society is so reluctant to embrace an idea that people won't even acknowledge the idea exists. If you name it, they will change the word's meaning to the opposite of the thing you named. If you show it by example, they will completely ignore your idea in the example, and only ever accept irrelevant details.
We are seeing an example here, with MVPs. People really dislike anything lean and anything scientific. Merge the two ideas and you have a sure mind-repellent.
It's just that the acronym wasn't longer, such as MVPI (Minimum Viable Product Iteration) or MVPLL (Minimum Viable Product Learning Loop).
>Customers lie, but so does data.
The "data" in the MVP iteration framework doesn't lie because the relevant data is "paying customers". Either the startup has growing revenue from its product or it doesn't.