I think this kind of concierge / person-behind-the-curtain approach only works for a certain type of business though, usually services where steps in the pipeline can be done either by a human or by automation.
For purer technology / product businesses, how do you do this, fundamentally? How would Google have manually mocked up their early product? How would Facebook? Github? Tesla for that matter?
Sometimes you really do just unavoidably have to build the product out before testing the market, and if it doesn't work, accept the sunk cost and throw it away; sometimes you fail completely as a result.
I don't see this as a fundamentally solvable inefficiency, just part of how tech product startups work, and the very tradeoff that must be made to allow for innovation to happen.
There are still smaller pieces you can MVP to a smaller audience before launching them to the world.
> Google have manually mocked up their early product? How would
Crawl an intentional community (remember webrings?) or other small directed subset of the web and see if you're able to get better search results using terms you know exist in the corpus, rather than all of the Internet.
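A minimal sketch of that idea, with a hypothetical hand-picked mini-corpus standing in for the crawled webring (no real crawling, just an in-memory inverted index you can query with terms you know exist):

```python
from collections import defaultdict

# Hypothetical pages, standing in for a small crawled community
corpus = {
    "page1": "electric cars and battery factories",
    "page2": "search engines rank pages by links",
    "page3": "kernel developers share patches by email",
}

# Build an inverted index: term -> set of page ids containing it
index = defaultdict(set)
for page_id, text in corpus.items():
    for term in text.lower().split():
        index[term].add(page_id)

def search(query):
    """Return the pages containing every term in the query."""
    terms = query.lower().split()
    if not terms:
        return set()
    results = index[terms[0]].copy()
    for term in terms[1:]:
        results &= index[term]  # intersect: all terms must match
    return results
```

With a corpus this small you can eyeball whether the results are actually better than what existing engines return for the same terms, which is the whole point of the experiment.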
> Facebook?
They had Myspace as an example so the idea wasn't exactly unproven.
> Github?
Kernel developers were already using Git, successfully; all (for large values of "all") GitHub did was bring it to a wider market with a better UI.
> Tesla for that matter?
People don't get this, but Tesla's biggest achievement to date isn't the cars themselves, but the factory they're built in. There's no way to MVP an entire factory, but building a car in a bespoke, pre-assembly fashion is entirely possible; it just doesn't scale.
If you're asking if electric cars were known to work, the first one came out in 1832. If you're asking about product-market fit, they keep selling cars they haven't made yet, just to gauge demand. Aka where's my Cybertruck!?
> > Google have manually mocked up their early product? How would
> Crawl an intentional community (remember webrings?) or other small directed subset of the web and see if you're able to get better search results using terms you know exist in the corpus, rather than all of the Internet.
But that isn't a mock up, it's the real thing but on a smaller dataset. If you're going to do the real thing anyway, why not run it on all the data you can?
After all, the limiting factor for release is the engine, not the dataset. If you're going to write the full engine anyway, there's nothing to be saved by limiting it to a subset of the data.
Google was manually editing and merging Perl scripts to get web scraping data almost every day early on. Yahoo manually verified content and added it to lists with hand-typed summaries for years, even up to the time Google came on the scene.
You are right that some businesses scale better doing high touch customer service like this. In the case of Pilot, you have the lead sales guy and accounting domain expert (CEO) asking SaaS customers to schedule enterprise service sales calls with him.
Which makes total sense. What he’s not automating is setting up Marketo or Hubspot and Drip or Constant Contact and committing to some CRM system that is both impersonal and adds friction when you’re going for quality over quantity.
He could hire BDRs and CSMs or outsource and spam and set up AI and knowledge bases, and possibly scale up faster.
But not only would that take away the personal touch and competitive advantage, he’d lose the opportunity to educate himself on what real customers really need.
Not to mention all the time spent evaluating tools and setting up automation and negotiating contracts that lock you in to a specific process that might not be what you want 2 years down the line when your business changes.