Hacker News

Does anyone have a good feel for how likely it is that OpenAI might be running it at this price to get companies hooked, with plans to then raise the price later on once everyone is locked in?

I'm personally much more excited about the LLaMA + llama.cpp combo that finally brings GPT-3 class language models to personal hardware. I wrote about why I think that represents a "Stable Diffusion" moment for language models here: https://simonwillison.net/2023/Mar/11/llama/




They want to take the widest possible share, which, at the moment, without competition, means bringing on people/companies that wouldn't otherwise consider it.

The price will only go down when competition appears. They can only slow that down with the cheapest possible offering (to raise the bar for market entry). They don't know what the competition will do, but they know that if they move fast, competitors will have a very low chance of catching up anytime soon, and that's all that matters.

Competition will be interesting because the interface is as simple as it can be, making it easy to switch to a different provider.

Providers can hook people through pre-training, but I don't know if it's possible to do dedicated pre-training on large models like this. They may need to come up with something special for that.


That's how Google Maps did it. Got everyone hooked on cheap tech, then hit them with price hikes of a few orders of magnitude.


I would bet a fair amount of money that they will not raise prices.

It is better for OpenAI to be a utility that is used by a million companies.

Simon, I share your enthusiasm for llama.cpp (from your blog today) and also Hugging Face models. That said, I like self-hostable tools as a fallback; usually I would rather just pay for an API.


Importantly, self-hostability ensures an alternative is available to those who dare. This protects end users against price gouging and lock-in, and contributes to healthy competition, which in turn motivates service providers like ClosedAI et al. to keep improving and adding novel functionality.


It's very likely; they are in a race, and that's the tech playbook for winning a market.


I'm looking forward to models being embedded in video games, for example NPCs built from an LLM. We could have convincing, rich interactions that can even change the world inside the game, given some rules. GPT could be used not only for conversation but also for action. With some internal memory for a character, a prompt like "Is this character going to attack the player?" or "Where is this character going after the conversation?" could lead to very convincing NPC AI and rich stories.

Even intricate emergent simulation would be very interesting: for example, a colony-sim game like RimWorld or Dwarf Fortress where the pawns' AI is directed by a GPT model would be far ahead of what we have today.
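The action-prompt idea above can be sketched in a few lines. This is a hypothetical illustration, not a real game integration: the names (Npc, npc_decide) are invented, and the `complete` callable stands in for whatever text-completion call you wire up (e.g. the ChatGPT API), so the decision logic can be tested with a stub.

```python
from dataclasses import dataclass, field

# The fixed set of actions the NPC is allowed to take; constraining the
# model's answer to this list keeps the game logic deterministic.
ACTIONS = ("attack", "flee", "talk", "trade")

@dataclass
class Npc:
    name: str
    disposition: str                              # e.g. "hostile", "friendly"
    memory: list = field(default_factory=list)    # recent events, oldest first

def npc_decide(npc, player_utterance, complete):
    """Ask the model for the NPC's next action, constrained to ACTIONS.

    `complete` is any prompt -> text function (an LLM API call in practice).
    """
    prompt = (
        f"You control the NPC '{npc.name}' ({npc.disposition}).\n"
        f"Recent memory: {'; '.join(npc.memory) or 'nothing'}\n"
        f'The player says: "{player_utterance}"\n'
        f"Reply with exactly one word from {ACTIONS}."
    )
    reply = complete(prompt).strip().lower()
    # Fall back to a safe default if the model ignores the constraint.
    action = reply if reply in ACTIONS else "talk"
    # The per-character "internal memory" mentioned above: append this turn
    # so the next prompt sees what happened.
    npc.memory.append(f"player said '{player_utterance}', NPC chose {action}")
    return action
```

The key design choice is validating the model's free-text reply against a closed action list and keeping a rolling memory that is fed back into the next prompt; everything else (the prompt wording, the fallback action) is an assumption to be tuned per game.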


I pointed that out in the caveats, since that's what happened with Google Maps, but in practice I don't think it'll happen (or, if it does, it will only be a slight increase), since that would seriously upset users. Especially since the low price was likely due to competition anyway.

In the case of Google Maps it was effectively a monopoly.


Being a monopoly is what OpenAI is aiming for.


Specifically, in the case of Google Maps it was a de facto monopoly, and thus had full control of pricing, due to the lack of good competitors (OpenStreetMap doesn't count).

For LLMs, by contrast, competition is very fierce, which will push prices down, as seen here with the ChatGPT API.


The Google Maps pricing change was the best thing that happened to other map providers. I've never seen so many websites and apps using OpenStreetMap, Yandex Maps and Apple Maps.

That pricing change was extremely short-sighted: they thought no one would switch, but their competitors were ready with easy-to-integrate APIs and much better pricing.


Depends how much competition ends up in this market. If there is plenty of competition giving good results at similar cost, raising prices will be difficult. But if it actually costs far more to run than the current API price, we'll see it go up.



