This is a clickbait-y headline, what actually happened is that OpenAI rolled out a new feature for summarizing PDFs, which a few folks had already oriented their startup around.
So now that this previously 3rd party feature is fully integrated into ChatGPT, that effectively kills any separate app trying to do the same thing.
The fact is, building a whole company that is basically a wrapper around ChatGPT never had any moat to begin with, and I'm not at all surprised that OpenAI would see what cool things people are building on top of ChatGPT and decide to offer that functionality themselves. It's predatory for sure, similar to what Apple has done over time by copying iOS features enabled by jailbreaking, but not surprising in the least.
I don't think it's predatory. I think they had it (and 700 other things) in their roadmap already. Anyone building a startup around OpenAI needs to at least consider: "is this a pretty obvious thing, already on their roadmap?".
If you are building a startup around the OpenAI APIs (especially anything that’s quick, low-hanging fruit with little special sauce), either it’s misguided or you are putting it on their roadmap.
That's generally true, and OpenAI, especially with GPT-4, seems eager to roll features into the core offering.
Fair perhaps, but OpenAI doesn't need to build everything on their roadmap (what company actually does?). They could likely save themselves a good chunk of time to focus on what's really important (zero to one stuff), rather than reimplementing what someone else has already done.
Partnering with these smaller companies could be a good way for OpenAI to foster its community and spread the benefits of AI more widely; instead, my perception is that they're punishing the little guy for having a good idea and being able to act on it quickly.
> So now that this previously 3rd party feature is fully integrated into ChatGPT, that effectively kills any separate app trying to do the same thing.
I honestly take this as a litmus test for people who understand the word "positioning".
There's still a massive market for people who want a place with a UX focused around their documents and a specific task, vs ChatGPT's Swiss Army knife approach.
If anything Google is probably the biggest danger to these startups as they start to launch their copilots for docs... not ChatGPT getting this ability
Yep, when MS + Apple + Google fully integrate ChatGPT and post-ChatGPT tech into their products, it’s basically game over for most “AI” startups. Same as how those same companies commoditized Dropbox into irrelevance.
Dropbox is a great example because they focused on positioning to endure
Now their pitch is more focused on collaboration and versioning, and better tailored towards creatives than home users as it once was
Positioning is what lets you win when your core value proposition becomes common: it won't work as well as when your value prop wasn't common, but it can still be highly profitable
I just looked it up and I was surprised to see Microsoft OneDrive launched only a few months after Dropbox. Seems like it did take a few years for the MS juggernaut to win out.
And also some others. The only one I can think of off the top of my head is the unobtrusive brightness bar, which had been available for jailbroken phones long before Apple added it in iOS 10.
Yes I remember the founder of pdf.ai trying to collect funding, and chatpdf was already a good tool even before that.
There were a lot of creators making a bunch of apps like this.
I’ve made a significant amount of money from a “talk-to-your-files” website and at no point did I feel entitled to OpenAI not competing with me. The writing was on the wall 6 months ago.
In our case we used it to learn about very specific niches that we pivoted to 3 months ago. OpenAI is pretty clearly going to do everything possible to make ChatGPT an enterprise workhorse. If you compete directly with them, you’ll get slaughtered.
Same experience here. When I built my current OpenAI project, it was obvious that my main selling point would be being early; there was no way there wouldn't be hundreds of (even better) alternatives a few months later.
My competitors now have very polished interfaces; I didn't, but I was early. Now the other tools are obviously more interesting and I am happy to move on to the next fun thing :)
This shouldn't be a surprise to anyone. OpenAI has a problem: ChatGPT isn't going to exist in the long run in its current state. They have to build as many features into the product as fast as possible, or they will lose all their customers to Google, Microsoft, Meta, and many other products built on models that aren't OpenAI's.
Why would I use ChatGPT to process my PDF when I can do it straight from my Google Drive, browser, or device? Why would I use ChatGPT to write a report if I can do it straight from Google Docs or Word? Why would I use ChatGPT to write an email reply for me when I can do it straight from Gmail or Outlook?
Now, with open-source LLMs, soon anyone will be able to build AI products without needing to pay OpenAI for API requests.
IMO OpenAI is heavily overvalued, especially with how fast open-source LLMs are improving. It's highly unlikely that ChatGPT as it stands today will survive this decade. OpenAI will need to build better products than ChatGPT, or they will lose.
Why would I use Word or Docs, if I can do it straight with ChatGPT? Why use Outlook if I can send messages straight from ChatGPT? You have a point, OpenAI has a challenge, but this could also be the opportunity of the century - creating a new OS that redefines how we interact with PCs.
So-called 'AI companies', especially ChatGPT wrappers built solely on top of someone else's platform, keep making the same mistake and always risk getting sherlocked.
Jasper.ai is a great example of a startup at complete risk after Sam Altman said that "We’re not trying to go and compete with our partners" [1], which was a complete and obvious lie.
"OpenAI wins by default if they start to compete against you." [0]
This warning came about after many startups built their products on top of Facebook's or Twitter's APIs (who remembers Facebook apps?), and the platforms later changed in ways that nerfed those startups. This isn't quite the same, as OpenAI is effectively being used as just a utility, for which there are already very good alternatives.
It's common sense that megacorps will make everything free to use outside a subscription model. It's the lack of business acumen in these start-ups that makes most innovations fail.
I’m not a subscriber to OpenAI. I don’t care for it. Now, I get mail all the time in my non-native language; if I can pop that in my scanner, send it to your service, and get it back in English, I’ll pay for that. I don’t need a chat bot.
Please just save me from playing the “convince my phone to focus on the page so Google translate will work” game.
I wouldn't trust an AI like GPT to do translations. It's bound to hallucinate things that are not in the original writing. You'll just want to use Google Translate.
GPT AIs are great for creative works where it doesn't matter if they just make things up.
I can confirm that this is my impression as well. Also, ChatGPT is somehow much better at picking the right words when automatically translating i18n files with individual words that only work in context. Even 3.5-turbo outperforms Google or DeepL for me.
However, it definitely tends to make things up. I would be very cautious translating longer business texts if you can't correct them yourself.
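A minimal sketch of that i18n workflow, assuming the OpenAI Python client (the model name, prompt wording, and helper names are my own choices, not anything from this thread): batching the keys together with their identifiers gives the model the UI context those single words need.

```python
import json

def build_translation_prompt(entries: dict, target_lang: str) -> str:
    """Build one prompt that includes each i18n key as context,
    so short strings like "Open" get translated for their UI slot."""
    payload = json.dumps(entries, ensure_ascii=False, indent=2)
    return (
        f"Translate the JSON string values below to {target_lang}. "
        "The keys describe where each string appears in the UI; use them "
        "as context and keep them unchanged. Reply with JSON only.\n\n"
        + payload
    )

def parse_translation_reply(reply: str) -> dict:
    """Parse the model's JSON reply, tolerating surrounding prose or fences."""
    start, end = reply.find("{"), reply.rfind("}")
    if start == -1 or end == -1:
        raise ValueError("no JSON object in model reply")
    return json.loads(reply[start : end + 1])

# Hypothetical call site -- requires the `openai` package and an API key:
# client = openai.OpenAI()
# resp = client.chat.completions.create(
#     model="gpt-3.5-turbo",
#     messages=[{"role": "user",
#                "content": build_translation_prompt(strings, "German")}],
# )
# translated = parse_translation_reply(resp.choices[0].message.content)
```

Diffing the returned keys against the input catches the "making things up" failure mode before anything lands in your locale files.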
I presume the question when building something is whether it's a feature or a product/service. Most people are treating "chat with your PDFs" as a sort of semantic search with LLM support that delivers "conversations" back in the UI. This kind of thing has been part of the Azure OpenAI playground since July.
Tried that and failed. I know it's going to work a few months from now, or maybe I just didn't find the latest tool, but to get the same availability I would need to spend much more on hardware than on ChatGPT. Rented or bought, both are expensive.
If your use case needs 2-5 servers running a 13B or 70B model just to keep response time low, it doesn't scale right now.
You don't know how to run the infra for this then. For batch workloads, I can get slightly higher throughput with a 13b llama2 fine tune on my local workstation and 2x 3090s than I can get by saturating my OpenAI api key token limits for 3.5-turbo.
That works for batch workloads, but how could I scale from 0 to 30 messages per minute that need reasonably fast processing?
I tried with one 'small' server, but since I also allow bigger requests from users, it sometimes takes over a minute to process. So I added a second, smaller server that handles only smaller queries, and only when the bigger one is already in use. That works for my average workload of 1-3 messages a minute, with response times similar to OpenAI's, and costs me over $1000 a month on vast.ai or ~$6000 in graphics cards (2x 3090 + 2x 4090, not counting power).
But that doesn't work on the weekend, when I process up to 30 messages a minute. So I'd need another two 2x 4090 servers or more, adding another $1500 a month on vast.ai or $6000 in hardware.
Every card I actually owned would add about $50 to $70 to my monthly power cost.
Also, vast.ai is kind of a gamble, but it's one of the few ways to get access to cheap consumer cards. Renting servers from actual companies in data centers drives the price up considerably.
All while my highest monthly OpenAI bill was about $600 so far (with generous limits).
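The rent-vs-buy arithmetic behind those numbers can be made explicit. A sketch using the figures from this thread (the per-card power estimate is the $50-70/month range mentioned above; everything else is as quoted):

```python
def months_to_break_even(hardware_cost: float,
                         rental_per_month: float,
                         power_per_month: float) -> float:
    """Months until buying cards beats renting, given the monthly
    rental bill they would replace and their own power cost."""
    monthly_saving = rental_per_month - power_per_month
    if monthly_saving <= 0:
        return float("inf")  # owning never pays off
    return hardware_cost / monthly_saving

# Thread figures: $6000 in cards vs. ~$1000/month on vast.ai,
# with 4 cards drawing roughly $60/month each in power.
print(months_to_break_even(6000, 1000, 4 * 60))  # ~7.9 months
```

Which is why the comparison still loses: either option costs more per month than the ~$600 OpenAI bill it replaces.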
Uh, not sure how I missed that last line in your original post about this being a more real-time requirement.
No, that for sure complicates things if you don't have a "base load" you can point your dedicated compute at when your real-time workload is below capacity. You would need to design (or use) a hybrid local/cloud scaling solution here if you want to keep your costs low.
Something that can spin up some workers when your request queue grows, and kills them when it shrinks.
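That queue-driven policy reduces to a pure sizing function. A sketch (the thresholds and per-worker throughput figure are illustrative assumptions, not numbers from this thread):

```python
import math

def desired_workers(queue_len: int,
                    msgs_per_worker_per_min: float,
                    min_workers: int = 1,
                    max_workers: int = 8) -> int:
    """How many GPU workers to run so the current queue drains in
    roughly one minute, clamped to a minimum (your base load) and
    a maximum (your budget cap)."""
    needed = math.ceil(queue_len / msgs_per_worker_per_min)
    return max(min_workers, min(max_workers, needed))

# Quiet weekday: 3 msgs queued, 5 msgs/min per worker -> 1 worker.
# Weekend spike: 30 msgs queued -> 6 workers, killed again once it drains.
```

A small control loop can poll the queue every minute, call this, and rent or release vast.ai instances to match; the clamp keeps a spike from renting unbounded capacity.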