I got free access to their API about five months ago, and I liked it so much that I converted to a paid plan. I don’t really understand some of the negativity here: OpenAI is now a for-profit company, their GPT-3 APIs solve many difficult use cases, and they charge a very reasonable fee for reliable access.
It is a little sad that just a few large AI-focused companies in the USA and China have the financial resources to build very large models, and I don’t like that situation. I have slowly come to accept that these companies can make a profit selling access; as long as that access is reliable and reasonably priced, the situation is sort of OK.
Just today in the news Google announced an alpha release of TF-GNN that is likely something I will eventually use. Also today, Facebook announced a 2-billion-parameter model that understands a few hundred languages, and if I understood their press announcement correctly, they are going to make the trained model available for free. I don’t have the hardware resources to stand up something like Facebook’s new model, so OpenAI’s approach of running their model on their own servers to power their API works better for me. I added new GPT-3 example chapters to my Common Lisp and Clojure books, so it is good to know that readers can now get API access tokens without waiting.
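For readers curious what the hosted-API workflow looks like, here is a rough Python sketch of building a GPT-3 completion request. The `v1/engines/davinci/completions` endpoint and the `prompt`/`max_tokens`/`temperature` parameters reflect the OpenAI API as it existed around this time; this is an illustrative sketch using only the standard library, not code from my books.

```python
import json
import os
import urllib.request

def build_completion_request(prompt, max_tokens=64):
    """Build (but do not send) an HTTP request for a GPT-3 completion.

    Assumes the 2021-era endpoint and the `davinci` engine name.
    """
    payload = {"prompt": prompt, "max_tokens": max_tokens, "temperature": 0.7}
    return urllib.request.Request(
        "https://api.openai.com/v1/engines/davinci/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Reads the access token from the environment; empty if unset.
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        },
        method="POST",
    )

req = build_completion_request("Summarize: hosted APIs make large models accessible.")
print(req.full_url)
# With a real key set, the request could be sent with urllib.request.urlopen(req)
```

The point of the sketch is how little is needed on the client side: no GPU, no model weights, just an HTTP POST with an access token, which is exactly why the hosted approach works for people without large hardware budgets.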