
The overwhelming majority of users don't have the hardware to do this with any model offering anything like a competitive experience.



And most people won't bother doing it anyway. How many people use DDG? How many people use FOSS alternatives that offer a lesser experience in exchange for being FOSS? 99% don't care.


Any M-series Mac can already run local models with decent performance.
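
For what it's worth, a minimal sketch of what "running a local model" looks like in practice, using llama-cpp-python with Metal offload on an M-series Mac. The model path is just a placeholder for whatever GGUF file you've downloaded:

    # a minimal sketch, assuming llama-cpp-python is installed and a GGUF
    # model has already been downloaded (the path below is a placeholder)
    from llama_cpp import Llama

    llm = Llama(
        model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder file
        n_ctx=4096,        # context window
        n_gpu_layers=-1,   # offload all layers to the GPU (Metal on Apple Silicon)
    )

    out = llm("Write the opening paragraph of a short story.", max_tokens=200)
    print(out["choices"][0]["text"])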


1. Not models that are in any way competitive with things like OpenAI. Not even close.

2. A typical user is not even remotely close to using hardware like that. You live in a bubble.


#1 Depends on what you're competing with.

I, for example, like to use "AI" to assist me in writing NSFW short stories. You can't do that with any of the public ones without hitting the guard rails HARD. Even with extreme prompt massaging, the AI will start to moralise and editorialise the actions so much that it affects the output.

As for #2, MacBooks are so ubiquitous that they're a meme in themselves. Go into any classroom or fancy coffee place and I'd bet over 80% of the laptops there have an Apple logo, and a good portion of those are new enough to have an M-series processor.

On the PC side you're right: even a semi-high-end Intel/AMD CPU isn't powerful enough to be practical for LLMs. You need a dedicated Nvidia GPU for that, and those are only found in "gaming" laptops.


> "AI" to assist me in writing NSFW short stories. Can't do that with any of the public ones without hitting the guard rails HARD

It's actually not trivial to have a GPT-2 conversation that doesn't descend into NSFW territory. It will eagerly steer you there at even the most remote opportunity.


With a local LLM, yes; with any of the public services like Gemini or OpenAI's GPT-3.5 or 4, it's currently nearly impossible.

It used to be possible to prompt-engineer GPT-3.5 not to moralise on every single thing, but that can't be done anymore.



I guess it probably won't matter in the long run as hardware requirements continue to come down, but the "overwhelming majority of users" don't have anything close to an M-series Mac right now. More like a Galaxy S3, if that.


I'm predicting that phones will get AI-optimised chips in the next 2-3 years, enough to run specific optimised models on-device.

They'll get specific models trained to do things that make sense locally + the ability to fetch data from the cloud if needed.


For how long? Remember, mainframes gave way to PCs, and those gave way to even smaller smartphones.


An Oculus app trained to replace billboards with comforting landscapes. Everybody would pay for this.


Sometimes it's more profitable to refuse to give people something they'd pay for. I'd put money on things like the Oculus being used to paste ads all over comforting landscapes instead of eliminating ads. I wouldn't be surprised if using Oculus devices to modify or remove ads eventually ends up explicitly against the ToS, with being caught risking a bricked device and a ban from all Meta-owned platforms.


Most billboards are designed to be visible on interstates and highways, where wearing VR goggles wouldn't be allowed.

As for "removing ads", why would an ad company like Meta do that when they could get paid more to dynamically replace static billboards with targeted advertising delivered through goggles?


Why would someone expect the app to come from Meta? You buy the Oculus, then install and use whatever you wish - including an app like this.


Who controls what gets to be sold on the Oculus Store? Do you think the advertisers that spend billions on FB/IG are going to be OK with Meta trying to eliminate another advertising channel?


It wouldn't be the first device to be hacked, or opened under the law.



