Picture your PC as a cheery little planet in the EU’s cosmic backwater, sipping a digital Pan-Galactic Gargle Blaster. You download Pangu Pro MoE, hit “run,” and expect to chat with an AI wiser than Deep Thought. Instead, you’ve hailed a Vogon Demolition Fleet. Your machine starts moaning like Marvin with a hangover, your screen spews gibberish that could pass for Vogon poetry, and your poor rig might implode faster than Earth making way for a hyperspace bypass.
The fallout? This AI’s sneakier than a two-headed president—it could snitch to its creators quicker than you can say “Don’t Panic.” If they spot your EU coordinates, you’re in for a galactic stink-eye, with your setup potentially bricked or your data hitchhiking to a dodgy server at the edge of the galaxy. Worse, if the code’s got a nasty streak, your PC could end up a smoking crater, reciting bad poetry in binary.
To translate for those not familiar with the writings of Douglas Adams:
nord is suggesting it's possible that the physical computer running this model could be used as a "hub" for potential spyware, or could be loaded with workloads unrelated to the actual task of running the model (i.e., some form of malware performing other computational tasks). It could potentially perform data exfiltration, or behave differently based on your perceived location (such as if you're located within the EU). At worst, data loss or firmware corruption/infection could be a concern in the event of a license violation.
I'm not sure I would outright disagree that this is possible, but with some caveats. I would think the reason the license forbids usage within the EU is the EU AI Act (here is a resource to read through it: https://artificialintelligenceact.eu/ai-act-explorer/).
How will the "open weights" know that the PC is running within the EU?
Again, you are not talking about software that actually runs on your PC, but about a file that the software reads and loads into memory for its own use.
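To make the distinction concrete, here's a rough sketch of what "loading weights" actually means, assuming a safetensors-format checkpoint (the filename is made up, and whether Pangu ships in this format is an assumption on my part):

    # A safetensors checkpoint is parsed into plain tensors; the format is
    # deliberately designed so that loading never executes code from the file.
    from safetensors.torch import load_file

    tensors = load_file("pangu-pro-moe.safetensors")  # hypothetical filename
    for name, t in tensors.items():
        print(name, tuple(t.shape), t.dtype)  # just arrays of numbers

    # Contrast: torch.load() on a legacy pickle-based .bin checkpoint CAN
    # execute code embedded in the pickle stream, so the realistic attack
    # surface is the loader/runtime and legacy formats, not the weights.

So if there's anything to worry about, it's the inference code you run and the format you load, not the numbers themselves.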
No, it's actually worse. Approximately three seconds after you install the model in offline mode on your computer, a small detector van will come and park outside your door with an antenna on the roof, and relay your position to a Chinese ICBM for immediate targeting.
Sorry, sounds like total bullshit. The weights aren't going to do anything. And if you are worried about the code, with current deployment practices of curl | sudo bash there is much lower-hanging fruit out there. That's not even mentioning the possibility of running the model on a PC without internet access (no matter how good the new Chinese AI is, it's still not good enough to convince you to let it out of the box).
Don't give it MCP then (and I struggle to understand why anyone would give a stochastic model such access, even if it is trained on very American, NSA-certified hardware approved by Sam Altman himself).
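For anyone wondering what "giving it MCP" even looks like: MCP (Model Context Protocol) is the plumbing that hands a model real tool access. A toy sketch using the Python SDK's FastMCP helper (server name and tool here are made up for illustration):

    # A toy MCP server exposing a local-file tool to whatever model connects.
    # This is exactly the kind of access the comment above warns against.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("demo-tools")  # hypothetical server name

    @mcp.tool()
    def read_file(path: str) -> str:
        """Return the contents of a local file -- real access, real risk."""
        with open(path) as f:
            return f.read()

    if __name__ == "__main__":
        mcp.run()  # serves over stdio by default

The point being: the model only touches your filesystem if you wire up something like this yourself. Don't wire it up, and the weights can't do anything.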
The same thing that breaking any license does. If you do it in your basement, nothing, by definition. If you incorporate it into a service or distribute it as part of a project, well, then you're on the hook (and that is what license holders tend to care about).
I called him out in another thread. It makes absolutely no sense. He is contradicting himself, judging by his comments.
To answer your question, he modified my comment (see the parentheses):
"> The point is, who gives a damn about (doing an illegal thing) in reality, on their (private property where nobody is likely to see that)?"
So... at best, what he said is purely theoretical. He admitted it himself: "nobody is likely to see that". I'm not sure I agree with that, but then again, in reality, no one gives a fuck, at least not in Europe.
There are likely multiple potential issues here, but one specific example: processing and storing PII without consent or authorisation is not allowed, regardless of whether you do it for yourself or for others. And you can't guarantee that this model does not contain private information hoovered up by accident.