I don't know if we can extrapolate, but I can imagine AI inference on our desktops for $500 in a few years...
https://github.com/lyogavin/airllm
However, as you said, it will be far slower.