Those are the only two lines of thought? Don't be INSANE. Where the hell do people get this shit from, bad sci-fi movies? I'm sorry, I don't really mean you're actually stupid or anything, it's just.... HOW ON EARTH could anyone with a passing familiarity with computer science possibly arrive at this sort of expectation? I don't understand!!
The powers of a GOD? What does that even mean? No AI is going to be able to go "Let there be light!" and make there be light. Heck, no AI is going to be able to go "I will hack into this camera and spy on you!" without either spending the requisite CPU-hours to crack the passwords or encryption protecting it, or analyzing all its attack surface for weaknesses like a hacker. Computational complexity is REAL, P does not and never will equal NP (we just don't know how to prove it yet), and there are real physical limits on the computing power that you can fit inside a given volume and its energy budget.
AT ITS BEST, an AI will have the same powers as a civilization of humans working together using computers the old-fashioned way, only faster.
Well, P/NP really has almost no bearing on this problem. That is a theoretical question, and even if P = NP, the algorithm could have a ginormous constant or degree. Conversely, even if P ≠ NP, the problem might be very easy to solve at human timescales with advanced enough algorithms/processing speed.
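To put rough numbers on that (a minimal sketch with purely illustrative constants, assuming a machine doing about 10^12 operations per second):

```python
# Illustrative only: made-up constants, just to show why P vs. NP
# says little about wall-clock time at human-relevant input sizes.

def poly_with_huge_constant(n):
    # Hypothetical "in P" algorithm: O(n^3) steps, but with a 1e20 constant.
    return 1e20 * n**3

def exponential_but_small_inputs(n):
    # Hypothetical "not in P" algorithm: O(2^n) steps, tiny constant.
    return 1e3 * 2**n

OPS_PER_SECOND = 1e12  # assumed hardware speed, roughly a teraflop

for n in (10, 30, 50):
    poly_secs = poly_with_huge_constant(n) / OPS_PER_SECOND
    exp_secs = exponential_but_small_inputs(n) / OPS_PER_SECOND
    print(f"n={n:>2}: 'polynomial' {poly_secs:.3g} s, 'exponential' {exp_secs:.3g} s")
```

At n = 10 the "fast in theory" algorithm needs thousands of years while the "slow in theory" one finishes in about a microsecond; the complexity-class labels only tell you what happens as n grows without bound.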
P/NP has direct bearing on (a) how easy it is for an AI entity to HACK ALL THE INTERNETS that are accessible to it but protected with (NP) cryptography and stuff, and (b) how easy it is for an AI entity to design the next generation of itself in advance of this SINGULARITY APOCALYPSE I keep hearing about (and designing a better computer is probably a problem in NP as well, to say nothing of manufacturing concerns).
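The brute-force arithmetic behind (a) is easy to sketch; the guesses-per-second figure below is an assumption (a very generous one), but the conclusion barely moves because the keyspace is exponential:

```python
# Rough brute-force cost estimate for symmetric keys.
# GUESSES_PER_SECOND is an assumed figure; adjust it to taste --
# the answer barely changes because the keyspace doubles with every bit.

GUESSES_PER_SECOND = 1e15   # very generous assumption
SECONDS_PER_YEAR = 3.15e7

for key_bits in (56, 80, 128, 256):
    keyspace = 2 ** key_bits
    years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{key_bits:>3}-bit key: ~{years:.2e} years to exhaust the keyspace")
```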
And yet this is how people seem to view it (even looking at the "serious" conversations on singularityhub and similar places). Their view of AI has this real idiot/god dichotomy going on, dramatized so that it only takes seconds from "it's just a dumb computer" to "it's become self-aware" to "it's taken over the entire internet and built an army of robots!"
Thinking about it, it probably comes from our general perceptions (based on conventional software development) that computers will either not do something at all, or they'll do it blindingly fast. And even people who work with computers often don't really grok the difference between multiplying two big matrices, and pondering the best way to approach an unsolved problem.
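One way to make that difference tangible, as a rough sketch (sizes picked purely for illustration, and timings will vary by machine):

```python
# Sketch: the same machine that does a ~2-billion-op matrix multiply in a
# blink grinds on a tiny combinatorial search.
import itertools
import random
import time

import numpy as np

# 1) Multiply two 1000x1000 matrices: roughly 2 billion floating-point operations.
a = np.random.rand(1000, 1000)
b = np.random.rand(1000, 1000)
t0 = time.perf_counter()
_ = a @ b
print(f"1000x1000 matrix multiply: {time.perf_counter() - t0:.3f} s")

# 2) Brute-force the shortest tour over just 10 cities: 10! = 3,628,800 orderings.
#    (Typically tens of seconds in pure CPython.)
n = 10
dist = [[random.random() for _ in range(n)] for _ in range(n)]
t0 = time.perf_counter()
best = min(
    sum(dist[p[i]][p[i + 1]] for i in range(n - 1)) + dist[p[-1]][p[0]]
    for p in itertools.permutations(range(n))
)
print(f"10-city brute-force tour: {time.perf_counter() - t0:.1f} s "
      f"(each extra city multiplies this by roughly the number of cities)")
```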
An AI will be able to learn everything civilization knows. It will be able to work on problems, without rest, thousands of times faster than the entire human race.
Quick example: in order to get the passwords, this is what an AI can do.
1: Build tiny robots, something resembling a fruit fly.
2: The robot fruit fly waits for the appropriate person to use their password.
3: See what password was used.
4: Mission accomplished.
This is just one way. I'm sure the AI can figure out simpler ways to get the passwords. Tell me this isn't godlike. It is like arguing that we'll never go to the moon.
If evolution was able to invent intelligence, so can we. And it will be a god. It is only a matter of time.
Why will an artificial general intelligence automatically be "thousands of times faster than the entire human race"? Are you assuming that this theoretical AI algorithm is trivial in terms of computational complexity?
I'm assuming we can just throw more hardware at the problem. I expect strong AI to be highly parallelizable, which will make it quite easy to scale up.
Yes, I think it is trivial. At least it will be in hindsight. The brain exists; it is proof that AI is possible, just as birds were proof that flight was possible. We just need to discover the easiest way to implement it.
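Worth noting on the "just throw more hardware at it" point: Amdahl's law caps the speedup from parallel hardware at 1/s, where s is the fraction of the work that is inherently serial. The serial fractions in this sketch are made-up examples, not measurements of anything:

```python
# Amdahl's law: speedup(N) = 1 / (s + (1 - s) / N)
# where s is the serial (non-parallelizable) fraction of the work
# and N is the number of processors.

def amdahl_speedup(serial_fraction, processors):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

for s in (0.01, 0.05, 0.25):
    speedups = [amdahl_speedup(s, n) for n in (10, 100, 10_000, 1_000_000)]
    print(f"serial fraction {s:.0%}: " + ", ".join(f"{x:,.1f}x" for x in speedups))
    # Even with a million processors, speedup is capped near 1/s.
```

If even 1% of what the AI does can't be parallelized, a million processors buy at most about a 100x speedup.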
Human brains are only so powerful. There are thoughts that literally cannot happen in a human brain due to size and speed limitations, and thus can never be thought by humans. AIs would have a much higher information density, approaching the theoretical limits for our universe.
This alone is enough to reconsider the argument you are making.
While an AI wouldn't have the power to fundamentally change the universe or defy computational complexity -- what they could do would be near enough to godlike in comparison to humans that such a fact barely matters.
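Some of those limits are at least easy to state. One example is Landauer's bound, the minimum energy kT·ln 2 needed to erase one bit at temperature T; current hardware sits many orders of magnitude above it, which is part of why there is so much headroom. A quick sketch (the 100 W comparison figure is just an assumed budget):

```python
# Landauer's bound: minimum energy to erase one bit is k * T * ln(2).
import math

BOLTZMANN = 1.380649e-23   # J/K (exact, by SI definition)
ROOM_TEMP = 300.0          # kelvin

energy_per_bit = BOLTZMANN * ROOM_TEMP * math.log(2)  # ~2.9e-21 J
print(f"Landauer limit at {ROOM_TEMP:.0f} K: {energy_per_bit:.3e} J per bit erased")

# Assumed comparison: a 100 W machine running for one second (100 J) could,
# in principle, erase about this many bits at the Landauer limit.
budget_joules = 100.0
print(f"100 J budget: ~{budget_joules / energy_per_bit:.2e} bit erasures at the limit")
```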