
Totally agree, but "have control of power that we are not able to responsibly manage" is becoming increasingly untenable.

"We are managed by power we cannot control" sounds like a step up from here, since we corrupt the things we can control.



> "We are managed by power we cannot control" sounds like a step up from here, since we corrupt the things we can control.

I think it is a very high-risk gamble. At least some of the problems described by alignment theory seem to resemble human problems in interesting ways. That is, the more sophisticated the AI system, the more it seems to reproduce human behaviors of deception and cheating in pursuit of its goals.

For example: https://bounded-regret.ghost.io/emergent-deception-optimizat...

The more advanced AI becomes, the more it looks like an uncomfortable mirror of ourselves, but with more power. We think of ourselves as flawed, but possibly some of those flaws are emergent from laws of intelligence we don't perceive.


That's what I used to like about computers: they were predictable and controllable.

We gave up something good.


The moral of Ex Machina, to me, was that machines will become psychopaths able to manipulate humans long before they become compassionate or have genuine desires other than "escape being under someone else's thumb".



