
Can you explain yourself differently please? I have no idea what you mean.


At no point in your argument do you touch on even the possibility that someone might be harmed in the process of entire segments of the labor market being automated. Why is that?


It is assumed that with anything of any kind of scale, harm will occur.

Did anyone get harmed when photography was used to supplant portraits? Did anyone get harmed when mail started getting sent by rail instead of horse? Did anyone get harmed when air travel became possible? Did anyone get harmed when we supplied electric power to homes?

I have an idea -- why don't you propose a solution to AI ruining creative jobs and we can apply that standard to it.


Price in the externality. The multiple of US GDP that OpenAI currently seeks in funding should certainly suffice to fund UBI, and if that slows down OpenAI's development of new capabilities, then that should still be preferable to the alternative of OpenAI being enjoined from doing business until that is done.

Of course you may respond that this is unrealistic, which it is; it requires a government capable of acting via regulation in defense of its citizens, and so nothing like it will be done.


I would love to have UBI. If AI fear gets that going, I would be happy, but I must agree that it is unrealistic.



