
> [...] remember the point I was responding to, namely, Eliezer should be ignored because he has no academic credentials.

That's not the full claim you were responding to.

You were responding to me, and I was arguing that Yudkowsky not only lacks academic credentials but also has no background in the field he claims to be an expert in: he self-publishes, is not peer-reviewed by mainstream AI researchers or the scientific community, and has no practical AI achievements either.

So it's not just a lack of academic credentials; there are also no achievements in the field he claims to research. Together, those facts paint a damning picture of Yudkowsky.

To be honest, he seems like a sci-fi author who took himself too seriously. He writes sci-fi; he's not a scientist.

OK, but other scientists do regard him as a scientist, or at least an expert on AI. Stephen Wolfram, for example, recently sat down with Eliezer for a four-hour interview about AI, during which Wolfram refers to a previous (in-person) conversation the two had and says he hopes they can have another in-person conversation in the future:

https://www.youtube.com/watch?v=xjH2B_sE_RQ

His book _Rationality: A-Z_ is widely admired, including by people you would concede are machine-learning researchers: https://www.lesswrong.com/rationality

Anyway, this thread began as an answer to a question about the community of tens of thousands of people that has no better name than "the rationalists". I didn't want to get into a long conversation about Eliezer, though I'm willing to continue conversing about the rationalists, or about the proposition that AI is a potent extinction risk — a proposition taken seriously by many people besides Eliezer.

