
As expected, AI is great for stuff that's been done a million times. It will slow you down on anything even slightly novel.


99% of coding is doing something that's been done a million times and gluing it together in a novel way.


for now


For all of time, as long as machine learning = AI.

If it's not in the dataset, the AI won't handle it correctly (unless it's trivial and a linear model is good enough, but then why even use AI?).
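To make the "not in the dataset" point concrete, here's a minimal sketch (assuming scikit-learn; the setup is illustrative, not from the thread): a random forest flatlines outside its training range, while an ordinary linear model extrapolates the trivial y = 2x rule correctly.

    # Sketch: tree ensembles can't extrapolate past the data they saw,
    # while a linear model nails a trivial linear rule.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression

    X_train = np.arange(0, 10, 0.5).reshape(-1, 1)  # seen: x in [0, 10)
    y_train = 2 * X_train.ravel()                   # trivial rule: y = 2x

    X_test = np.array([[20.0], [50.0]])             # unseen: far outside the training range

    forest = RandomForestRegressor(random_state=0).fit(X_train, y_train)
    linear = LinearRegression().fit(X_train, y_train)

    print(forest.predict(X_test))  # stuck near ~19, the largest y seen in training
    print(linear.predict(X_test))  # ~[40. 100.], correct extrapolation

The forest caps out near the maximum training target because each tree can only return values it has already seen, which is the failure mode being described.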


How long until that statement is false? You seem to have something in mind.



