
> Understanding the difference between O(1), O(n) etc is essential for literally everyone who writes code.

No it isn’t.

It’s a great thing to learn and understand, and essential for designing and maintaining data-intensive systems, but your statement simply isn’t true.




Sure, you don't strictly need it to write working code. But you will very quickly run into situations where you're writing unnecessarily slow code because you don't know what you're doing.

To me, it's essential.
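A minimal sketch of the kind of trap being described (function names are illustrative, not from the thread): deduplicating a sequence with a list versus a set. Both versions are correct, but the list version does an O(n) membership scan per element, making the whole loop O(n²), while the set version averages O(1) per check.

```python
def dedupe_slow(items):
    """O(n^2): `x in seen` scans the whole list on every iteration."""
    seen = []
    out = []
    for x in items:
        if x not in seen:   # linear scan, O(n)
            seen.append(x)
            out.append(x)
    return out

def dedupe_fast(items):
    """O(n) average: `x in seen` is a hash lookup on a set."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:   # hash lookup, O(1) average
            seen.add(x)
            out.append(x)
    return out
```

Both return the same result; the difference only shows up as the input grows, which is exactly why it's easy to ship the slow one without noticing.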



