Hacker News

Wildly different goals, given that there should be some interesting reason to avoid GC nowadays.



How are they "wildly" different? I can see different, but "wildly", really?

GC is an implementation detail with some performance characteristics. Nim can turn its GC off. It can do a soft-realtime GC behavior where you limit its maximum time slice.
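For reference, here is roughly what that soft-realtime mode looks like with Nim's default (refc) GC, per Nim's GC documentation — a sketch, and the exact flags and pause granularity vary by Nim version:

```nim
# Compile with realtime GC support enabled, e.g.:
#   nim c -d:useRealtimeGC app.nim
# (flag name per Nim's GC docs; check your Nim version)

GC_setMaxPause(100)  # ask the GC to cap each pause at ~100 microseconds

proc frame() =
  # ... do the latency-sensitive work for this frame ...
  # then explicitly hand the GC a bounded time slice:
  GC_step(us = 50)
```

The point is that collection work happens in slices you schedule, instead of at arbitrary allocation points.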


> Nim can turn its GC off.

At the expense of losing memory safety.


An interesting reason to avoid GC would be a belief that you could make a usable general-purpose language without GC.


Or having code that's callable from another GC-ed language like Ruby/Python/etc. Two GCs dancing around each other (e.g. Python calling Nim) is a recipe for problems.
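The usual way to keep the two GCs out of each other's way is to expose Nim behind a plain C ABI, so each runtime only manages its own allocations. A minimal sketch, using Nim's `exportc`/`dynlib` pragmas and the generated `NimMain` runtime initializer (details per the Nim manual; the Python side shown in comments is an assumption about how a caller would load it):

```nim
# Build as a shared library, e.g.:
#   nim c --app:lib -o:libfib.so fib.nim

proc fib(n: int): int {.exportc, dynlib.} =
  # Pure C-ABI entry point: takes and returns plain ints,
  # so Python's GC never sees Nim-managed memory.
  if n < 2: n else: fib(n - 1) + fib(n - 2)

# A Python caller must initialize Nim's runtime (and its GC) once
# before calling anything else, roughly:
#   lib = ctypes.CDLL("./libfib.so")
#   lib.NimMain()
#   lib.fib(10)
```

As long as only plain values (or explicitly copied buffers) cross the boundary, neither collector ever traces the other's heap.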



