Hacker News

I wonder how much it's changing the learning curve vs just making the experience more comfortable.

>For someone who spent a decade as a “Ruby developer,” becoming a multi-language developer in less than a year feels revolutionary.

Revolutionary? They've let slip that they have no frame of reference to make that claim. It would have taken "less than a year" with or without AI. They just spent 10 years not trying.

Everyone's first language learning experience is also learning to program. Learning a new language once you have years of professional programming practice is completely different.



Same here. Reading the article, I could not really relate to the experience of being a single-language developer for 10 years.

In my early days, I identified strongly with my chosen programming language, but people way more experienced than me taught me that a programming language is a tool, and that this approach is akin to saying "well, I don't know about those pliers, I am a hammerer."

My personal feeling from working across a wide range of programming languages is that it expands your horizons massively (in ways that are hard to describe), and I'm happy that I did it.


Good analogy IMHO. Knowing whether a given language is a tool vs a toolbox is important.


The idiosyncrasies of Ruby, like Perl and JavaScript, lead to a certain kind of brain damage that make it difficult to build correct mental models of computing that can then generalize to other languages.


Unless you’re writing instructions for a Turing machine, the impedance mismatch between the real world and “computing” guarantees idiosyncrasies. You don’t have to like a language to understand its design goals and trade-offs. There are some very popular languages with constraints or designs that I feel are absurd, redundant, or counterproductive, but I cannot think of a (mainstream) language where I haven’t seen someone much smarter than me do amazing things.

The language I consider the lamest, biggest impediment to learning computer science is used by some of the smartest people on the planet to build amazing things.


Although I disagree with your opinion (and who cares?), your comment reminded me of Wimp Lo.

https://m.youtube.com/watch?v=d696t3yALAY


This is profoundly racist.


Thank you, but I wasn’t going for racist.

What you may have missed, from the perspective of your vertically scaled horse, is that you compare learning certain models to a mental disability. It makes calling my comment racist similar to the whole pot/kettle thing.

However, I do appreciate reading about such opinions because it offers a peek into the elitism that surrounds programming languages.

Also, as a person from a non-traditional and non-privileged background, I'm a little unsure about how to proceed. Shall we cut our losses and move on?


> Thank you LMAOOOOO I love the response, it's similar to "Thanks for noticing" B A S E D


How is that video racist? Calling it racist appears to be much more racist.


From my own personal experience I'd add Visual Basic to that list.


I never understood the hate. Beyond the unusual syntax, it's not terribly different from a language such as Pascal. It's an old imperative language without too much magic (beyond some odd syntactic sugar).


Terrible language, excellent tools.


<< It would have taken "less than a year" with or without AI. They just spent 10 years not trying.

I suppose we can mark this statement as technically true. I can only attest to my experience using 4o for Python mini-projects (popular, so lots of functional code to train on).

The thing I found is that without it, all the interesting little curveballs I encountered likely would have thrown a serious wrench into the process (yesterday, it was Unraid's specific way of handling a VM's XML). All of a sudden, I am not learning how to program but learning how qemu actually works, yet it is a lot more seamless than having to explore it 'on my own'. And that little detour took half a day when all was said and done. There was another little detour with Docker (again, Unraid-specific issues), but all was overcome, because now I had 4o to guide me.

It is scary, because it can work and work well (even when correcting for randomness). FWIW, my first language was BASIC, way back when.


Lots of people just went the traditional way of learning things from first principles. So you don't suddenly learn Docker; you learn how virtualization works. And it's easy because you already know how computer hardware works and its relation to the OS. And that networking course was done years ago, so you have no issue talking about bridges and routing. It's an incremental way of learning, and before you realize it, you're studying distributed algorithms.


Eh, it works in the abstract, when you are intentional about your long-term learning path, but I tend to (and here I think a lot of people are the same way) be more reactive and less intentional, which in practice means that if I run into a problem I don't enroll in a course; I do what I can with the resources available. It is a different approach, and both have uses.

Incremental is obviously the ideal version, especially from a long-term perspective, if the plan for it is decent, but it is simply not always as useful in the real world.

Not to search very far: I can no longer spend more than a day pursuing random threads (or intentional ones, for that matter).

I guess what I am saying is: learning from first principles is a good idea if you can do it that way. And no to the Docker example; you learn how things should work. When playing in the real world, you quickly find out there are interesting edge cases, exceptions, and issues galore. Knowing how things should work only gets you so far.


My philosophy is something like GTD, where you have tasks, projects, and areas of responsibility. Tasks are the here and now, and they’re akin to the snippets of information you have to digest.

Projects have a more long-term objective. What’s important is the consistency and alignment of the individual tasks. In learning terms, that may be a book, a library’s docs, some codebase. The most essential aspect is that they have an end condition.

Areas are just things that you have to do or take care of. The end condition is not fully set. In learning terms, these are my interests, like drawing or computer tech. As long as something interesting pops up, I will consume it.

It’s very rare for me to have to learn stuff without notice. Most will fall under an objective or something that I was pursuing for a while.


> They just spent 10 years not trying.

Being in the zone of proximal development will do that: https://en.wikipedia.org/wiki/Zone_of_proximal_development

We shouldn't dismiss high friction problems.



