It is widely claimed that learning a new language will improve your skills in languages you already know, and I've never heard anyone contradict this.
But I've never noticed this in my experience. Moreover, whenever I come back to a language after any time away from it, I'm amazed how much time I spend remembering (or looking up) superficial differences.
Stuff like looking up whether I get the length of a vector with "len" or "length" (for a language I already knew) just feels like a distraction.
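To make that concrete, here's the kind of trivial lookup I mean (Python, with the equivalents I inevitably have to double-check shown in comments):

    # Getting the length of a collection: trivially different everywhere.
    xs = [1, 2, 3]
    n = len(xs)            # Python: a built-in function
    # Ruby / JavaScript:   xs.length
    # Haskell:             length xs
    # PHP:                 count($xs)
    print(n)  # 3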
One thing I'd suggest is to learn a vastly different language. If you use Ruby all day, learning Python probably won't give you the type of insight that learning C would. Or learning a Lisp.
So much of it is looking at things in a different way, and a quick way to do that is to pick something from across the metaphorical language town rather than the language next door.
Exactly. Although I've found it helpful to learn similar languages too (i.e. more than one scripting language). Seeing how subtle differences in the languages play out (beyond the superficial len/length type differences) has helped me understand some of the more subtle concepts of computer science. That said, you probably get the most bang-for-the-buck by picking up a language with a fundamentally different programming paradigm or at a lower or higher level than what you already know.
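One sketch of the kind of subtle difference I mean (in Python; the names here are just my toy example): Python closures capture variables, not values, which trips people up coming from languages that create a fresh binding per iteration.

    # Python closures capture the variable i itself, not its value at
    # creation time, so all three callbacks see the final value of i.
    callbacks = [lambda: i for i in range(3)]
    print([f() for f in callbacks])   # [2, 2, 2]

    # Binding i as a default argument captures each value instead,
    # much like a fresh block parameter in Ruby's 3.times { |i| ... }.
    callbacks = [lambda i=i: i for i in range(3)]
    print([f() for f in callbacks])   # [0, 1, 2]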
Languages exist to solve certain classes of problems. PHP, for example, is primarily designed as a web language. Learning ANY language designed for other uses will teach the developer something new. Once you have learned something like Python or Ruby, learning C would probably be the next logical step.
Are you venturing outside of the realm of C/Java descendants?
The nature of how those languages model execution, their basic paradigms, data structures and design patterns all fall within a fairly narrow scope. Stepping outside those trappings lets you see them with a bit more clarity, because you're forced to stretch your brain to accommodate something altogether new, and you gain a fresh perspective on the problems the languages are solving and how they solve them.
You won't necessarily gain any perspective if you're writing the same exact kind of code in languages X and Y, and you can gain perspective by writing an entirely different kind of code in a language you're used to. It just tends to follow that when you learn a new language, you'll learn to do things in the style of that language.
I spend most of my time writing Python (for scientific programming).
I knew Java, C++, MATLAB, and Perl before I learned Python.
A couple of years ago I learned R... I still use it a few times a week. I don't think it's improved my Python programming.
More recently I learned Clojure and JavaScript. I think those count as meaningfully different from Python. I program differently in those languages, but I'm skeptical they've made me a better Python programmer.
> I'm skeptical they've made me a better Python programmer.
Maybe it hasn't. The whole idea is basically acquiring tacit knowledge and understanding through osmosis; that's why it's hard to quantify how it should work.
Try learning Haskell. The concepts I discovered while learning it were very influential in my approach to other languages, even if I don't use Haskell itself all the time.
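To give a flavor of what carried over for me (a made-up but representative sketch in Python): small pure functions glued together with composition, instead of loops mutating state.

    from functools import reduce

    def compose(*fns):
        # Right-to-left function composition, like Haskell's (.) operator.
        return reduce(lambda f, g: lambda x: f(g(x)), fns)

    # Haskell:  total = sum . map (^2) . filter even
    evens  = lambda xs: [x for x in xs if x % 2 == 0]
    square = lambda xs: [x * x for x in xs]
    total  = compose(sum, square, evens)

    print(total(range(10)))  # 120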
Lookups for syntax and the like are very annoying. However, learning new languages is all about the principles you pick up.
I'm a better iOS developer today because I learned from Ruby how awesome blocks can be. The same goes for metaprogramming.
Languages usually have features that span everything you do in that language. Getting comfortable with blocks in a language that treats them as a primary/native data type showed me how useful they can be in a language where they're more cumbersome.
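In Python terms (a sketch of the idea, not the actual Objective-C I write; retry and on_failure are made-up names), it's just behavior passed around as a value:

    def retry(times, action, on_failure):
        # Run action up to `times` times, reporting each failure.
        for attempt in range(1, times + 1):
            try:
                return action()
            except Exception as exc:
                on_failure(attempt, exc)
        raise RuntimeError("all attempts failed")

    # Callers inject behavior instead of writing delegate boilerplate,
    # which is exactly the win blocks gave me on iOS.
    result = retry(
        3,
        action=lambda: 42,  # stand-in for a flaky network call
        on_failure=lambda attempt, exc: print("attempt", attempt, "failed:", exc),
    )
    print(result)  # 42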
That's how I've benefited from learning multiple languages. That, and the fact that I now avoid Java/.NET because they and I don't agree! :D
I think the benefit of learning a new language is only felt when you are at the expert level in that language. For instance, I'm an expert at Python. If I were to learn Ruby at a novice level, I would get nothing out of it. I need to learn Ruby to the expert level to get the full benefit of knowing another language.
That makes a lot of sense... though the people telling you to "learn language y to become better at language x" never tell us they mean "become an expert in y."
And suggesting a 1000 hour investment is a lot different than suggesting a 100 hour investment.
How about getting into the habit of writing tests? First, they're code themselves. Second, it really forces you to think about interfaces and coupling.
Third, it's a nice habit to have if you work on a larger project.
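A minimal sketch of what I mean (pytest-style; parse_price is a made-up example): the test pins down the interface, one string in, one Decimal out, with no hidden coupling to wherever the string came from.

    from decimal import Decimal

    def parse_price(text):
        # Parse a price string like "$19.99" into a Decimal.
        return Decimal(text.lstrip("$"))

    # Run with: pytest this_file.py
    def test_parse_price_accepts_currency_symbol():
        assert parse_price("$19.99") == Decimal("19.99")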
Through my work at Mozilla, I regularly contribute to a large number of open source projects. I do my best to follow my own advice, but I'm always working to do better. Sadly sometimes there just aren't enough hours in the day!
I don't think that qualifies. I think fixing bugs in open source is a good way to become a better programmer, but if you can fix those bugs you are a good programmer already. Fixing bugs in an open source project is really not trivial. I wouldn't know where to even begin looking at the source code of some of the open source software I use, Firefox for example.
I think starting your own open source project does more to improve your programming skills than contributing to existing open source projects.
Existing projects have most of the code already written. A typical open source changeset is maybe 3 +'s and 5 -'s. Contributing to open source projects will mostly improve your people/communication skills, because most of what you'll do is talk to existing developers to understand the problem and how to debug it.
Not really. It's good advice, BUT this book is targeted at wannabe PHP developers. Telling them to patch Apache is a very bad idea. He should have chosen phpBB or something like that.
You might think people are smart enough to understand that they can't just patch the most-used web server in the world like that, but hey, remember the Dunning–Kruger effect?
He says: "Working with others to get patches accepted and bugs closed is a crucial skill in writing good code"
This article does not tell people to patch Apache. The point, as far as I can tell, is to explore the projects you use so you can learn to write better code.
Then you say "they can't patch the most used webserver in the world like that"
What is "like that"? As far as I can tell it's "working with others". Although I've never patched Apache, I'm confident I'll need to work with others to do so.
The article gives good advice. I don't understand why you would jump to the Dunning–Kruger effect to seemingly assume that the readers are incapable of following the advice.
> This article does not tell people to patch Apache. The point, as far as I can tell, is to explore the projects you use so you can learn to write better code.
You're right. But saying something like
"I’ve never taken a hard look at the Apache source code before, but I have found a few bugs in Apache that I’d love to have fixed."
is like throwing a stick in front of a dog.
> The article gives good advice. I don't understand why you would jump to the Dunning–Kruger effect to seemingly assume that the readers are incapable of following the advice.
It's a book about PHP. Don't tell me about Composer, Symfony, or other things that 0.1% of PHP coders are using. PHP coders don't read, they cut'n'paste.
The article is about writing better code. My guess is that the PHP developers who do want to write better code would probably take the advice.
Is your point that nobody should ever suggest that PHP developers write better code?
People who could write better code in PHP have already learned how to write better code in other languages. I don't even see the point of this book. Learning to write good code with PHP looks to me like learning ornithology from Angry Birds.
I understand and generally agree with your point. However, if we're going to slice and dice, it should be pointed out that the Apache Software Foundation (founded in 1999) was named after the Apache web server (1995), not the other way around. So, if the context clearly suggests the web server, as it does here, I think he's OK.
I've become jaded with "top 5" style lists recently, but this one's pretty nice. I'm doing a few "secure development" training sessions with development teams next week, and I'll be referencing this for sure.
I'm not sure writing tests makes you a better developer... it gives you better habits, but I'm not sure it results in your code being better, compared to things like fixing bugs in others' code and learning a new language.
The "learn a new language" point holds true beyond software development: when I learned Spanish in high school, my understanding of some of the stranger forms of English (say, the past participle) improved.
I have to disagree with one point: that writing tests doesn't make you a better developer. Here's why:
Writing a test requires you to think about your code in ways beyond just creating it. TDD requires you to plan your code. And the more code you write, theoretically, the better your skills will be.
Now, will it make you the best developer? Probably not. I doubt writing tests is the most productive use of time. But for the practice of software development, writing tests is critical.
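Here's TDD in miniature (a toy example; slugify is made up): the test is written first and forces the plan, then the implementation follows.

    import unittest

    def slugify(title):
        # Written after the test below: lowercase, join words with dashes.
        return "-".join(title.lower().split())

    class SlugifyTest(unittest.TestCase):
        # Written first: this pinned down the behavior before any code existed.
        def test_title_becomes_dashed_lowercase_slug(self):
            self.assertEqual(slugify("Hello World Again"), "hello-world-again")

    if __name__ == "__main__":
        unittest.main()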
Also, write lots of code! If you're stuck with legacy code, start a hobby project that you think might be useful to someone (if you're not stuck with family too, that is; in that case there's really no hope).