Python squeezes out JavaScript, C as best starter programming language (itworld.com)
43 points by btimil on Jan 25, 2014 | 79 comments



I have taught an introductory course for programming in Python, and there certainly are a few areas that are less than ideal for beginners.

1) Setting up Python, a text editor and a shell is easy on Windows (using WinPython), but pretty confusing for Mac users.

2) It is actually surprisingly difficult to teach object oriented programming in Python, since most simple problems are much more easily solved in a functional manner than with classes.

3) It is more or less impossible to peek under the hood. There is no way to teach integer logic really (Python has auto-growing arbitrary precision integers) and floats can lead to strange errors (x=1; for i in range(10): x+=0.1 does not yield 2).

4) It can be very difficult for students to grasp the difference between different types and when or how they are to be used. I suspect that these kinds of problems might be more easily explained in C.

All that said, I feel that the students did learn quite a few things that would have been more or less impossible with C or Javascript. In many ways, Python provides a happy medium between full-on dynamicism and strictness. Four stars, would teach again.
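For what it's worth, the float pitfall from point 3 is easy to demonstrate at the prompt:

```python
# Repeated addition of 0.1 accumulates binary rounding error,
# so the result is close to, but not exactly, 2.
x = 1
for i in range(10):
    x += 0.1

print(x == 2)      # False
print(abs(x - 2))  # a tiny but nonzero error
```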


> 2) It is actually surprisingly difficult to teach object oriented programming in Python, since most simple problems are much more easily solved in a functional manner than with classes.

That's actually a good thing. I'm afraid too many people learn programming early on with e.g. Java and end up believing programming equates to defining classes and mutating objects. I've worked with people with such a background (actually, I routinely do), and it's not pretty. I notice a difficulty in these people to grasp fundamentals, like data structures, idempotency and higher-order functions, which leads to awful practices.


I completely agree. Still, OOP is an important tool to have in your belt. I would say though, that a difficulty to grasp data structures or higher order functions is pretty universal in beginners regardless of background. Those are just hard concepts.


Could you explain how setup is more difficult on a Mac than on Windows? Other than a decent beginner's text editor, everything you mentioned comes pre-installed on OS X. Trying to install 3rd-party Python libraries on OS X is kind of a pain, but you can go a long way with just the standard library (which will get you pretty far in an intro class). Finding a good text editor for Mac also isn't that difficult. When I've taught beginners on a Mac, I usually just download TextMate.


Yes, I'm interested in hearing about this as well. OS X comes not only w/ Python 2.7, but also easy_install. Getting IPython Notebook up and running is a single command:

sudo easy_install ipython[all]

Personally, I think IPython Notebook is a fantastic (the best?) tool for teaching Python, and being able to distribute/collect notebooks seems like an awesome thing for classrooms.

There are plenty of text editors, and if you must go the IDE route, IEP and Light Table are both free and cross platform (among lots of other options).


Python itself is no problem, that's true. But even installing simple modules leads to headache. You suddenly need to have XCode installed, and you get compiler errors, and you need to modify your PATH...

The lack of binary installers for Python/Mac packages really was an issue for my students.


Don't Emacs and Vim come preinstalled on OS X? Or has that changed with Mavericks?


Those are exceptionally poor choices for a beginner's class.


I disagree. Why do you say they are?


Most new students have seen a text editor before (like Notepad or Microsoft Word). They are used to things like using the arrow keys for navigation (instead of HJKL), not having to switch modes, and using the mouse to manipulate text. There are plenty of serviceable programmers' text editors which follow these idioms. Better to let students use something familiar so you can focus on the actual subject (programming) and not frustrate them with learning a completely different set of keyboard shortcuts.


When was the last Emacs that didn't use arrow keys by default? 1987 on a VAX? Ditto with mice and such. And it's modeless.

People just ape things they've heard without thinking. Emacs and Vim may not be perfect beginner's editors, but they are much better than just decent.


Because it's a class to learn programming, not a new text editor.


But they don't know any text editor. They have to learn something.


> 1) Setting up Python, a text editor and a shell is easy on Windows (using WinPython), but pretty confusing for Mac users.

How is this even possible? TextWrangler (or other choices) come from the App store, Terminal is right there and Python is already installed.


> introductory > object-oriented programming

Unless I was mandated to make my syllabus match the language reference's table of contents, I would defer most of object-oriented programming to a second semester. How to use particular objects from the standard library would likely be covered on an as-needed basis in an introductory class -- for example, "you use the syntax d.keys() to get a list of a dictionary's keys" without attempting to explain that "keys" is a function attribute of a class of which d is a member, and thus by MRO rules d.keys triggers a lookup of the "keys" member from the built-in dict class, and d will be passed as the first parameter when the function call operator denoted by the parentheses is executed... You may be tempted to include such detail for various reasons, but trust me, your students will thank you if you save that content for a more advanced course.

> There is no way to teach integer logic really (Python has auto-growing arbitrary precision integers)

You can still teach binary representation and use bit-wise operators.

Also, I once wrote a storage-efficient bit vector class with the built-in bytearray type, which is a good exercise to tie together object-oriented programming and binary representation. However, a student at the level of doing that exercise is no longer a beginner, but firmly intermediate or advanced.
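A minimal sketch of that kind of exercise (the class name and method names here are invented for illustration):

```python
class BitVector:
    """Storage-efficient bit vector: one bit per flag, packed into a bytearray."""

    def __init__(self, size):
        self.size = size
        self._bits = bytearray((size + 7) // 8)  # round up to whole bytes

    def set(self, i):
        self._bits[i // 8] |= 1 << (i % 8)

    def clear(self, i):
        self._bits[i // 8] &= ~(1 << (i % 8)) & 0xFF

    def get(self, i):
        return (self._bits[i // 8] >> (i % 8)) & 1
```

It exercises bitwise operators, integer division, and object-oriented design all at once.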

> floats can lead to strange errors

These are a natural consequence of the floating-point representation and this problem exists in each of the article's top three language choices.


Re floats:

True, this problem is universal. I still wish there was a way to circumvent it, much like the arbitrary precision integers circumvent integer problems. Rationals and fractional integers come to mind, though I don't know their performance characteristics.


Python has a decimal package in stdlib. It provides an arbitrary-precision floating point type. The performance will obviously be worse than with floats, but I doubt it matters in an intro programming class. The only real drawback is that there are no decimal literals.
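Since there are no literals, Decimal values are built from strings; a quick sketch, using the same kind of repeated addition that trips up binary floats:

```python
from decimal import Decimal

x = Decimal("1")
for i in range(10):
    x += Decimal("0.1")  # "0.1" is exact in base 10

print(x)       # 2.0
print(x == 2)  # True
```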

Actually, several languages I know of have decimal types in stdlib. Two that do not are C and JavaScript.


Python has the "fractions" module which provides a more powerful abstraction: It's not restricted to denominators that are powers of 10.

> I don't know their performance characteristics

It's easy to make some good guesses.

Since (I'm guessing) it's implemented with pairs of integers, adding probably uses

    a / b + c / d = (ad + bc) / (bd)
which is 3 multiplications and an addition, and multiplication probably uses

    (a / b) (c / d) = (ac) / (bd)
so only two multiplications. So it's a small constant factor times the cost of integer multiplication, plus a bit of overhead for the wrapper class.

So my back-of-the-envelope guess is that the fractions module is less than 10x slower than the same operation with integers, or fast enough for just about anything you do in a beginner level class.

These numbers will change significantly if fractions are automatically reduced by gcd of numerator and denominator; I'm not sure about this point.
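Both points are easy to check at the prompt: the arithmetic is exact, and Fraction does in fact reduce by the gcd when it is constructed:

```python
from fractions import Fraction

x = Fraction(1)
for i in range(10):
    x += Fraction(1, 10)  # exact rational arithmetic

print(x == 2)          # True
print(Fraction(2, 4))  # 1/2 -- automatically reduced by the gcd
```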


While obviously it's a fundamental thing to teach/cover the various data types (or at least how '1', 1, and 1.0 are different since they'll be encountering this over and over if they continue programming), on a practical level, frequent casting can be sidestepped by using string formatting when printing numeric output.

In all my production code (my string printing is not performance critical) I pretty much always use string substitution when printing as a matter of style/convenience. It's much easier to add/replace substitutions, and I often need formatting anyway.

If students get used to printing w/ string substitution, they can then simply use %d or %f to control how they want their numbers to display, which is probably the right way of thinking about rendering text output.
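A small example of that style (variable names invented for illustration):

```python
count = 7
ratio = 1 / 3.0  # float division (the .0 matters under Python 2)

# One format string controls how each value renders; no casting needed.
print("count: %d, ratio: %.2f" % (count, ratio))
# count: 7, ratio: 0.33
```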


> It is more or less impossible to peek under the hood. There is no way to teach integer logic really (Python has auto-growing arbitrary precision integers) and floats can lead to strange errors (x=1; for i in range(10): x+=0.1 does not yield 2).

I believe the same problem applies to C, and to any program that uses floating-point arithmetic. 0.1 cannot be represented exactly in a float; there is an incompatibility between a base-10 system and a base-2 system. This is true no matter the size of the type. The wider the type, the more precision you get, but you don't get more accuracy.


> 2) It is actually surprisingly difficult to teach object oriented programming in Python, since most simple problems are much more easily solved in a functional manner than with classes.

Do you have an example? From the outside, it doesn't seem to be that difficult. For example, in the C++ course I followed, the teacher introduced classes with the implementation of an RPG character. That was pretty simple.


While I don't know that it's difficult to teach OOP in the sense of 'objects exist' in Python, I do think it's more difficult to teach a lot of the concepts associated with OOP, because the way it's done in Python is so unusual (explicit self, no real encapsulation, etc., etc.).

In the introductory programming course I took, the way OOP was introduced involved writing incredibly un-idiomatic Python. So in that sense it works fine, but if you're going to ignore the 'spirit' of a language in order to use it as a teaching tool, it seems to me what language you choose doesn't really matter, so long as it has the features to shoe-horn in whatever style you want to teach people in.


I explained it using a bank account. That works well enough. It's just that I couldn't come up with many real-world problems and exercises that are naturally solved with classes.

Classes are a rather high-level abstraction, and most problems in introductory classes are decidedly low-level.
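For reference, the bank-account example is roughly this (names are illustrative):

```python
class BankAccount:
    """The classic introductory example: state plus methods that mutate it."""

    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
```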


Lately I'm way more excited about Julia than I've been about any language for years. My son is learning programming, and I was debating between starting him off with Ruby, Python, or Clojure. But I think Julia might just be it. Out of all the languages I've learned it definitely feels the most intuitive. As I read the docs, I constantly find myself saying "yes! this is how it should work!" way more than I did when I learned any of the aforementioned languages (except maybe Clojure).


I think the Python community is more mature in terms of the number and quality of applications and documentation. Julia is getting close to critical mass, but it's still "niche," while Python is firmly "mainstream."

I think it's important that a beginner's first "real" language is a mainstream language, due to the existence of libraries and a community which will help them through and beyond the "beginner" phase.

Clojure suffers from the same problem as Julia (not mainstream), and Ruby suffers from the same problem as JavaScript (most applications are tied to web development, which is such a Gordian knot of related technologies all tied together that it will simply overwhelm most beginners.)

EDIT: Seriously, parent was downvoted? Why?


Lack of community isn't a problem for him, since he has me to help him, and I'm proficient in all these languages. However you bring up a good point, I am somewhat concerned about lack of tooling and libraries.


> he has me to help him, and I'm proficient in all these languages

I think we have different assumptions about the goal of teaching your child programming. Your goal appears to be "create an activity where parent and child can spend time together." My goal (if I had children) would be "get them to be able to program independently as soon as possible," because IMHO that will ultimately both increase the amount they learn (because they'll start learning on their own time to scratch their own itch, just like mature hackers do, so the amount of learning your child does is no longer so dependent on the amount of time you spend teaching them) and increase the probability they'll stick with it for the long term (because being able to do their own projects and have their own ideas makes it much more attractive, especially as they head into the teenage years and start to establish their own identity).

Having an activity parent and child do together which increases emotional family bonds is great, but I think the more important goal is the transfer of knowledge, skills and the hacker mindset.

Disclaimer: I don't actually have kids, but if I did, the above would describe my philosophy.


Yes, it's a good option to pick a language with no powerful IDEs (static analysis and proper automated refactoring), as those are harder to move on from later (IntelliJ IDEA or Visual Studio vs. any typical Python/Ruby notepad). If you are using a basic text editor then you really have to learn the language, and have fun debugging.


I'm not so sure about that. One big conceptual hurdle for many beginners is the notion that a program is a linear sequence of steps, and that at any point in time during a program's execution there is a unique "current statement" [1] and "current state" (the contents of variables).

Single stepping through your program with a good visual debugger makes these concepts very clear.

[1] Of course this is not true for multi-threaded programs, but multi-threaded programs are well out of "beginner" territory.


While I think Julia is great, for children I have to say Python is better, especially in various subcategories that might keep a teenager interested: Games, 3D Animation (blender), Websites and Mobile Apps.


I am admittedly interested in trying out PyGame. Although it's only a matter of time until that kind of thing exists for Julia too.


Yes, but your kid might be an adult by the time they do.


Python certainly makes sense to new programmers, because the only thing that makes people wonder is the indentation. And with the good resources out there (books, Stack Overflow, Codecademy, Coursera, visual Python), Python tutorials have been getting better. The language also makes reading code quite easy.

On the other hand, JavaScript always seems to start with HTML and CSS. That's where the obstacle is. People begin playing around with three separate technologies at once, and the learning curve is really steep.

However, if you ask me "yeukhon, do you recommend freshmen start with C / C++?" I'd say absolutely. I am a bit biased since I knew a bit of C++ before I started Python (although my first official programming class in college was Python and Matlab... funny, right?), but I think the learning curve in statically typed languages like C++ is absolutely helpful in the long term.

If people want to learn Python without Hello World, try http://www.youtube.com/watch?v=RrPZza_vZ3w


I think that depends on how it's being taught. My cofounder learnt to code via python this year. He just did courses with my assistance along the way. Python is playful, and you feel like you're getting somewhere on your own. I'd wager that if he'd started with c (or c++!) it would have knocked the enthusiasm out of him. I learnt c early at university, but there you're dedicated to learning software full time.


These days you can start JavaScript with Node.js and create simple command line applications.


I agree with this. I currently teach beginning programming using JavaScript in HTML, and we currently just provide pre-made web pages for the students to write their basic programs in. They come in already knowing basic HTML as they've taken a previous class.

But I've been thinking of rewriting the curriculum to use Node.JS for about a year now. The only reason I haven't yet is that the students seem to enjoy doing things in web pages.

I think there's something awesome about writing console apps, but I don't know if they really hold the attention that today's students have.


True, but remember, if you google javascript or look around books in the book store, many are still teaching with web technology :) which is unfortunate (although the intent is great).


The asynchronous model of Node might be a bit hard to grok for beginners, though. (When I was in high school I tried writing ActionScript (which is basically JS with types) without really understanding event-based programming, and it was pretty surreal.)


To be a good developer you'll need to learn several languages, but that is beyond the scope of the article. But in terms of a "first" language, are those who learn python first and THEN C, Haskell and C# in the end more likely to be better developers than those who learn the same four languages in another order?

I think it's unlikely that the language you start with has much effect in the end. Those who never use a low-level language or never use a functional language will be worse developers than those who do, however.

So for a first language: just pick a fun one. If you happen to pick JS you can always choose again later.


My impression is that if you follow that order you are way more likely to get through to the end than if you followed any other order (except for switching Haskell and C#; I don't think that'd matter). But yes, if you did go through those languages in another order, there is no reason to think you'd know any less than somebody who followed the "easy" sequence.


I'd go with 3 first languages simultaneously, in 3 tracks, PL focused on programming concepts and constructs, OS focused on low-level interfaces, and SE focused on what the job market requires. From first to last in each track...

PL: Python, Prolog, Clojure, Haskell/Parsec

OS: C, assembler, Bash, regex/sed/awk

SE: Java, Javascript, HTML/CSS, Django


How about picking a focus, then a language? I see no point in "becoming a programmer". Want to make web apps? JS. Want systems? C. And so on.

Going for a "first" programming language seems to assume certain things:

* That you ARE going to become a programmer and that you will NOT change your mind/priorities

* That somehow whatever language you first pick will definitely be useful for your final goal. Sure, C will teach you all the ins and outs of manual memory management, but why go through it if all you really need is Processing?

* That there is some special point in time when you're supposed to learn how to deal with the grave dangers of the misplaced semicolon/memory leak/etc.

How many of those "learning to code" really become programmers and end up doing it for a living, or making habitual use of pointers or some other concept that a programmer must/must not deal with in their first programming language?


This kind of question, "what first?", can still make sense if you think of all of the vaguely technical tasks that many non-programmer/engineer types will encounter. Then, what's a good language to serve as a good enough tool if they go no further, and a good enough foundation if they do?

Python would serve that, especially given all the batteries that people can draw on. In fact, "batteries" should be part of that introductory course, so that you know to look for a battery first instead of jumping straight to writing it yourself.


But some people are actually interested in being competent in computer science and not just whipping out blog engines in the latest fad framework.


The main idea still applies. C and Scheme show very different aspects of computer science, so even those people will need to choose a focus.

If all you want to do is "become a programmer" you'll discover that you'll never get there, as there's always more to know. If you're asking yourself now what language to pick, you'll later ask what to learn next, and once you've done that, ask again. Asking what first language to pick is just the first manifestation of a potentially never-ending bad habit of never gaining focus. This applies to both web apps and computer science.


There's too few of us...


> How many of those "learning to code" really become programmers and end up doing it for a living, or making habitual use of pointers or some other concept that a programmer must/must not deal with in their first programming language?

I recently talked with someone from an employment agency that deals mainly with IT companies (I live in the UK). The gentleman stated that since I don't already work in an area of IT, my chances of getting into it are very slim. He also said that many people who work in IT often start in a company in an administration context or something similar. Since I work as an arborist, I have practically no chance.

I started learning C++ nearly two years ago, then switched to straight C. Most of my programs have very little practical use, but almost all the code that I write involves pointers, arrays of pointers, function pointers and void pointers. I would be stuck in the mud without having an address book.


That would make more sense if learning to program was easy, and learning a language, hard. But in practice it's exactly the other way around.


>“I see JavaScript as the modern equivalent of BASIC. ...because of its simplicity, it is a great language for learning fundamental programming concepts.”

So what's wrong with BASIC? It still exists...


Because people still use JavaScript


Sure, and Ruby squeezes out Python as a starter language because it's closer to English and the indentation doesn't have to be strict.

That said, I find both Ruby and Python considerably user-friendly and extremely powerful. JS of course is today's hot stuff and if you want to be on-top of the game, you have to know and understand JS. But once you've become comfortable with any other OO language JS is easy.


Strict indentation is a good thing.

A teacher can compensate for a non-strict compiler by being strict himself, but indentation is one of the most important concepts you should be teaching to beginners.


>“Python, because you do not need a compiler and it's very VERY easy to run your programs and test.”

That really depends on your definition of "compiler". If your definition is "translates a high level language into machine code" then the popular implementations don't do that. But the most popular implementation does pretty much act like a typical compiler except that it generates code for a VM, not a physical machine.
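You can watch that compilation step happen with the standard-library dis module, which disassembles the bytecode CPython generates for its VM:

```python
import dis

def add(a, b):
    return a + b

# Prints the VM instructions CPython compiled the function into
# (the exact opcodes vary between Python versions).
dis.dis(add)
```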

On a more on-topic note, these sorts of things are pretty pointless, because it's just the Emperor's nose thing, really.

"Nobody was permitted to see the Emperor of China, and the question was, What is the length of the Emperor of China's nose? To find out, you go all over the country asking people what they think the length of the Emperor of China's nose is, and you average it. And that would be very "accurate" because you averaged so many people. But it's no way to find anything out; when you have a very wide range of people who contribute without looking carefully at it, you don't improve your knowledge of the situation by averaging. "


I think the original meaning of that quote is more "I don't need an intermediate compiling/linking step before I run my program," not "Python doesn't need a compiler because it doesn't lex/parse/generate code." I agree it's an ambiguous and confusing statement though.


Where is Scheme? :'(


> :(

More like ((((.))))

But seriously, given how much effort the folks behind DrRacket have put into making programming accessible [1], it really makes you wonder if m-expressions are not such a bad idea...

[1]: http://en.wikipedia.org/wiki/Racket_features#Educational_Too...


It's funny how often I see people criticize lisp's syntax and write things that aren't even valid syntax like (() ()) or (()) (assuming they don't get autoquoted).


Fine, let's see your 'syntactically valid' LISP smilie .. and also one for 'tin ears' while you're at it. TIA!


Well, that was always one of Guido's goals if I am not mistaken.


I've been teaching programming to many people, for many years. The majority of my students are experienced programmers, but no small number are new to programming beyond very simple stuff. I've found that Python has a number of aspects that are ideal for first-time programmers:

- It's dynamically typed. Say what you want about static vs. dynamic typing, but this is one less thing that newbie programmers have to get right. There's no chance of an error when they write "i = 'abc'", as there would be in a language where i had been declared as an int.

- It's interactive. The fact that you can "play" with the language within the interactive shell is a huge selling point. IPython and the IPython Notebook are easy to get working, and for people to work with.

- Python's restricted command set and simple, regular syntax let you concentrate on ideas: Yes, many newbies to Python (and to programming in general) get confused by indentation, blocks, colons, and the like. But they're going to get confused by the syntax of nearly any language. Python has a simpler syntax than most other languages, meaning that there's less to learn, and less to remember. This lets the new programmer concentrate on the ideas that they're learning, or the implementation of what they're doing.

- It's cross platform. The fact that people can use Python on any computer they like is a big selling point.

- You can easily teach object-oriented and functional-style programming. Python is obviously object-oriented, but can also be used to introduce functional programming. In this way, you can expose programmers not only to multiple paradigms in Python, but also in other languages.

- You can use it for real applications. People are often surprised to discover that real-life applications are being written and used in this language that they're learning, which seems so simple.
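To illustrate the multiple-paradigms point, here is the same little aggregation written both ways (names invented for the example):

```python
from functools import reduce  # a built-in in Python 2; lives in functools on Python 3

# Object-oriented style: state held in an object, mutated by a method.
class Counter:
    def __init__(self):
        self.total = 0

    def add(self, n):
        self.total += n

c = Counter()
for n in [1, 2, 3]:
    c.add(n)

# Functional style: no mutation; a higher-order function folds the list.
total = reduce(lambda acc, n: acc + n, [1, 2, 3], 0)

print(c.total == total)  # True
```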

I'm sure that there are more reasons than these. But let's consider the alternatives that the article suggested:

- I would be hard-pressed to think of a worse first language than C. You want to introduce people to the concepts of programming, which means abstractions and high-level thinking. C forces you to think in terms of the computer and its memory, which is just the opposite. The fact that it's compiled to binary form, that you don't have an interactive C shell, and pointers are just three reasons why I think that C would be a very bad choice. Sure, everyone should learn C at some point -- although I often point out that I'm a much happier person since I moved to dynamic, high-level languages many years ago -- but if you want to teach the concepts of programming, C is going to require too much learning just to get simple things done.

- JavaScript has many good points for beginning programmers -- but the chief problem, in my mind, is the language's syntax, which is far too inconsistent and forgiving/flexible for newbies. I think that someone coming to JavaScript from Python will have a very easy time; the mapping of data types is fairly straightforward, and even the notion of passing functions is pretty easy to get. But the learning curve in JavaScript seems steeper to me than in Python, despite the obvious advantages of being able to work within a browser.


> There's no chance of an error when they say "i = 'abc'", if you've defined i to be an int.

Probably you mean no chance of a compiler/interpreter error, because the program almost surely will not do what the rookie meant, much to his/her surprise. That's an error in my book :)

I agree with the rest of the benefits of Python as a beginner language.


> You want to introduce people to the concepts of programming, which means abstractions and high-level thinking

Completely disagree, what you describe is more "Software Engineering" than "Programming".

To me, learning to program means learning to describe the way to solve a problem in fully comprehensive and exhaustive steps, with a very limited set of tools and primitives to work with (data, operations and flow control to begin with). You can do that in any language, so your choice is really up to convenience and context.

As the basic tools of programming become second nature and problems to tackle become larger, then it's time to move into SE, abstractions, organization and etc. You can still do that teaching in pretty much any language.


> Completely disagree, what you describe is more "Software Engineering" than "Programming".

I think that a huge proportion of programming is about understanding and creating abstractions. But hey, I was brainwashed by SICP (http://mitpress.mit.edu/sicp/) back in college.

> You can do that in any language, so your choice is really up to convenience and context.

Given two people of equal ability, and without any background in programming, I think that the person who learns Python will advance more quickly than the person who learns C. They will understand fundamental programming concepts sooner, and will be able to apply that knowledge sooner.

This doesn't mean that C is bad, but rather that it was designed to solve certain problems, while Python was designed to solve other problems -- among them, the ease with which people can learn and use the language.

Learning C requires that people understand the memory model of the computer, on top of everything else that you need to know to program, such as variables, conditionals, loops, and functions. Programming is hard enough for people to learn; by teaching them a language that lowers that threshold, they'll learn more in less time. They can always learn C down the road -- but if C is their first language, then they might (in my experience, from having spoken with many frustrated wannabe programmers) just give up on it.


Based on my own experience teaching people who knew absolutely nothing about programming, the biggest factor is visibility of results, i.e. making programs that do stuff you can see and touch (games are the best). Certainly, the immediacy and speed of iteration of Python and JavaScript trump that of C these days.

My choice today is actually JavaScript, partly because of ubiquity, partly because it's very easy to do something interactive and visually interesting (Canvas), and partly because its simplicity soon leads into its shortcomings. Python as a language is "batteries included", and I believe in the value of students bumping into barriers and limitations so they can properly value the reasoning behind solutions. In that sense, the value of C is in teaching students about lifetimes and ownership of resources.


> I would be hard-pressed to think of a worse first language than C. You want to introduce people to the concepts of programming, which means abstractions and high-level thinking. C forces you to think in terms of the computer and its memory, which is just the opposite.

People should learn two programming languages at the same time, a very high level one such as Python, and a very low level such as C. Combining the top-down and bottom-up approaches to learning, simultaneously, yields the best results for most people. While learning Python, someone can begin to look inside the abstractions and get a feel for algorithmic complexity and such stuff. While learning C, someone can see how to use it to build an OS such as Linux.


> People should learn two programming languages at the same time

I find this completely impractical. Programming is hard for people to learn. They're juggling all sorts of new concepts and ideas. It's hard for them to remember the terminology, the data structures, the ways in which functions work, or even when and if they should use functions. Teaching them two languages simultaneously would be far more confusing than helpful.

I'm all in favor of learning different approaches and different languages. But for someone without any background to learn two languages simultaneously would be making it more difficult, not less.


> People should learn two programming languages at the same time

I agree with this. I think it's a great way to understand how those new concepts and ideas work in two different universes. It's helped me separate and understand how core ideas and concepts work, and how they can work differently.


> I would be hard-pressed to think of a worse first language than C. You want to introduce people to the concepts of programming, which means abstractions and high-level thinking. C forces you to think in terms of the computer and its memory, which is just the opposite.

This is actually exactly why I like the idea of C as an introductory programming language. Most of the time it's very clear what you are doing; there is no magic. Mathematics also involves abstractions and high-level thinking, but you don't start out with calculus or linear algebra. You learn the basics, then you abstract them.


It is also not very verbose, most of the time (I'm looking at you, list comprehensions).


The use of list comprehensions is optional, and would probably not be covered in an introductory course anyway.


I wish they (my teachers back then) had Python, but instead we ended up learning COBOL :( Unforgiving syntax, and boring as hell.


Best for starting... You just need to remember to move on later!


Why do you have to move on? The one disadvantage to Python as a beginning language is that there are few strong motivations to move elsewhere...


Dynamic typing and its consequences. It's handy when you start and want to cut down the number of things you have to care about (safety).


I don't care about the "type safety" discussion. I have hardly heard this argument from serious Python users at all. You won't hear Python users bitching about bugs that could have been caught by some compiler magic. I do hear programmers bitching about other compilers and their anal retentiveness, though...

For me type safety just isn't an issue in practice. Guido also says, and rightly so, that every moderately large project, especially in a strongly typed language, eventually implements all sorts of mechanisms for dynamic typing, even if these mechanisms are then called something else.


Hardcore Python fans will always defend it. I am just asking you to think outside the box.

http://evanfarrer.blogspot.co.uk/2012/06/unit-testing-isnt-e...


Hardcore Fans of static typing will always attack our state of well being. We shouldn't feel this good about Python because we are "clearly wrong". Is that it?


And never stop moving...


How come most discussions about the merits of Python devolve into praising Julia?



