As if rating one's skills in a particular subject on a 1-5 scale with no context bears much meaning... it's so arbitrary as to be insignificant (though it could be a cool concept if not abused). It also depends on one's motivation for representing their skill level; jobseekers, for example, probably overestimate theirs.
It would be great to have a standardized rating system based on a few different tiers wrt a language/framework/etc.:
1 - Wrote some really basic code in the language
...
5 - Wrote some really, really advanced code in the language
On my resume, I have a "skills" section that basically lists technologies I've worked with that I'm interested in working with now (and some I haven't worked with), along with a "code-word" based description of how I'd rate myself. The "code-word" selection will beat most HR folks, but I include a "decoder-ring" style description in my cover letter of what I mean by those code words. Things like, "familiar: know at least basic syntax and can easily use all syntax with a reference, will likely need to use a reference for common libraries" and "expert: need documentation only for infrequently used or obscure library calls, comfortable writing a tokenizer and lexer for the language given the specification". I actually got the idea when I applied for a job way out of my league and got a response explaining exactly that. I asked for some feedback on my resume, figuring don't let a good thing go to waste, and the response was "years of experience in a language doesn't quantify your knowledge of the language." I was quite young at the time, and the approach I ended up with hasn't steered me wrong yet.
I'd put 5 at "wrote the language or contributed to the language". So 1-5 on Python, where 1 is completed a first tutorial book and written a non-trivial program, and 5 == Guido van Rossum / contributed to a release. 3-4 would be if you have written a compiler.
Since the scale is only 1-5, saying that only a few people in the world qualify as a 5, or even a 4, wastes 20% or even 40% of the expressiveness of the scale, which makes it pretty useless. It's better to round in the other direction and think of the scale as logarithmic. If most people in the world won't have any ratings in the 4-5 range, the scale is no good for comparing your own proficiency across the languages you do know, which is the entire point of the exercise, not comparing yourself to the people who designed the language. The interesting thing she's trying to express is how much better she knows JavaScript than Ruby, not how good she is compared to Brendan Eich or Matz.
I mean, how advanced can you be in a scripting language? There is no memory management, no low-level systems knowledge required. Can you give me an example of writing really, really advanced code in Ruby or Python?
Managing memory and using low-level system functions is only advanced when the problem at hand requires it. Otherwise it's yak-shaving. It would be like digging a pool with a spoon because you're too bad-ass for shovels.
You've been downvoted a decent amount, and I can see why, but I believe it is unfair because it's only due to your perspective.
At one job I ended up having a lot of work writing language bindings from C to what is commonly considered a "scripting language" (TCL), so it's easy to have to deal with memory management and low level systems knowledge in relation to a scripting language (knowing the semantics of memory management within a language drastically changes how you have to write the bindings). However, the idea is to rate your skill with that particular language. No matter what the most advanced state is of a language, a 5/5 (or whatever the scale is) means that you can accomplish the most advanced task possible in the language. That doesn't mean that someone who is, say, 5/5 in Perl, will have the same skillset and capabilities as someone who is a 5/5 in AT&T syntax assembler. However, there is a list of base assumptions about the skills they do have that may align with what we need in a developer when looking at resumes.
"5/5 (or whatever the scale is) means that you can accomplish the most advanced task possible in the language"
I appreciate your sympathy. I have no problem with people who are born into scripting languages. Personally I started with C and C++ and worked on a few scripting languages. My question came more out of curiosity. There is more of a learning curve in low-level languages because the very structure of your code can affect the memory footprint of your programs, and that requires careful crafting of data structures and algorithms. At a scripting-language level you are in a virtual "world". It does allow you to SOLVE pretty advanced problems, but it does not necessarily translate into expertise in the scripting language itself. For example, I might use Ruby to create a true AI system, but in the end it's the algorithm that mattered, not my knowledge of the language.
I don't know how much effort you put into learning high-level languages, but people coming from C tend to keep writing "C" in scripting languages as well.
While that can work, they miss out on all the cool and more advanced stuff these languages have to offer.
People downvoting: this guy is admitting to his ignorance and actively and constructively asking to be challenged on it. I don't think that's the kind of comment we want to downvote here, even if the tone makes you bristle.
Also, you may have to reevaluate your definition of a scripting language. I will try to guess as to what it could mean to you currently:
A script language is interpreted, a non-script language is compiled
First we have to define "interpreted", as it could mean many things itself. The most restrictive vision of interpreted is a language that takes one line (or enough to form an understandable command), eval()s it (which means parsing the line, executing it and changing some internal state) and then proceeds to the next one. There's a second case: languages that parse the whole source into a tree (precisely an AST) and proceed to evaluate it. A compiled language will first parse the code into an AST, then transform each node of the tree into a set of smaller instructions, encoded as bytes. The resulting bytes are called bytecode and can either be native or executed on a virtual machine (which translates them to native code). Fewer and fewer languages fall into the first case; PHP3 and earlier, older JavaScript, Perl 5 and Ruby 1.8 (MRI) are in the second one; PHP4+, Perl 6, Python, Ruby 1.9 (YARV), modern JavaScript, Java and C# are in the last one with a VM; C, C++, D, Go and Objective-C are in the last one as native code.
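As a small illustration (assuming CPython 3; the exact opcode names vary by version), the standard dis module makes that bytecode step visible:

    import dis

    def add(a, b):
        return a + b

    # CPython has already compiled this function to bytecode for its VM;
    # dis just disassembles the code object (opcodes like LOAD_FAST, RETURN_VALUE).
    dis.dis(add)

    # compile() runs the same source -> AST -> bytecode pipeline on a string.
    code = compile("x = 1 + 2", "<string>", "exec")
    print(code.co_code)  # the raw bytecode as bytes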
A script language has limited tools, a non-script language has a sizable standard library
Take for example (ba)sh: it is really a glue language that controls flow and calls external programs. Those are called shells. As a convenience and for performance, shells often include in their own code implementations of previously (or currently) external programs (e.g. test, aliased to [), or allow you to control or use features specific to the shell. Those are called builtins.
Now compare the size of the C/C++ stdlib with, e.g., Python, Ruby or Java. The latter are an order of magnitude bigger than standard C and C++.
A script language has no external library facilities, a non-script language has third party library facilities
The only things resembling library features in bash are sourcing an external file and executing an external program (which is native enough to extend the language itself, since command-calling is first-class in shell languages). On the contrary, Python has extremely advanced library facilities called modules and packages, which create namespaces that you can selectively import. Ruby is simpler and arguably less advanced, as it relies on 'require' and 'load' functions that trigger loading and interpretation of a file, while namespacing lies in the hands of the developer, who manually nests classes and modules. This is similar to the 'source' feature of bash, but also to the #include preprocessor directive of C, which is not even really part of the compiler and literally stitches the content of a file into another. Usually this #include is used to pull in so-called header files that describe prototypes of functions lying in a library.

What's interesting there is that the library feature is actually not even part of the language, but of the infrastructure surrounding the compiler, precisely the linker. Indeed, compiling to native code results in object files which are totally independent of the actual language and totally dependent on ABI calling conventions (which are really unrelated to the language). This way you can link objects built from Fortran, or C, or C++, or whatever. So it turns out C's #include is actually closer to bash's 'source' in that regard, with the onus of library management resting not on the compiler but on the linker.
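To make the contrast concrete (a minimal sketch using only the standard library), Python imports bind namespace objects rather than splicing text the way bash 'source' or C #include do:

    import importlib

    # Importing by name yields a module object, i.e. a namespace you can pass around.
    json_mod = importlib.import_module("json")
    print(json_mod.dumps({"ok": True}))

    # Or pull a single name into the local namespace, optionally renamed.
    from json import dumps as to_json
    print(to_json([1, 2, 3]))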
A script language has no types, a non-script language has types
Bash actually has types, precisely strings and arrays, and it's up to each program to parse the strings into something meaningful. Now what you may be distinguishing there is weak typing vs strong typing. Let's take PHP, which when given "3"+2 spits out 5 (or "5", I can't recall). Try that in C and you will get an error/warning/core dump (ironically, '3'+2 in C would give both a warning and '5' because of ASCII and char really being a byte). Yet try that in Python and Ruby and you will get an error (an exception, precisely). PHP is weakly typed and Python, C and Ruby are strongly typed.
Now maybe it's the absence of type declarations and function/method signatures that makes something a scripting language. Really what's at play here is static vs dynamic typing. Python and Ruby are dynamically typed, while C is statically typed. But Objective-C is dynamically typed too.
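A quick sketch of those two distinctions in Python (strongly but dynamically typed):

    x = "3"
    try:
        print(x + 2)           # strong typing: no silent coercion like PHP's "3"+2
    except TypeError as e:
        print("refused:", e)

    print(int(x) + 2)          # explicit conversion is required: prints 5
    x = 42                     # dynamic typing: the same name can now hold an int
    print(x + 2)               # prints 44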
A script language is used to write scripts
Maybe you encountered #!/bin/sh in scripts, and also #!/usr/bin/python, and concluded 'Ha! They're scripting languages!'. Amusingly enough, it's quite easy to build a thin wrapper around gcc that makes it possible to start a file with #!/usr/bin/c and subsequently write code in C. Does that make C a scripting language? Maybe, but then it's equally easy to make any language a scripting language.
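For what it's worth, here is a rough sketch of such a wrapper, written in Python for brevity (the /usr/bin/c path is hypothetical, and it just shells out to whatever gcc is on PATH): install it as /usr/bin/c, mark it executable, and a C file starting with #!/usr/bin/c becomes "runnable".

    #!/usr/bin/env python3
    # Hypothetical /usr/bin/c: compile-and-run wrapper that lets a C file act as a "script".
    import os, subprocess, sys, tempfile

    src = sys.argv[1]
    with open(src) as f:
        lines = f.readlines()
    if lines and lines[0].startswith("#!"):   # drop the shebang so gcc sees plain C
        lines = lines[1:]

    with tempfile.TemporaryDirectory() as tmp:
        c_file = os.path.join(tmp, "prog.c")
        binary = os.path.join(tmp, "prog")
        with open(c_file, "w") as f:
            f.writelines(lines)
        subprocess.run(["gcc", c_file, "-o", binary], check=True)
        sys.exit(subprocess.run([binary] + sys.argv[2:]).returncode)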
A script language is not written in itself, a non-script language is written in itself.
This is called self-hosting. You could argue that C is written in C, while Bash, Python and Ruby are written in C. Well, too bad: D is written in C, C# is written in C and C++, Java is written in C, and even g++, the C++ compiler, is written in C (for each of these, of course, part of the standard library is written in the language itself). At the same time, Python has PyPy, which is able to produce native code straight from Python code, and various other languages are self-hosted. Now you could argue that we're using C because of performance, but that's not even true, since PyPy regularly outperforms CPython. In fact we often use C only because a tremendous amount of work has been thrown into C compilers (notably regarding conversion of code to each native platform), so it's merely by convenience that we reuse them.
A script language has no memory management, a non-script language is low-level
So, C and C++ have manual memory management, while Python does not. So much for Java and C#, which would become scripting languages by that criterion. Also, as for low-level work, Python can use things like mmap, and has ctypes, which lets you tap into system devices (via e.g. /dev) and native functions like malloc and free (which, as mentioned above, may or may not have been written in C, since at that point they're just native code respecting a calling convention; if anything, such code could have been generated by PyPy). So you can go low-level in Python if you wish.
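For instance (a minimal sketch; the C library lookup is platform-dependent), ctypes lets you call malloc and free directly from Python:

    import ctypes, ctypes.util

    libc = ctypes.CDLL(ctypes.util.find_library("c"))   # load the C runtime
    libc.malloc.restype = ctypes.c_void_p
    libc.malloc.argtypes = [ctypes.c_size_t]
    libc.free.argtypes = [ctypes.c_void_p]

    buf = libc.malloc(64)        # a raw, unmanaged 64-byte allocation
    ctypes.memset(buf, 0, 64)    # touch the memory directly
    libc.free(buf)               # and release it ourselves, C-style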
So I think we have made quite a round-up of things, and hopefully enough to demonstrate that, while Python and Ruby can effectively be used (and quite efficiently so) to write scripts, they are clearly not just "scripting languages", but full-blown, extremely advanced and potent programming languages.
Great roundup. It's nice to see all these points. One big difference between scripting and non-scripting languages is that in scripting languages you can modify the structure of the program at runtime, while in compiled languages you are stuck once you compile. C++ does have metaprogramming through the use of templates, and those programs look immensely complicated, but it can't introduce new data types and logic at runtime. I kinda dislike C++ beyond templates anyway, as the code looks super ugly.
The ability to modify the program at runtime (and elegantly) is a huge advantage over compiled languages and allows you to express a new category of solutions.
Programs that change themselves are, in my opinion, pretty advanced.
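A tiny sketch of what that looks like in Python (the make_model helper is just an illustration): classes and their behavior can be created while the program is running.

    # Build a brand-new class at runtime with type(); nothing here existed at import time.
    def make_model(name, fields):
        def __init__(self, **kwargs):
            for field in fields:
                setattr(self, field, kwargs.get(field))
        return type(name, (object,), {"__init__": __init__, "fields": fields})

    User = make_model("User", ["id", "email"])
    u = User(id=1, email="a@example.com")
    print(type(u).__name__, u.email)        # -> User a@example.com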
So it seems Python and Ruby allow the programmer to free the mind from low-level housekeeping, focus 100% on logical thinking, and gain incredible expressiveness. I would buy that.
I wonder how often an above-average Python/Ruby programmer uses their metaprogramming / reflection capabilities?
I once had a professor teaching a course on elegance in software design proclaim "Python is not a real language". The course is taught in Java. I cringed.
How about PyPy? It's a Python JIT compiler written in Python.
There are thousands of other examples. Managing memory and low-level systems work is not the difficult part, it's just tedious; algorithms and mapping a problem correctly are the difficult part.
the way i think of my language skills in my own head is mostly a three-level system: read, write, and debug.
at the first level, you know enough to be able to look at existing code and have some idea what's going on. (this is me with C++.)
at the second level, you know enough to write new code with real functionality and have some chance of its working. (this is me with C.)
at the third level, you have a fairly thorough understanding of most parts of the language and know idioms, common pitfalls, etc., and can fix other people's code. (this is me with q, and it was me with java seven years ago when i was working in java.)
if you want to extend this, an extra level up could be "hack", where you contribute to the language itself--modify gcc or the python interpreter or whatever. (i aspire to get to this level in q, and i think i'm close.)
not sure how to wedge a fifth level into the system....