I certainly cannot speak for Mr. Ritchie, but when someone says "it can't be done" to me, it gives me even more incentive to prove them wrong. If nobody had been bothering him with "it can't be done," maybe it would never have been done?
How many times throughout history have we heard this? I think this is probably the most important ingredient in any huge discovery: people refusing to believe, or simply not knowing, that it "can't be done".
(Of course, the trick is knowing when something really can't be done, e.g. alchemy.)
Alchemy can be done. Gold has been synthesized on the nuclear level by way of neutron bombardment of other elements. It's just not economically practical since the process costs more in equipment and energy than the value of the gold.
"Alchemy" predates meaningful definition of the word "chemical"! If you had presented them the modern solution, they would not have been so inclined to split hairs. They were looking for any way, regardless of whether it was "chemical" or "nuclear" or magic for all they cared.
Right, and finding another way is how the "can do" person wins over the "can't" naysayers. It's like saying Jules Verne's cannon to the moon is impossible... which is true, but not really the point, since rocket travel to the moon has been proven.
My point is that you can make any prediction 'come true' if you redefine enough words in it. This is a dumb parlor trick and tells us nothing about how people of the past actually thought.
And yet, without all the efforts that went into alchemy (including Newton) we might have been delayed getting chemistry among other fields.
(I've been reading Gleick's biography of Newton, and it makes a point about the impact that the "fruitless" alchemy had on modern science... the real mistake of the alchemists was being secretive about their methods and results.)
all before C. Personally, I greatly appreciate the terseness of C and its closeness to assembly, and I believe it all reflects Ritchie's good taste, but he still didn't do anything "impossible" from my perspective.
where Knuth wrote an ALGOL compiler for Burroughs in 1960, working 40 hours a week in violation of Caltech's policy limiting the number of hours a Ph.D. candidate could work.
> to compete with custom code written for just that hardware.
And that is true. You couldn't run the Burroughs 5000 Master Control Program on a PDP-11, as far as I know, at all, and certainly not efficiently.
Ten years before the birth of C was about 1963. ALGOL-60 lacked a lot of things to compete with C, which is why people in the 1970s switched to C. ALGOL-68 required garbage collection, which wouldn't be reasonably efficient until the invention of generational GC in 1983, and still poses problems for real-time software like device drivers, which is why people usually use C or C++ instead of ALGOL-68 or Java for them.
For some notes on what ALGOL-60 lacked in comparison with C, I recommend reading "Why Pascal is Not My Favorite Programming Language", from 1981: http://www.lysator.liu.se/c/bwk-on-pascal.html. Most of the objections to Pascal also apply to its ancestor ALGOL-60, although ALGOL was intended for real work, not just teaching. ALGOL-60 did have a sort of equivalent of pointer parameters, namely Jensen's Device, but it was dramatically less efficient than pointers. I think (I wasn't around in the 1960s) that ALGOL implementations from different vendors had different nonportable extensions to work around the lack of pointers, and of course the same thing happened with Pascal later.
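To make the Jensen's Device comparison concrete, here's a rough sketch of my own (all names made up) of the idea expressed in C terms. ALGOL-60 passed the term expression by name, so it was re-evaluated on every use with the current index, much like a macro argument; but the compiler implemented that re-evaluation as a hidden "thunk" procedure call, which is where the inefficiency relative to a plain pointer comes from.

#include <stdio.h>

/* Rough, hypothetical sketch: the TERM argument is re-evaluated on every
   iteration with the current value of i, imitating ALGOL-60 call-by-name
   as used in Jensen's Device. All names here are made up. */
#define SUM(i, lo, hi, TERM, result)          \
    do {                                      \
        double _acc = 0.0;                    \
        for ((i) = (lo); (i) <= (hi); (i)++)  \
            _acc += (TERM);                   \
        (result) = _acc;                      \
    } while (0)

int main(void)
{
    double x[5] = {1, 2, 3, 4, 5};
    int i;
    double s;
    SUM(i, 0, 4, x[i] * x[i], s);  /* sum of squares of x: 55 */
    printf("%g\n", s);
    return 0;
}

The macro expands inline, so it costs nothing extra; the ALGOL-60 version paid a procedure call for every use of the by-name parameter.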
Burroughs was an immensely innovative company, and all of today's popular programming languages except C, C++, and PHP (JS, Java, C#, Python, Perl, Ruby, Objective-C) owe an enormous debt to Smalltalk, which draws much of its inspiration from the B5000. I do not deprecate the importance of Burroughs! They did indeed write the first OS in a high-level language. But C achieved what they could not.
The measure of C's achievement is that in 2011, on a multiprocessor 32-bit machine with the x86 instruction set and a gigabyte of RAM, I still occasionally run programs originally written in 1981 for a single-processor 16-bit PDP-11 with 64k of RAM per program, and they're still efficient; and I still constantly run software written in the late 1980s, such as parts of the X server and much of GNU coreutils, on single-processor 32-bit 68000s, and much of the optimization done then is still valid. (Though not all of it!)
It was C that first made it practical for people on different architectures to share code on a large scale, for code to outlive the architectures it was written for without suffering a dramatic slowdown, and for people to switch architectures, as Sun did from the 68000-family Sun3 to the SPARC, as everyone eventually did to the i386 we use today, and as we now are to AMD64.
The SNOBOL Implementation Language, SIL, achieved the same thing in 1966 — but only for one program, the SNOBOL interpreter! TeX was another early endeavor in this direction; concurrent with the early evolution of C, Knuth wrote WEB, a literate programming language which compiled into Pascal, in order to get Pascal's portability without suffering from its drawbacks. Among other things, WEB used a single humongous Pascal packed array of char for all of its strings; and that was what Knuth wrote TeX and Metafont in. C's twin sibling Ratfor was a third similar approach, compiling C-like constructs into Fortran rather than assembly. I don't know of anything else that predated the popularity of C.
In effect, C enabled both the birth of the SPARC and its death.
Today we stand at or near the end of the C era, for three reasons.
First, a vast amount of software is being written for which efficiency is of minimal concern. So C's drawbacks — its proneness to subtle bugs, its difficulty of debugging, its limited facilities for abstraction — drive people to more modern languages.
Second, dynamic recompilation has reached a level of performance where it's feasible to use it to emulate another processor architecture at acceptable speeds. Transmeta was one of the most interesting explorations of this concept, but Apple's transition strategy from the PowerPC to the i386 probably had more field-deployed units, and I think some of the currently popular OS virtualization approaches work this way as well. (And of course there's Valgrind, although calling its speed "acceptable" is a bit of a stretch.) So now there are viable alternatives to the recompilation approach.
Third, both computer architectures and compiler optimizations have changed so much in the 40 or so years since C was invented that C is stretched rather thin. Compilers exploiting undefined behavior make it increasingly difficult to write working code in C, or to recompile old C programs. And, although people largely program GPUs in C, you cannot simply recompile Emacs for your Fermi or your Spartan-II to get it to run faster or use less energy. The C abstract model of computation is an increasingly poor fit to modern hardware, despite the pressure it has exerted on hardware designs for the last 25 years.
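To make the undefined-behavior point concrete, here's a small example of my own (not from the article): signed overflow is undefined, so an optimizer is allowed to assume it never happens and delete the very check that was meant to detect it.

#include <limits.h>
#include <stdio.h>

/* Illustrative example (mine): an overflow check that relies on signed
   wraparound, which is undefined behavior in C. An optimizing compiler
   may assume x + 1 never overflows and fold this function to "return 0". */
int will_increment_overflow(int x)
{
    return x + 1 < x;
}

int main(void)
{
    /* With many compilers at -O2 this prints 0 even for INT_MAX,
       which is exactly the "old code stops working" problem. */
    printf("%d\n", will_increment_overflow(INT_MAX));
    return 0;
}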
I'm not mistaken; it's just that you appear not to have experience with FORTRAN and Pascal.
> There was no such thing as a general-purpose program that was both portable across a variety of hardware and also efficient enough to compete with custom code written for just that hardware.
Of course ALGOL easily matches this claim. Proof: Knuth's TeX was written in Pascal.
Not to mention everything that was written in FORTRAN.
> I still occasionally run programs originally written in 1981 for a single-processor 16-bit PDP-11 with 64k of RAM per program, and they're still efficient
Try to check when most of the FORTRAN libraries still used today were written. Some -- decades before 1980.
See also how long FORTRAN compilers generated faster scientific code than C -- for decades after C appeared. You can also find why:
> "Why Pascal is Not My Favorite Programming Language"
The paper, as far as I remember, doesn't claim it's impossible to write useful software; it mainly laments the lack of return, break, and continue constructs, which are convenient to have but not real show-stoppers. Note also that standard Pascal was only ever meant to be a "learner's language," not a "system language" -- and for system languages there were working compilers and uses before C.
> You couldn't run the Burroughs 5000 Master Control Program on a PDP-11, as far as I know, at all, and certainly not efficiently.
You'd also have to port Unix the same way you had to port MCP -- you can think of MCP as Unix in which drivers are part of the kernel -- but "integer" in ALGOL is certainly as portable as "int" in Unix -- they both fit the "hardware word."
Nicer pointer arithmetic in C is a good point. Though note that Wirth also had a "Pascal lite" with pointers, closer to assembly, for OS-level stuff, just as Burroughs had an ALGOL lite for OS-level stuff. The concept of languages that are "closer to assembly" yet still higher-level certainly existed well before C. The idea of separating the kernel and the drivers is something else, and I don't know who achieved that first, or when.
Finally, see Google's Go -- it's more or less an acceptance of Wirth's direction, mixed with the terser notation of C. (Now, finally, why are you still on my lawn?)
> The C abstract model of computation is an increasingly poor fit to modern hardware
No, you can view C (as well as more or less all ALGOL descendants) as a higher-level representation that is acceptably close to assembly. So as long as there are CPUs that execute machine code and we need to care about the details (see http://news.ycombinator.com/item?id=3068513), we'll need something like that: a representation that is efficient but not too low-level.
(Note: I continued writing my original comment after you posted yours.)
I don't have much experience with Fortran and Pascal. But the original article also talks about how C was different from Fortran: "Fortran did okay for array-oriented number-crunching code, but nobody could do it for general-purpose code such as what you’d use to build just about anything down to, oh, say, an operating system."
And that covers most of the FORTRAN libraries still used today.
Both Fortran and Pascal came close — ADVENTURE may be one of Fortran's greatest triumphs — but they were not really usable by themselves. Software Tools was written in Ratfor (as I said, C's sibling) and TeX was written in WEB, rather than Pascal. (Writing a text-processing program in a strongly-typed language without a string type is an exercise in frustration.)
> you can think of MCP as Unix in which drivers are part of the kernel
I'm not sure what you're trying to say here. Drivers are part of the kernel in Unix too.
> but "integer" in ALGOL is certainly as portable as "int" in Unix -- they both fit the "hardware word."
But ALGOL-60 doesn't have bitwise operations, bit shifts, unsigned arithmetic, char, or more than one precision of floating-point. Instead you get ↑, exponentiation. So many algorithms that can be expressed efficiently in C cannot be expressed efficiently in ALGOL-60.
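As a concrete illustration of the kind of inner loop I mean (my own example, in entirely ordinary C), here is a bit-counting routine: shifts, masks, and unsigned arithmetic, none of which ALGOL-60 gives you portably.

#include <stdint.h>
#include <stdio.h>

/* Illustrative example (mine): counting set bits with shifts and masks
   on an unsigned type. Straightforward in C; no portable ALGOL-60
   equivalent, since the language has no bitwise operators or unsigned
   arithmetic. */
static unsigned popcount32(uint32_t x)
{
    unsigned n = 0;
    while (x) {
        n += x & 1u;  /* test the low bit */
        x >>= 1;      /* logical shift right, well defined for unsigned */
    }
    return n;
}

int main(void)
{
    printf("%u\n", popcount32(0xF0F0F0F0u));  /* prints 16 */
    return 0;
}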
> Wirth also had "Pascal lite" with pointers
Where can I learn more? Googling [Pascal lite] is not helpful.
> The concepts of "closer to assembly" but higher-level languages existed certainly well before C.
Certainly true — BCPL could be described as one of them. But BCPL didn't quite get it right.
> Finally, see Google's go -- it's more or less acceptance of Wirth's directions
What does Golang have in common with Oberon (?) that it doesn't have with C?
A lot! Garbage collection, no unsafe pointer arithmetic, and type specifications written after the variable name (varname: type) rather than before it.
> ALGOL-60 doesn't have bitwise operations
Ah... That's like saying that Unix IV didn't have a driver for your network card. It just wasn't standard, only because CPU instruction sets were not standard enough. Real-life compilers had it:
FORTRAN had it, C didn't care initially. And FORTRAN remained faster for a long time.
> Where can I learn more?
Actually, my bad, sorry: the "lite," "closer to the system" language came even before Pascal, and obviously before C; it was a bootstrap language for ALGOL W in the sixties. See the sources:
> A lot! Garbage collection, no unsafe pointer arithmetic, type specification different from variable use (varname: type)
Hmm, I'll give you the last one, although they left out the colon. The others are common to basically all high-level languages, so I don't really think of them as due to Wirth's influence. To my eye, the interesting aspects of Golang are interfaces, slices, and goroutines, none of which are present or even hinted at in Oberon. Interfaces were kind of anticipated in OCaml, slices in D, and goroutines in a family of CSP-derived languages going back to 1980.
> > ALGOL-60 doesn't have bitwise operations
> It was just not standard only because CPU instruction sets were not standard enough. [ALGOL-68]
Well, on one hand, it wouldn't be very useful to try to do bitwise AND on a decimal machine. But the original claim is that, prior to C, general-purpose (i.e. not purely numerical!) programs gained so much speed by being written nonportably that portable versions could not compete, and C enabled high-performance programs to be written portably. Your original rebuttal, as I read it, was that 10 years prior to C (i.e. in 1963) ALGOL had already achieved this.
We can stipulate, I hope, that bitwise operations are crucial for the inner loops of a lot of important algorithms.
Now, it appears that you're saying that not only had ALGOL not achieved this in 1963, but that it was impossible for any language to achieve it in 1963 because CPUs were too disparate, but that ALGOL-68, whose first usable implementations were concurrent with the first usable implementations of C, still didn't standardize those operations, so you still couldn't write portable programs that used them! (Although you could write programs for one or another ALGOL compiler that used them.)
I think you have proved the point of the original article rather than rebutting it.
> FORTRAN had [more than one precision of floating-point], C didn't care initially. And FORTRAN remained faster for long.
For numerical code, yes. But I was talking about the inadequacies of ALGOL-60, not Fortran (which is still faster, as you alluded to earlier). C's limited support for single-precision floating point was a sore point for decades, but not supporting it at all, as standard ALGOL-60 didn't, is much worse. It doubles the size of all your arrays! That's much worse than simply doubling or quadrupling your run-time, as C could; you can almost always run the program for twice as long, but you can only rarely double the core at its disposal.
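To put a number on the array-size point, here's a trivial sketch of my own, assuming the usual 4-byte float and 8-byte double:

#include <stdio.h>

/* Trivial illustration (mine): on a typical implementation with 4-byte
   float and 8-byte double, being forced to use double everywhere
   doubles the memory footprint of every numeric array. */
int main(void)
{
    enum { N = 1000000 };
    printf("float  array: %zu bytes\n", N * sizeof(float));   /* typically 4,000,000 */
    printf("double array: %zu bytes\n", N * sizeof(double));  /* typically 8,000,000 */
    return 0;
}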
> sorry, "lite" "closer to the system" language was even before Pascal and obviously before C, it was a bootstrap language for ALGOL W in sixties, see sources:
That code is written in PL360, and although, yes, it has bitwise operations in it, nearly every line of it contains assumptions that it's running on a 32-bit computer (such as the 360) and about which CPU instructions set which CPU flags, with gems like "R0 := FLAGS(I) AND #80000000; IF ¬= THEN ...". It's pretty strong evidence that, in 1966, even Niklaus Wirth thought he had to write nonportable code — essentially assembly language with ALGOL syntax — in order to get acceptable performance.
He explicitly rejected FORTRAN, and he claims he didn't have an ALGOL-60 compiler available.
> And then Pascal was also written in Pascal after it was bootstrapped once.
I've certainly seen Pascal compilers written in Pascal, but the ones I've seen were concurrent with the development of C or later. I don't suppose you have one in mind?
> It's pretty strong evidence that, in 1966, even Niklaus Wirth thought he had to write nonportable code — essentially assembly language with ALGOL syntax — in order to get acceptable performance.
I'd say, however, that that 1966 code is not so far away from C at all. Today I can also use registers in my big C compiler; heck, I have to for really serious optimization. Just like then. C is also not automatically portable unless active care is taken to test it on other platforms and rewrite parts of it -- if you claim the opposite, I have for you some 2 million lines of code that I maintain after at least 40 people worked on it -- and it's not an exception, more a typical example.
Yeah, you're right, PL360 is pretty similar to C, but it failed to achieve what C achieved: providing just enough of an abstraction from the concrete machine to make efficient portable software possible, and in fact even practical.
As far as register optimization in modern C, I think there's a world of difference between saying "register long a, b, c, d, e;" and saying "WHILE W ¬= #1 DO BEGIN R1 := MEM(W); R0 := R1 SHRL 24 SHLL 12 + T; MEM(W) := R0; W := R1 and #FFFFFF; END". But maybe you were talking about inline assembly.
My experience with C portability agrees with yours, although it sounds like yours is deeper.
Nothing earth-shaking. The _Generic keyword is probably the most exotic addition (at least among those listed on the Wikipedia page) -- it's basically switch(typeof(X)) for macros.
Is there a reason C11 doesn't include typeof? It's been a gcc extension for decades now, and I see no reason it can't be part of the official standard.
My guess as someone who is not familiar with compiler internals is that the design of many compilers prevents them from implementing a typeof() operator without major changes to the way they analyze and store the parsed source code, so the standard writers decided not to include a feature that would not be widely implemented.
But, using _Generic(), one can do many of the things one would do with typeof(). For example, the max() example in [0] could probably be implemented using at most n^2 separate macros plus a _Generic() macro for each type, where n is the number of arithmetic types (though I think you'd still need GCC's statements within expressions in order to evaluate A and B only once).
I'm thinking something like this that combines the cbrt(X) example from the C1x spec [1] (§6.5.1.1 paragraph 5) with the gcc max() example, though I haven't tested it, and it could almost certainly be simplified by exploiting type promotion rules:
#define max_ld_ld(A, B) ({long double _a = (A); \
long double _b = (B); _a > _b ? _a : _b; })
#define maxld(A, B) _Generic((B), \
long double: max_ld_ld, \
int: max_ld_i, \
/* etc. */ \
)((A), (B))
#define maxi(A, B) _Generic((B), \
long double: max_i_ld, \
int: max_i_i, \
/* etc. */ \
)((A), (B))
#define max(A, B) _Generic((A), \
long double: maxld, \
int: maxi, \
/* etc. */ \
)((A), (B))
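One wrinkle, though: _Generic selects between expressions, and a function-like macro name that isn't immediately followed by "(" at that point won't be expanded, so the selected names really need to be functions (or other expressions). Here's a stripped-down, compilable variant for just int and double, assuming a C11 compiler; using real functions also gets you single evaluation of A and B without statement expressions:

#include <stdio.h>

/* Stripped-down, compilable sketch for two types only (int and double).
   Dispatching on (A) + (B) uses the usual arithmetic conversions to pick
   the wider type when the arguments are mixed. */
static int    max_i(int a, int b)       { return a > b ? a : b; }
static double max_d(double a, double b) { return a > b ? a : b; }

#define max(A, B) _Generic((A) + (B), \
    int:    max_i,                    \
    double: max_d                     \
)((A), (B))

int main(void)
{
    printf("%d\n", max(3, 7));    /* both int: picks max_i, prints 7 */
    printf("%g\n", max(2, 4.5));  /* mixed: (A)+(B) is double, picks max_d, prints 4.5 */
    return 0;
}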
The case of typeof was already discussed for C99, and it was rejected. In the ISO C99 Rationale document (section 6.7.7), it is written: "A proposed typeof operator was rejected on the grounds of insufficient utility." Don't forget that most of the C committee's work is spent resisting feature suggestions.
The opening sentence bothers me. “Rob Pike reports that Dennis Ritchie also has passed away.” (Emphasis mine.) As though he’s just some kind of footnote in light of the death of Steve Jobs! Both Jobs and Ritchie were “I don’t care if it’s impossible, I’m doing it” types, but I feel that Ritchie contributed more to computing as a whole, while Jobs’s innovations were mainly in user experience.
Being a "can't be done" person is easy, being a "I'll do it" person is hard... but it's so much more fun and liberating. True story.