Wadler's critique of SICP (1987) (kent.ac.uk)
80 points by puredanger on Aug 4, 2011 | 50 comments



As others have mentioned, it's really a critique of their choice of Scheme rather than Miranda (a proto-Haskell).

As somebody who did most of the first chapter in both Haskell and Scheme, I think it's crazy for Wadler to not mention the downsides of a strict type system; it's listed only as a positive.

The main errors that I faced in writing my programs were type errors, where I knew what I wanted to do and spent upwards of several hours figuring out the magic incantations to convince the compiler to do what I wanted it to do. They were not type errors in the sense that the compiler was preventing me from making bugs, but rather type errors in the sense that I had to figure out a whole bunch of things about the type system in order to try and convince the compiler to accept my program.
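
(To give a flavor of what I mean, here's a classic small example of that kind of fight. It's illustrative, not from my actual exercises:)

    -- My first attempt was `average xs = sum xs / length xs`, which
    -- GHC rejects: `length` returns an Int, while (/) wants both
    -- operands at the same Fractional type (Double here). The logic
    -- was never wrong; it just needed an explicit conversion:
    average :: [Double] -> Double
    average xs = sum xs / fromIntegral (length xs)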

There's a heavy cognitive load to programming in Haskell, and I hit many walls where I just had to Figure It Out™ before I could compile my programs.

(All this was as an experienced programmer. Maybe it would be easier for a newbie not used to dynamic languages!)


So you did the first chapter of a Scheme book in Haskell. I.e., in a language the authors didn't prepare you for. And you got errors you didn't expect, because the authors couldn't prepare you for them.

I think what happened to you is pretty straightforward. The same would happen with Scheme if you did some chapters from Learn You a Haskell.

Also, the Haskell type system is one of the most logical out there. Most type systems are either built around some weak logic with many exceptions (dynamic languages, C++, Java, C#) or built from a limited number of cases (Ada, VHDL, Verilog).

The Haskell type system incorporates a logic (some kind of constructive logic, I think). You actually learned that logic while "fighting the compiler", and in doing so you had your first sessions with a "proof assistant", which is what the Haskell compiler/interpreter effectively is.
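
(A tiny illustrative sketch of that correspondence, with types read as propositions and programs as their proofs:)

    -- Modus ponens: from "a implies b" and "a", conclude "b".
    -- Writing a total function of this type is proving the proposition.
    modusPonens :: (a -> b) -> a -> b
    modusPonens f x = f x

    -- Conjunction is a pair: "a and b implies a" is just fst.
    andElimLeft :: (a, b) -> a
    andElimLeft (x, _) = x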

So I think you're complaining about nothing.

Not that I'm stopping you, just my thoughts. ;)


I'm not trying to complain! I'm just saying that the type system is A Thing that people need to learn, and I think that a rigorous type system can be a hindrance to learners. As such, I think Wadler should have mentioned it as a possible downside of his hypothetical world where Abelson and Sussman used Miranda in SICP, even if only to debunk it.


"SICP in Miranda" would prepare reader for type system. So the reader of that imaginary version of SICP wouldn't encounter those roadblocks. And those who read Learn You A Haskell don't encounter ones.


I challenge both of those assertions. I haven't done LYAH, because I was learning Haskell before it existed, but I did nearly every available Haskell tutorial when I was learning.

As soon as you step out of bounds, you find strange corners of the type system. It's quite a complex beast, and many of the error messages are cryptic at best.


I had two colleagues who did LYAH and one who attacked a program with type-level programming without LYAH (because LYAH didn't exist at the time).

The first two didn't ask me or my colleague about anything type-system related very often. After a month or so of on-and-off work on his project, one of them finished a complex translator from a CPU description into VHDL.

The one who studied Haskell without LYAH did ask us. I attribute that to the complexity of the program he was working on.

My experience leaves me convinced that proper exposure to the Haskell type system greatly lessens the associated burden.

I should note that we didn't encounter many errors in our Haskell code. Far fewer than in our Java or C# code.


An alternative solution: I'm thick! It's consistent with the evidence.


I'm curious whether you think the type errors really were a net negative. I spent a fair amount of time with Haskell, and it certainly took a while to get used to working with the type system (type inference is awesome, but it can also make for some mind-boggling errors). In particular, I wondered whether your Haskell programs Just Worked once you got them to compile?

The payback for grappling with the type system is usually confidence that your program will run without silly errors (though you can still make silly errors (hello, cut-and-paste)). I'm now spending a fair bit of time in JavaScript/Node.js and Python land, and I need lots of testing to make sure that my program won't hit some silly error sometime in the future (<- perhaps this means I'm not a good dev).


I have no idea! I was just solving SICP problems, so part of the issue for me was that, once the program worked, I was done.

I had to do the hard part, but not really gain the benefits of the type system.


When I started studying programming, one of the languages we used was a dialect of Lisp. It didn't make much sense to me at the time, because it only allowed me to do boring things, not cool things. Programming only started to make sense when we switched to C. I did some simple tasks first, then paired off with a classmate, and we implemented a simple raytracer for a summer project. Graphics is neat.

So I'm skeptical of the idea that programming education needs to be even more hardcore than Scheme and that we need to use a lazy functional language instead. I think the right way to teach programming is to give students a simple procedural language (perhaps not even OO) that makes the computer do cool things, like graphics or webpages. The ones who are curious about more hardcore stuff can always discover it afterward, like I did.


I think it comes down to whether you want to teach programming or computer science. If it's the former, just teach them to POKE with BASIC. If it's the latter, I think you'll quickly find yourself in want of a more sophisticated language in order to teach any of the non-trivial concepts.


I think that's more a libraries thing than a language thing. For example, you can do very cool, HTMLy things using Racket's web-server/servlet and Scheme continuations, which would be incredibly hard to do in C.

But I agree with your suggestion that C should be taught first. If anything, because C's "computational model" is very simple: you know what executes when; it encourages a more common, procedural way of thinking; and it's easy to do side effects in single-threaded C.


IMO what you call the "computational model" of Scheme is also a poor fit for beginning web programmers. The most straightforward model is exemplified by PHP: put a file on the server, type its path into a browser, get the page. Change the code, refresh the page. I'm not defending the language design choices PHP made, but it nailed rapid development and deployment. Every environment that requires a recompile or restart feels like a huge step backward in comparison. I want to see my changes instantly, dammit, we had it in 1995, why can't we have it now? :-)


You can do it in Lisp... even in a long-polling application you can change a class definition and all instances are immediately updated without restarting the image or anything.



Well, forcing me to do Lisp when I was busy figuring out the graphics card registers to do cool things could have turned me away from IT altogether. But I also think that once you get to the point where SICP is taught (basically university-level CS), you have to do the hard things.


The preface to SICP (http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-7.html) says it was an entry-level course and introduced many students to programming.


Wow! If this was intended to be an introduction to programming, then what would a special/advanced topic in programming be?


Operational & denotational semantics, for example...

Dana S. Scott. Outline of a Mathematical Theory of Computation. Technical Monograph PRG-2, Oxford University Computing Laboratory, Oxford, England, November 1970.

Dana Scott and Christopher Strachey. Toward a Mathematical Semantics for Computer Languages. Technical Monograph PRG-6, Oxford University Computing Laboratory, Oxford, England, 1971.

Gordon D. Plotkin. A Structural Approach to Operational Semantics. Technical Report DAIMI FN-19, Computer Science Department, Aarhus University, Aarhus, Denmark, 1981.


I don't want to sound off-topic, but I liked your suggestions; they're great, actually. Can you please provide me with some suggestions on the following topics? An introduction to programming language design theory. Computer science mathematics, both introductory and advanced topics.


If you're interested in books, I like these.

Intro to programming books:

Matthias Felleisen, Robert Bruce Findler, Matthew Flatt, and Shriram Krishnamurthi. How to Design Programs: An Introduction to Programming and Computing.

Daniel P. Friedman and Matthias Felleisen. The Little Schemer. Drawings by Duane Bibby; foreword by Gerald J. Sussman.

Programming language principles & implementation books:

Daniel P. Friedman, Mitchell Wand, and Christopher T. Haynes. Essentials of Programming Languages.

Shriram Krishnamurthi. Programming Languages: Application and Interpretation.

Matthias Felleisen, Robert Bruce Findler, and Matthew Flatt. Semantics Engineering with PLT Redex.

These are all by pretty much the same group of people (but distributed across the US: http://racket-lang.org/people.html) so the books all go together really well.


Out of curiosity, have you read "Types and Programming Languages", and if so, what is your opinion?


I haven't, but I should. Everyone I know who's read it says it's great.


Intro to Computer Science, not programming. Different subjects.


I almost agree with you that programming can be approached from a Scheme mindset; it can be really mind-opening. But I think teaching programming from a C++ or Python perspective can be (almost) equally great. You can do many cool things here and there. It all depends on the level of the course. I mean, if you are going to take an advanced course in programming language design, learning Scheme, Lisp, and other languages can be of great help; but if you want to give an introduction to programming, to people with no programming-related experience, then C, C++, and Python are very good choices, I think.


C++ is an immensely complicated language. I don't think it's a good idea to throw beginners into that.


When I was in uni, I got my introduction to programming in C. And it was the most fun and mind-opening course of my first year. So I think it's subjective.

I agree that C++ is a complicated language, especially when you use Boost or Tools.h++, which can make things even worse. But if you teach beginners to code in C++ and only use the STL, I believe it would be an interesting and challenging course. Plus, in an entry-level course, you only need to introduce the basics of programming, which are (in an abstract sense) general logic principles and language-agnostic.


C is actually a relatively simple language and a good choice for beginners.

I agree that you need to teach general logic principles which are language agnostic, but if you do this in the context of C++, the beginner is bound to waste a lot of his time fighting irrelevancies in the language.


I get your point. I agree with you, actually.


Sorry, you seem to be agreeing with the opposite of what I was trying to say in my comment :-)


Note here that the language used (Miranda) was perhaps the most influential language in the design and implementation of Haskell. The motivation for Haskell was mainly to explore the non-strict semantics of Miranda in an open environment with an open specification.


That is true.

There was a lot of programming language research in the 1970s and 1980s that is not really used in mainstream programming languages these days.

A good collection of this material can be found in Simon Peyton Jones's book "The Implementation of Functional Programming Languages". It's probably out of print (my uni library has a copy), but it's freely available on the net; here's the link: http://research.microsoft.com/en-us/um/people/simonpj/papers...

It's worth noting that since the Miranda days, Haskell has added type classes (which can do some OO-style stuff and more), a sensible and efficient solution for IO that also works well in parallel, and a kick-ass compiler and runtime system.
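
As a minimal illustrative sketch of type classes doing interface-style dispatch (the Shape/area names are made up for the example):

    -- A type class is an interface; types opt in by declaring instances.
    class Shape a where
      area :: a -> Double

    newtype Circle = Circle Double            -- radius
    data Rectangle = Rectangle Double Double  -- width, height

    instance Shape Circle where
      area (Circle r) = pi * r * r

    instance Shape Rectangle where
      area (Rectangle w h) = w * h

    -- Works for any Shape; the compiler resolves the dispatch.
    totalArea :: Shape a => [a] -> Double
    totalArea = sum . map area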


Also, Haskell is slowly approaching a dependent type system via type functions (type and data families).
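
A minimal sketch of an associated type family, assuming GHC's TypeFamilies extension (the Container class is illustrative):

    {-# LANGUAGE TypeFamilies #-}

    -- Elem is a type-level function: it maps each container type
    -- to its element type at compile time.
    class Container c where
      type Elem c
      empty  :: c
      insert :: Elem c -> c -> c

    instance Container [a] where
      type Elem [a] = a
      empty  = []
      insert = (:)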


This critique is a bit unfair to Scheme, given that his first couple of examples are used to show how cumbersome it is compared to Miranda. Normally you'd have a fold function defined and use that instead:

    (define (sum x) (fold + 0 x))
Writing it this way is more idiomatic Scheme, and isn't really cumbersome.
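
For comparison, the definition Wadler prefers is essentially the same one-liner; here's a sketch in Haskell, standing in for Miranda:

    -- Called sum' because the Prelude already exports sum.
    sum' :: Num a => [a] -> a
    sum' = foldr (+) 0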


I have never seen this sort of definition in Scheme. Can you provide a pointer or three to show that this is 'idiomatic'?


See section 1.3 of SICP. It talks about the motivation and usage of higher-order abstractions in general, and also about fold specifically, which it calls 'accumulate'.


Ah yes, thanks; we discussed it several times over on the Arc forum (e.g. http://arclanguage.org/item?id=10791).

I was asking more about code in the wild that uses accumulate instead of the naive recursive formulation. That's typically what we mean by idiomatic.

I see foldl often in Haskell, but rarely in Scheme. I'm not disagreeing that it's a great idea, just with the claim that this is how it 'would have been written'. Personally I find the naive recursive formulation plenty clear.


Maybe you've never seen it in Scheme because Scheme's standard library is so anemic. Of the usual FP higher-order list-manipulation functions, it only provides map! And people are understandably reluctant to have everyone write their own differently named library functions. The situation has probably improved since the list-manipulation SRFI (SRFI-1, I think), as SRFIs are quite popular and available in most implementations.


If Jamie Oliver critiqued El Bulli, it wouldn't mean he was advocating for Burger King. Miranda != Java.


It's a critique of Lisp as an educational language rather than of SICP.


Yes, and for good reason. I have seen suggestions elsewhere on the internet that SICP be taught in Python and other languages, and have even seen "ports" of parts of the book to these other languages. But SICP is by now part of a family of textbooks from MIT courses (CLRS and Walter Rudin's Principles of Mathematical Analysis come to mind) that have been used and perfected and picked apart and cited for so long that they are now accepted as the "right way". In essence, this is why UC Berkeley's decision is such a sad one. SICP is a classic, and I would go so far as to state that it is beyond critique, in the sense that it does what it sets out to do to the point of perfection.


Honestly? It's the things you think are beyond criticism that most need to be criticized.

(and especially SICP, which inspires such a dogmatic belief in its advocates that it really ought to just be considered a religious text at this point)


No, not really. SICP doesn't inspire any sort of dogmatic belief whatsoever. And I think it would be more appropriate to discuss the subject (SICP itself) rather than what its advocates say about it.

Simply put, it's more than an excellent book in the world of computer science. It is concise, clear, straightforward, and challenging. And I think a text like that deserves the interest and respect of every computer scientist.

Respects.


Here's the statement I was replying to:

I would go so far as to state that it is beyond critique

No matter how excellent a book someone might think it is, that's not a healthy attitude to have, and is essentially dogmatism.


Thanks for clarifying your point. I think this matter is subjective, so there's nothing more for me to add.


Wadler's critique seems to be that Scheme is too imperative for his tastes and that programming ought to be more like maths.

In my opinion, comp sci is (or ought to be) about computers. And the underlying hardware of nearly all Turing-complete computers is imperative: there's a CPU which executes instructions one at a time, and which can vary which future instruction is to be executed based on the results of previous ones. I get the impression that Wadler, on the contrary, regards comp sci as a branch of mathematics rather than engineering.


You're not going to turn shlubs into Einsteins by forcing students' heads under water next to a calculus textbook.

I've suffered at the hands of "CS is all math" profs. I was in a course that subjected a bunch of CS-101 students to this kind of thing. A whole semester of "Here's a proof that this ten line program sums some numbers correctly."

A collective "Huh?" from the class.

It became crystal clear, after nearly everyone failed the mid-term, that things were not working. The profs held a "come to Jesus" class, where they explained everything: all the proofs, all the ten-line programs that added or multiplied numbers, all the assignments full of Greek symbols they made us learn. It all finally made sense, and you could see the lightbulb blink on just above everyone's head:

/Yup, they're trying to fuck us./

[I survived this course because I wrote my own functional language interpreter, then just typed their damned calculus into it for the programming assignments. What a crock.]

I take a real dim view of teaching "mathematically pure" CS to beginning students. You don't start riding a motorcycle by doing experiments with gyroscopes in a lab; you listen to an instructor, get on the damned thing, and learn to ride in a safe environment. Save mathematical purity for after you get 'em hooked.

I've heard: "Oh, but if we have them program in BASIC we're doing irreparable harm!"

Horsepucky. I'd rather have an ex-BASIC hacker at my side, helping me ship, than someone who can't write a hundred line program without dragging in declarations that nobody will be able to understand a week later. The best engineers I know weren't hurt by exposure to BASIC, or assembly; they were good at everything.

I guess I'm just a greasemonkey with a keyboard.

[I love Scheme, btw. I don't get functional languages that think it's okay to ditch the programs-as-data paradigm.]


>The best engineers I know weren't hurt by exposure to BASIC, or assembly; they were good at everything.

I wish I had spent my time with assembly language on something more fruitful, like Coq (which was being developed during my assembler-using days).


You conveniently forget the (ever-increasing) non-determinism in current (increasingly) parallel hardware.

There you cannot count on a single CPU executing instructions one by one; you have to leave engineering and do some math. You have to prove the absence of deadlocks and race conditions.


Or you could teach Java/Assembly/C++/[insert least favorite language] first so that when the student finds Ruby/Python/Scheme/[insert favorite language] then s/he is amazed at how cool programming can be.



