I think a better alternative would be "Don't Make Students Use Java". I seriously can't think of a worse language for training beginner/intermediate programmers, and I certainly wish my schooling experience had consisted of something else. From day one you're introduced to magic on top of magic: "Open up <IDE> and click 'new project' and name it 'MyProj' and now click on the file in the side menu that says 'MyProj' and inside `public static void main(String args[]){}` inside the `public class MyProj{}` write `System.out.println("Hello World")` and go to the top bar and hit 'Run' and then 'Hello World' will appear in the integrated console, ah, programming!"
And you don't even begin to grasp what all those steps you just took in the IDE did until you're at least a year in, and you don't learn what all of those magic words even remotely mean until at least halfway into your second semester.
I felt like every single thing I ever did was magic, never having a clue why it actually worked the way it did, and as a result I nearly dropped out. The only time things _finally_ started to make sense was when I got to Computer Architecture/Organization with MIPS, followed by some work in C.
I still feel like C# is a better Java. It has most of Java's features plus some of its own, it implemented important features before Java did, and it doesn't reach Java's level of verbosity and bloat.
I used to feel that way a few years ago, but my opinion has shifted over time. C# has its own issues with verbosity and bloat. For example, although I appreciate the readability increase from overloaded comparison operators, it's a lot of boilerplate to write a class that implements IEquatable<T> and IComparable<T> and overloads all of the related operators.
Here I disagree. I would not start teaching programming in a manual-memory-management language (though that is how I started), because memory corruption is really hard to deal with and completely beside the point when learning the basics of programming.
I do think it's important that every programmer should know how to work with raw memory and pointers, but it is a good subject for an intermediate class, after you have the basics of what you are supposed to do while programming.
For a good "what programming really is" class, mobile app development and some web dev would be great. First a mobile hello world, Kotlin preferably. Then they can whip up a very basic backend (it just spews out rows from a DB), again in Kotlin. And then they can slap an admin UI on top in HTML.
And then over the next few years it would be great to dissect this. CPU, ALU, ASM, raw memory, network basics, IP (BGP ~ distance vector routing), TCP, buffers, syscalls. RDBMS, MVCC, B-trees, merge sort, heap sort, and so on.
Oh well, one can dream. But usually students are just bombarded with completely disconnected courses.
I was thinking of higher education. I think it's far more useful to teach algorithm design first and the nitty gritty of computer implementation later. This is, after all, how the field actually evolved. We had algorithms and programs far before we had the first computer.
Java is great for exactly that reason. It gives students an introduction to compilation, runtimes, stacks and heaps etc. Perhaps you had a poor learning experience due to who/what was teaching you rather than Java? My second comp sci course in college (first if you count high school AP credits), taught me all of this, and prepared me for the "full details" of MIPS and C.
My main complaint about Java as a pedagogical language is that it's just so big, with so many complexities around semantics, compilation, runtimes, etc. So my fear is you end up spending as much time learning Java as you do learning how to build software. I wouldn't choose C# or C++, either. My sense, from working with others who came out of CS programs that relied on Java, is that they end up only really knowing how to work in Java, and therefore only really knowing their way around the ways of solving problems that come most naturally in Java.
It could be my chauvinism speaking. I'll admit to having some. My CS program taught us three relatively small programming languages (Scheme first, followed by C, and then MIPS assembly) in the first year before introducing OOP in the second year. So we ended up getting exposure to a variety of programming paradigms and environments, and came away with a lot of tools in our belt, and a fairly deep understanding of programming language theory and paradigms. And I did eventually find myself settled in a Java shop. When I did, I found that within only a couple months I had already developed a deeper understanding of the language than colleagues who had spent their entire careers working in it. And I believe a lot of that is because I had enough background knowledge to distinguish between, "This is how programming is done," and, "This is how programming is done in Java."
Pass by reference / pass by value is so syntactically batshit insane in Java that it shouldn't even be considered as an intro language option.
And before someone points out that, actually-technically, Java is always pass by value, I will pre-retort that reference-as-value isn't an argument we should ever be having about an intro programming language.
Either teach a systems language with unambiguous pointers, or teach something that abstracts consistently.
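To make the confusion concrete, here's a minimal sketch (class and method names are made up): Java always copies the reference itself by value, so a callee can mutate the object you passed but can never rebind your variable.
class PassByValueDemo {
    static void mutate(StringBuilder sb) {
        sb.append(" world");              // visible to the caller: both copies point at the same object
    }

    static void reassign(StringBuilder sb) {
        sb = new StringBuilder("other");  // invisible to the caller: only the local copy of the reference changes
    }

    public static void main(String[] args) {
        StringBuilder sb = new StringBuilder("hello");
        mutate(sb);
        reassign(sb);
        System.out.println(sb);           // prints "hello world"
    }
}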
The only problem here is using the term "reference" for reference variables. If you would say "these are pointers to objects, but you can't actually do any pointer arithmetic", this would be a non-issue. The pass by value is not the problem here.
C and Pascal and even C++ have a very simple model. An address of a variable is a pointer. You can actually understand how memory is laid out on the hardware and why you would want to pass references or pointers instead of copying very large data structures.
Learning C first would also introduce students to all of that as well, outside of the magic shell of Java and its tooling; get your compiler, get your text editor, and you are so much closer to the action. If I could have chosen what I started with I would likely have picked C, or possibly a LISP as well.
I went through Basic -> Pascal -> C -> C++ -> many other kinds of languages.
Thanks to C and C++ I came to understand the most important foundations of programming: how a program runs, how hardware works, and the most important algorithms and data structures.
I think it's important to learn something that doesn't do too much abstraction but isn't as low level as assembly language.
The point is, it's a terrible first language because it's impossible to actually understand what's going on even in a basic "hello world" program until halfway through the semester. So inevitably you have to start them off by saying "here's a bunch of noise that you don't understand, but that's OK, just copy and paste it" which is a terrible habit to reinforce.
You need to understand
* access modifiers (public/private)
* classes
* static methods
* void return values
* the main() function
* qualified names (System.out.println)
Just to understand a basic "hello world". It's too much for first-time programmers.
Actually, the hello world doesn't seem that bad to me. And I program mostly in Python nowadays.
class Test {
    public static void main(String[] args) {
        System.out.println("Hello world");
    }
}
And then
javac Test.java
java Test
The class acts as a kind of namespace here, you don't need to go into OOP concepts for hello world at all. The beginner will have to understand
* The class is in a file with the same name ¯\_(ツ)_/¯ and contains functions (maybe it'll help me later in organizing code?).
* Command line arguments (not strictly necessary concept but not too problematic either)
* Types - why is there String[] in the definition? (A must have in a statically typed language anyway)
* void - the function doesn't return anything (OK)
* public (not OK, some magic here)
* static (not OK, some magic here)
* Compilation vs. run step (a must have in a compiled language)
So IMHO there are just two concepts that could be thrown away. Or three, if you accessed CLI args from a library. But the meaning of public and static will come naturally when the students learn about OOP.
I think you can omit the "public"s in the Java hello world (thereby making everything package protected, and sneakily sidestep the issue), but I don't want to ruin this machine by installing Java on it to make sure.
To be super thorough, you would also have to understand:
- semicolons (and the related issue of newlines being semantically equivalent to normal spaces)
- naming conventions (to explain why it's not "System.Out.Println" and "Main")
You can in C#, but not in Java. You can make it an enum, I think, to shorten all that. But my Java golfing knowledge is a bit rusty by now, admittedly.
You are right about first time programmers. But should people come to university without any prior programming knowledge?
If someone applying for an architecture or engineering degree is required to have basic drawing skills and basic math knowledge, why shouldn't we assume the same for CS?
Because computer science isn't about programming? That's exactly why a simple and elegant language should be chosen, one that allows students to express the concepts that are taught. Java is such a horrible choice that it's laughable that it has enjoyed this much success at universities.
Yes, a simple language is better for teaching CS. CS is not just about programming, but I think CS is about programming too, and having a reasonable level of programming skill is useful.
If I was a teacher and I'm going to teach someone what a linked list is, or what a binary tree is, I can of course use natural language and drawing, I can use abstract algebraic concepts.
But the student should be also able to construct said structures and observe how they work. The same for any other concept.
Yes, and that's why a simple language is always preferable to Java. It doesn't matter whether a student has previous exposure to programming languages if the language can be taught in about two or three weeks' worth of class.
Seems like your complaint is against using IDEs early on and not Java. It's been a while since my undergrad, but we used Java and a text editor. We compiled with javac and ran the jars from the CLI.
Likewise. It hasn't been that long since I finished my undergrad (<5 years), and if I remember correctly, we didn't start using Eclipse until second year, and even then it was considered 'optional', i.e. as long as your code compiles and passes the test cases, it doesn't matter whether you use Eclipse, vim, Notepad, etc.
The upshot of this was that I didn't have a clue what Maven was until I started working, but you can learn tooling on the job.
It wasn't until the final year of my 4 year Software Engineering degree I used an IDE for the first time (Eclipse, actually) and I was totally disoriented, not understanding how all the magic was happening.
I consider the IDE's magic harmful, don't get me wrong, and perhaps you were taught better than I was, but you still have classes and related incantations out of the gate (I would prefer a non-imperative language as an ideal starter, but I'd settle for a procedural one over an OO one).
For absolute beginners, yes... Java is not the right programming language to teach. (I'd say Python is.)
But for intermediate/advanced classes I think Java is a must learn for every CS graduate.
1. While it is bloated, it packs so many modern and not-so-modern concepts that learning it makes you understand those concepts better. (e.g. it is hard to understand generics if your favorite language doesn't have them)
2. A lot of enterprise (and major tech) backends still run on Java. I think knowing it is a must to be competitive in the job market as a fresh grad.
3. Learning Java (and probably disliking it) will make you appreciate, and perhaps make you better at learning, other more nimble languages (e.g. GoLang and friends).
CS programs aren't there to let you make some easy money in the next few years. CS programs are there to provide a thorough, high-level understanding of computing, of which writing code is just a small part. Having a CS degree is much more than learning a language and a framework.
If one's goal is just to earn some money in the next few years, he would be better served by taking a Udemy course.
University didn't teach me how to code, I did that myself before going to university. What it did teach was the why's and how's of computing, without which I would be just a code monkey churning out code without any deeper understanding.
It is the same as the difference between someone who takes a course on repairing electrical appliances and electronics and the engineer who designs them.
I'd like to stress the second point. When I graduated in 2015 and did multiple interviews with enterprises, Java or .NET were critical. I don't know if it's the same now, but looking at job postings, I'd say it is.
It's definitely different if you work for an IT company, but not always.
It’s kind of embarrassing but I didn’t understand generics from my beginner CS course (taught in Java). It wasn’t actively tested on either. I only began to appreciate it when I took an online MOOC in OCaml on a whim.
Type erasure gets an unfairly bad rap. Yes, it makes reflection difficult sometimes. Yes, it doesn't play well with value types. But other than that, erasure is a very powerful concept.
My understanding is that the only value in type erasure is that it maintains compatibility with libraries compiled in ancient versions of Java. Are there language-level benefits to type erasure?
Absolutely. Erasure lets you write code that's actually generic, instead of code that appears to be generic.
For example, in C# IList<Foo> and IList<Bar> are two different interfaces that happen to have similar methods. Whereas in Java, List<Foo> and List<Bar> are the same interface. This means you can do things like this in Java:
List<?> list = ...
Object o = list.get(0);
To do something similar in C# is significantly more effort. You either have to duplicate all of your interfaces (IList + IList<T>), use `dynamic`, or use reflection to compile delegates at runtime.
A common complaint is "but erasure lets you add an integer to a list of strings". But as long as you follow PECS[1] rules you can avoid most of those situations.
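For anyone who hasn't run into PECS ("producer extends, consumer super"), here's a minimal sketch (the method and names are made up) of how following it keeps the integer out of the list of strings at compile time:
import java.util.ArrayList;
import java.util.List;

class PecsDemo {
    // src only produces T (extends), dst only consumes T (super)
    static <T> void copyAll(List<? super T> dst, List<? extends T> src) {
        for (T item : src) {
            dst.add(item);
        }
    }

    public static void main(String[] args) {
        List<Integer> ints = List.of(1, 2, 3);
        List<Number> nums = new ArrayList<>();
        copyAll(nums, ints);                        // fine: Number is a supertype of Integer
        // copyAll(new ArrayList<String>(), ints);  // rejected by the compiler: no T fits both sides
        System.out.println(nums);
    }
}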
I don't think either of your points has anything to do with type erasure, and everything to do with allowing generics to take value types. This was an easy decision in Java, since it doesn't (yet?) allow user-defined value types. But C# has had `struct` since the beginning.
C# creates one instance of a generic for all reference types. The general consideration in instantiating multiple versions is object size. All reference types are the same size, so not a concern. Value types, however, range wildly in size and so generally get their own specialized versions during code generation.
The same choice affects the ability to wildcard. C# probably could implement wildcards over reference types, but it would feel inconsistent without value types. And, honestly, a good portion of wildcarding in my experience is to handwave away the compiler when you know what you're doing without reified types. Simply not an issue in C# -- its stronger guarantees around generic types mean I can make that same code generic over the type I'm wildcarding in Java.
Code generation is definitely important to talk about, but that really wasn't the focus of my first example. Even if Foo and Bar were both reference types, the same reasoning would apply.
In C#, you can do:
class MyClass : IList<Foo>, IList<Bar> { ... }
In Java, you can't do:
class MyClass implements List<Foo>, List<Bar> { ... }
I know this is commonly viewed as an annoying restriction, but, IMO, it's rather an indication that you're writing code that doesn't respect the contract of the generic interface. For example, what should the `Count` property return if you're implementing IList<T> twice? (C# wiggles around this with explicit interface implementations, but I think it's fair to argue that that's not a strictly superior approach to Java's).
I really want to know what other languages large enterprises are using that have, or are on the track to getting, the kind of volume Java has. Especially languages that don't run on the JVM.
Here's an anecdote - Java almost made me drop out. My university was teaching Java and it was an absolutely awful experience. With my web 1.0 HTML/CSS and Turbo Pascal experience, I was blown away by how awful "professional" programming was.
Then I found this new kid on the block, Python, and it was a completely different story. I fell in love with programming and hacking in general, and finally, by the end of my studies, I knew enough Python to bootstrap myself into Java, which I still think is an abomination.
When I became a student in 1998 we used C and C++ for CS courses. C is a simple enough language, and low level enough to understand what your code does when running on the hardware.
Fast forward to 2 years ago: I did an MSc at the same university and most courses were done using Java. I guess that's due to the long arm of Silicon Valley.
Although Java was used for the teaching material, the teachers allowed assignments in the language of your choice if the problem permitted. I did most of the assignments in C#. One I did in Java because I needed a powerful search tool and the C# version of Apache Lucene was old. Other people used Java for assignments; one guy was using Python.
I have nothing against Java or using Java for teaching. I think it's better to use Java than Python since it has static typing, enables parallel computing, and is faster, being a compiled language.
However, I dread Java and I avoid it as much as I can. It feels verbose, boilerplate code is too much, it lacks some features.
C# feels a lot cleaner. Even though it has many libraries and lets you target almost anything and do all kinds of programming, it lacks the massive ecosystem of Java.
If someone would force me to choose between C++ and Java, I'd happily choose C++ if the task permits (i.e. I won't use C++ for web). And that says a lot since I am not very fond of solving the kind of bugs C++ almost guarantees you will have if the code is large enough. I'd rather solve C++ bugs than deal with the boilerplate and bloat and verbosity of Java.
That being said, I don't blame Java or the people who designed Java and the Java libraries. Almost all old systems have their share of problems, and Java is kind of old now.
Newer programming languages and their ecosystems solve some of Java's problems, but if they still exist in 20 years, we'll see that they have brought their own baggage of problems.
I guess a perfect programming language is like a unicorn: we strive to catch it even though we know it doesn't exist.
What matters to me most these days is productivity, the speed with which something decent can be brought to market. By decent I mean something somewhat testable, somewhat easy to understand in the future, somewhat extendable and somewhat maintainable.
There are always going to be trade-offs; making the right trade-offs based on the particular problem and resources is an art.
Not to mention those classes would make good programmer candidates start to hate programming.
I can't imagine why people start teaching teens with C and Java and make them think programming is boring and annoying. Absolutely worse than not taking the class at all.
What do you ask teens to accomplish with C? And all the memory management and pointers? It's the worst way to enter programming. It needs to be taught with something that makes it fun to do.
My undergrad was a while ago, but we also did assembly and Prolog first. The lecturer basically said assembler was there almost to shock us into seeing how preferable the magic was. I still don't understand why Prolog.
I struggled with the Java, but got my teeth into it on my own time to the detriment of my marks in other modules. Eventually it clicked enough, I got a degree and walked into a career where my skills have been in demand for 15 years. I'm pretty happy with that, I'd go so far as to say Java was by far the best part of the course for me.
I was taught through Java but it was all command line and javac.
I did once sort out a much better programmer's final-year project for him though - someone I worked with years later - because he had a CLASSPATH issue. That did make me realise there are people taught the way you say who miss some of the basics.
I teach Java to first-semester students and I tried a couple of times to switch to another language because I think it’s not a good language to start with and because the students don’t like it either. However, the business (and we are a private university driven by businesses sending students) builds on two primary technologies: Java and COBOL. So even if I did teach another language, I would fail to prepare them for their careers in engineering.
What I do instead, though, is talk about the architecture and background, show different approaches, and let them make the mistakes that help them learn to solve problems.
I disagree with this. As much as I dislike Java, I believe it's a great language to teach someone object-oriented programming. And you don't really need an IDE at all; any text editor will do, and it will probably take away the complexity of learning the UI and help you concentrate on what actual programming is. An IDE is really not needed to write your first algorithms as a student.
> ... it insulates the student from the javac command line program, and the command line environment itself
Such a misguided article. The Java language has (quite literally) nothing to do with `javac`. In fact, there's a lot of other compilers out there. Taking a Java class should not focus on the intricacies and weirdness of Java compiler command line interfaces. It seems pretty amateurish to argue in favor of it. Case in point: back in high school, I learned C++ using a Borland compiler (and I haven't touched `bcc` in the 15 years since then, even though I've written plenty of C++ in the meantime).
> ... imports and file naming requirements.
These, again, are (compiler) implementation details and should not be part of a generalized Java curriculum.
> And of course, mathematics is the foundation of computer science.
Stuff like this can be misleading and even though not completely wrong (discrete math, logic, metalogic, mathematical logic are sorta' kinda' similar), it's just wrong enough[1] to lead people astray.
A lot of my peers in college were very bright and could write great code, but they were absolutely useless as developers because they didn't know any tooling. They couldn't compile, run, test, or source control their code if the professor didn't set it up for them.
Yes, tools change, but the knowledge from one tool is almost always transferable. Once you're comfortable on the CLI and understand the concept that Java source is compiled to bytecode and then run on a VM, it's easy to switch to a new compiler.
If you only want to know the absolute bare minimum, go to a boot camp. People go to college to learn, and being comfortable with tools is an essential part of being a developer that students need to learn.
Agreed. The least effective developers I’ve had the displeasure of working with never understand how their code executes end-to-end. And the best ones have a deep knowledge of it.
This is no fluke! If you don’t know how a classloader works with the Java classpath, how to set various JVM flags for operating the JVM, etc., then I’d argue that you’re an amateur, and I wouldn’t trust you to write correct code in a production setting. It’s critical knowledge to know how your code runs, and if you don’t know that, then stay away from production systems. If the goal of these courses is to teach students enough about Java to use it in a professional setting, then knowing how classfiles are produced, bundled, distributed, loaded, and executed by a JVM is critical knowledge.
More succinctly, if you don’t know how your code makes it to a production server and gets executed, you’re not ready to work on it.
(And if the goal isn't to teach students how to use Java in a professional setting, then why are they learning it at all? Use a teaching language so students can focus on the concepts instead of Java's idiosyncrasies.)
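To make the classloader/classpath point concrete, here's a minimal sketch (the class name is made up) of the kind of thing worth having run at least once outside an IDE:
class LoaderDemo {
    public static void main(String[] args) throws Exception {
        // Walk the chain of classloaders that loaded this class.
        ClassLoader cl = LoaderDemo.class.getClassLoader();
        while (cl != null) {
            System.out.println(cl);
            cl = cl.getParent();
        }
        // The classpath the JVM was actually started with (-cp / CLASSPATH).
        System.out.println(System.getProperty("java.class.path"));
        // Loading a class by name fails with ClassNotFoundException if it isn't on that classpath.
        System.out.println(Class.forName("java.util.ArrayList"));
    }
}
Run it with something like java -Xmx256m LoaderDemo and you've seen a JVM flag in action as well.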
eh, I understand the sentiment - but it's a little overkill. Junior programmers can be code reviewed and taught these things over time, and also be wildly productive in a production setting.
Sure you can teach junior programmers all kinds of stuff that they should have already learned or should know how to learn, I do it all the time. I'm arguing that this is crucial information that even entry-level engineers should know. If they don't, then I give them a link to the relevant documentation and politely ask them to read it. It's a prerequisite.
Are there any books or articles that go through the things that you’ve mentioned like how the class loader works etc? Most of the tutorials or books I have seen do not even touch this aspect and I want to get better at it.
Oracle documentation, typically. And the Javadoc for those classes.
But I don't think it is such a key to anything. It is the sort of stuff that is easy to learn when you need it, easy to forget, and not useful most of the time. It makes sense to read once you're done learning other things, but not for beginners.
Isn't Eclipse a piece of tooling? I mean, I think the best argument against teaching Eclipse is that IntelliJ is much better, imo, and is more widespread than Eclipse at this point. Students should have a module/class/whatever about command line tools as well. I think this comparison is comparing two things that aren't comparable: an IDE vs. learning the CLI. I also think you contradict yourself in your last sentence. Teaching an IDE in class should help people be comfortable with their tools, but at some point they might need to know what the tool is doing under the hood.
> but at some point they might need to know what the tool is doing under the hood.
The problem is the IDE makes it so easy to never have to look under the hood that, when people finally take a peek, they get overwhelmed and quickly return to the comfort of the IDE.
However, in the real world of production environments there will be no IDE holding your hand, and as such you can't get away without a good understanding of the engine under the hood.
I haven't really played around with Java since the 90's, and I bought a Java book and tried hopping back in. It was really frustrating because the book uses 8 and I downloaded 9, and Eclipse was slightly different, something with modules blah blah.
Plus downloading the actual Java SDK was confusing, the versioning, it not being free anymore...
In particular, command line compilers have been used virtually the same way since the first command line interfaces, so knowledge of javac is definitely transferable.
I agree that many developers are not very adept with tooling, but is that because they don't use the CLI? Or is it just that they aren't interested in those parts of software engineering? Eclipse is a tool too, and not a simple one either.
The author is specifically talking about students. The emphasis at this stage plays a key role in whether they'll have that interest or not. Knowing is a prerequisite to developing interest.
GUI tooling almost entirely uses the command line layers under the hood, so it is strictly more complicated while hiding more of the real process - the worst of both worlds.
My point about Git is an example of this. Teaching Git is hard, partly because Git's command line UX is bad, but layering the complexity of, e.g., EGIT on top doesn't really remove most of that and makes it harder to teach (because expressing what to do in a GUI is strictly harder than communicating text commands, among other things).
EGit doesn't call the git command line application, it uses a Java implementation of git. Although I can agree it's debatable whether they have improved on the command line interface or not.
IDEs are convenience layers, compilers are essential abstractions - if you don't know how it works underneath the GUI layer you will get stuck really fast and won't be able to troubleshoot when you eventually run into the inevitable problems with the IDE, build, or compiler.
What makes the CLI interface to the compiler more essential of an abstraction than a GUI interface to the compiler? Just that it is simpler, or that it's been around for longer? A GUI isn't necessarily just calling CLI apps in the background, it could be using programmatic interfaces too.
It could be - but in the majority of cases it isn't - and you will need to fall back to the CLI and UNIX abstraction layers when the IDE breaks. That's the way it's been done since forever, and if you want to work with existing (and probably future) tooling and be able to solve problems when they happen in the layer above, you will need to learn the interface that exposes all the functionality to the tools under the hood.
> It could be - but in the majority of cases it isn't
I don't think that's really true, at least for the Java ecosystem which is implicated here. Eclipse does not call out to command line tools for most things, as far as I've seen
> What makes the CLI interface to the compiler more essential of an abstraction than a GUI interface to the compiler? Just that it is simpler
It's not simpler! It's way more confusing to figure out that your makefile told your compiler to run a command with some unknown set of obscure options that are causing an error than to just have the IDE pop a dialog and tell you in plain words, or better, not even let you set an option incorrectly in the first place.
The CLI is more explicit than a GUI. In a GUI, compiling / running / testing your code is an afterthought that happens at the click of a button.
In the CLI, you have to be explicit about what actions are taking place. You have to physically type in what command you want to run and the different arguments to it.
The convenience of a GUI is fantastic, but the abstraction makes it a very bad learning tool. Being able to open up a terminal and fix the inevitable git / build / configuration errors is a very valuable skill that developers should have.
I don't see the distinction. I started my career using command line tools and am completely comfortable with them. Yet for more recent projects I stay within the IDE (usually Visual Studio).
It doesn't matter whether I tick a checkbox for compiler warnings or add that option to the command line. The only difference is the IDE makes options easier to discover.
It's the same for Git. A decent GUI gives you a much better visual picture of the state of your working copy.
I’d agree that some focus on that in CS programs would be extremely valuable. Maybe things are better now, but when I went in 2008-2012, git wasn't even mentioned and basically all tooling was learned on your own.
I haven’t taken an introductory CS course since the mid 90s, so I don’t have enough knowledge to know whether I agree or disagree with you. What was the base level of knowledge like when you were in introductory courses? Back in the day, you couldn’t guarantee that everyone could even turn a computer on, so for loops were a second midterm thing.
Edit - I should mention that my last sentence was the perspective of 18 year old me. I was 18, but in my defence I was 18. :)
The point of the article is the very concept that the compiler and the Java runtime environment are not the same thing (which an IDE obfuscates to some extent) is what he wants students to learn early on. (And details like that). This leads into a lot of important concepts around how programming languages are seen by the computer.
> ... how programming languages are seen by the computer
Learning a programming language has nothing to do with how it's seen by a computer. That's a different class (a class on compilers, maybe). You can even turn Java into Javascript if you want to[1]. The fact that the compilation/transpilation flow here is Java -> Java Bytecode -> JS is meaningless in the context of learning Java.
>Learning a programming language has nothing to do with how it's seen by a computer.
Learning the syntax of a programming language might not have much to do with how it's seen by a computer, but it's important if you want to learn how to use a language properly. If you don't know how Java (or any language) works fundamentally, the code you write may work, but it'll likely be more inefficient both in terms of program and programmer speed.
Debugging sessions will be more frustrating and take longer and generally the quality of the programs you write will be lower compared to programs written with a language's quirks and peculiarities taken into account. Which you learn by understanding how a language works.
I'm not saying they should be able to write a Java compiler or interpreter, but they should have a decent idea how they work at least fundamentally.
It's harder to learn this when it all happens at the push of a button. You never learn to appreciate that button because you've never had to do it yourself, and if something goes wrong with that button for some reason, you won't understand how to fix it or work around it.
But learning Java is only useful in the context of programming computers, for which you need to know practical things like files and compilers, which I think is the point of the article.
You have completely missed the author's point. The issue is not "how can we help students learn to code" the issue is "how can we help students understand the systems that they are coding on since that affects how they code"
Knowing Java runs in a VM very much affects how you have to code. Sure, when you are learning in a 100/200 level class it doesn't, but by the time you graduate if you don't understand the implications, that is bad. Why not make a very simple change early on that gives context for students later on? We shouldn't be teaching students that Java runs in a virtual machine in a 200/300 level class, by that time we should have moved on to why that matters.
Forcing students to continually use java/javac will make them start learning the concepts on how a computer actually processes their program. It makes them think a bit more about it and the concepts are not foreign when they learn it in a full class. The IDE just turns the whole process into a single button with little context.
It is a way better way of learning the concept through practice, rather than having students memorize it for a multiple choice section on an exam, then forget it.
As a former student who had to use Eclipse, I think the IDE dramatically slowed my understanding of what was actually happening in my projects. A command line and a few commands, however, really make things simple and clear.
The only things you need an IDE for are autocomplete and refactoring, and in a small project like you have in school, it’s simpler to use a text editor imho, and you can fully understand your project that way too.
> Such a misguided article. The Java language has (quite literally) nothing to do with `javac`. In fact, there's a lot of other compilers out there. Taking a Java class should not focus on the intricacies and weirdness of Java compiler command line interfaces.
Eh, javac is the reference implementation and the language specification makes several references to it; it's also by far the most widely used. It's also good to understand what goes on behind the scenes in your IDE, because the IDE is even more removed from the language than the compiler. C++ is somewhat different because it does not have a reference implementation and has a somewhat more diverse set of implementations. Though I think it's still very useful to know your way around the gcc or clang command line.
> These, again, are (compiler) implementation details and should not be part of a generalized Java curriculum.
Imports and naming are not implementation details... they are defined in the specification.
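For example (a minimal sketch; the package and class names are made up), the naming rules the parent is talking about show up as soon as you put a public class in a package: the file has to be named after the class, and the directory layout has to match the package for javac and java to find it.
// src/demo/Greeter.java -- a public class Greeter must live in Greeter.java,
// and package demo must sit under a demo/ directory.
package demo;

public class Greeter {
    public static void main(String[] args) {
        System.out.println("Hello from demo.Greeter");
    }
}

// javac -d out src/demo/Greeter.java
// java -cp out demo.Greeter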
I think there's great value in learning to use a lightweight editor, a compiler or interpreter, the command line, etc.
I also think that Java is perhaps the worst possible choice for that.
It's a language created with the "build once, run anywhere" mentality that tries to abstract away the system, it's the poster child of languages married to an IDE, and it's verbose to the point that forcing people to write Java code without autocompletion features probably violates the Geneva convention.
At this point in time if you want your students to learn about tooling it would be more useful to teach them to use chrome's JavaScript console than to teach them to use a Java compiler through the command line.
>verbose to the point that forcing people to write java code without autocompletion features probably violates the Geneva convention.
I'm happy to use just emacs. The naming conventions make it easy enough to remember. I've never been a fan of heavy IDEs. Maybe that is just me as a solo developer/entrepreneur, but I don't think that I am the only one.
Once you decouple yourself from the IDE and all of the bloated enterprizey (ahem spring) frameworks, it is actually pleasant to work with. Most of that stuff is superfluous when you have a command of the environment.
It's a regular issue: the more layers between you and the actual tool (the JVM), the less you will need to master, but at the same time the first thing you'll learn is to forget how to use javac or any compiler and just import project configs or let tools infer stuff for you.
In my personal case, learning about classpaths the hard way was my only path to salvation. Otherwise I was blindly tweaking IDE project configs without any idea what was happening.
I think there's a bit of personal preference here. As a student I hated IDEs, because I couldn't tell what was actually happening. To this day, I avoid using an IDE or other tool for git/version control, because I can't tell what it is doing.
There's more typing with a text editor and javac, but it's much simpler to understand IMHO.
All of this must be understood in the context of first year programming. For my class, there were no libraries that one needed, and at most 10 or so java files to compile. The classpath would never need to be more than the working directory.
I thought it was obvious that the author means javac as a stand-in for the general idea of compiling and running code from the command line rather than having an IDE do everything for you. Her arguments apply to any Java compiler.
Once students finally graduate they might be unlucky enough to land a job as a Java developer. Here they will live and breathe continuous integration, where it's essential to write build scripts that can be exported to other environments, and the only way to do this is to use command line recipes.
The thing is, early in their study, students are learning that compilers exist at the same time that they are learning a language. They should become familiar with a compiler - and, in the case of Java, with the fact that it generates class files that get passed to a JVM - to inform them of the state of the art of tooling architecture, or even to help them imagine that they could change or swap out some of the pieces.
Using the tools directly serves to reinforce all these concepts.
They can get to know other tools for that same language like in your Borland example, sure. But if you didn't know the compiler existed, you would be in trouble. And in C you will also want to understand the preprocessor and linker.
I'm not sure I agree much with this. When I first started learning Java I had six or seven years of programming experience and I still found dealing with matters of classpath and javac remarkably complicated. I had things down well enough with the tools I used before, but it all seemed so foreign in the Java world.
And to this day I find junior devs struggle with the same things unless they have a maven/gradle project all set up for them.
I'm all for folks learning these skills eventually, but in a first-year CS class I'd really rather students worry about the language and general concepts of writing software.
Lots of JavaScript tutorials start off with opening a browser's JavaScript console and typing out some simple stuff, and I don't think people would tell them they need to jump straight into node.js and webpack.
All the other intricacies of writing systems will come to them but I find something uniquely wonderful about sitting and writing code, and if we're lucky we'll find students who can have that same experience. I'd rather meet them where they are and talk purely of the language and core concepts, and introduce them to the ugly details as their progress dictates.
> And to this day I find junior devs struggle with the same things unless they have a maven/gradle project all set up for them.
Personally I’m still shocked that learning the common build tools for whatever language is used for a CS course is not the standard. If you’re teaching Java you should be teaching Maven; if you’re teaching C++ you should be teaching CMake. Not to the “I can write my own plugins/macros” level necessarily, but at least to “I can create a project that will build without my specific IDE or needing to recite compiler arcana”.
Nobody in their right mind works without these things, and while knowledge isn’t going to directly transfer from one to the next you should at least have enough to know where to look in the documentation for “how do I specify dependencies”, “how do I tell it where to find my source”, etc.
For a first year course where you aren’t pulling in the kitchen sink the IDE project format works well enough, but if you’re entering the workforce you should know damn well how to create a basic project in the standard tools of your language of choice.
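To be fair, the day-to-day surface of Maven is small; a sketch of what "knows the basics" might mean (these are standard Maven goals, and the directory names are Maven's defaults):
mvn compile           # compile everything under src/main/java
mvn test              # run the unit tests under src/test/java
mvn package           # produce the jar under target/
mvn dependency:tree   # show where each dependency comes from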
> If you’re teaching Java you should be teaching Maven
no you shouldn't, because the premise is invalid. A university student shouldn't be "learning java". They should be learning programming, and theory of computing (and algorithms etc), and perhaps use java as the language. None of this requires maven, build tools or any tool chains beyond some unit testing framework (and GUI framework if displays are necessary).
Don't teach "industry standard tools" to a uni student learning CS.
I would not expect a new graduate to know maven, or know the intricacies of the spring framework. That's something to be learnt on the job. I expect them to be capable enough to learn this on the job - given that they're well versed in the theoretical aspect of computer science. It's easy to explain maven's core by telling them that it's a directed graph of dependencies.
A student that just learns the "industry standard tooling" needs a bootcamp, not a uni degree.
CS students should be learning those fundamentals by programming, and so they should be taught enough tooling to program without tripping over themselves all the time. Programming without a build system, a version control system, or a reasonably intelligent editor is unnecessarily difficult and gets in the way of the concepts being taught.
> Programming without a build system, a version control system, or a reasonably intelligent editor is unnecessarily difficult and gets in the way of the concepts being taught.
Programming with a build system, with a VCS, and/or with an editor with intellisense features will unnecessarily burden the student with learning these extra things. Vanilla Java, taught using a plain text editor (like pico, or Notepad++ with its simplistic syntax highlighting), while making sure the students type out their own imports, etc., is going to teach them more of the basics.
Until the day you need to build actual working software to ship to people, there's no need for a build system, and until they need to work in groups, there's no need for VCS. And until they start writing something _very_ complex, like a full game using many libraries, they don't need intellisense.
Most CS exercises fit in one class. And when it comes time to design something reasonably complex, the student would've learnt all the fundamentals (like 6 months in) and can move to using an IDE with little issue. And then when group projects come, the students can learn VCS as it makes group work simpler. But till then, showering first-year students with this tooling is just noise.
Compile errors due to misspelling and "what was that function called again?" expeditions are distractions. My department did us a great disservice by never even hinting to students (at any level) that tooling can make them go away. It was almost painful, working on group projects, to see my classmates struggling so hard just to move around the code. Being unable to seamlessly work your editor and navigate your codebase is just as crippling as hunt-and-peck typing. Is it worth a lecture? No. But a mention? Absolutely.
VCS is essential in a solo project of any real complexity because you will reach intermediate milestones, go off on tangents that don't pan out, and then desperately wish you had a way to get back to the working state you had hours previously. A true VCS with branching and merging is technically overpowered, since you could just make copies of the source for each "commit," but you may as well just use Git.
Build system is not about shipping to people, it's about running what you just wrote on your own machine when it's bigger than one unit of compilation (C file, Java class, whatever). You don't need anything fancy, can just be Make or a shell script, but building and running the whole project should take no more than a few keystrokes. Kids should be fumbling with GCC flags for maybe one assignment, not 4 years.
On the very first day of programming, all you need is a REPL. But once students are doing programming projects of nontrivial size, they should be invited to try incorporating the tools that make them manageable.
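Concretely, something like this two-line script (a sketch; the output directory and entry-point name are hypothetical) is about all the "build system" a small student project needs:
# build.sh: compile everything in the current directory into out/ and run it
javac -d out *.java && java -cp out Main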
It’s also a pretty decent representation of stuff they will be expected to learn and deal with on the job. I can’t imagine programming without dependency/build tools, and yet I also often struggle with them. It seems dishonest and impractical to me to only focus on core concepts. Students should also be exposed to real life!
>A university student shouldn't be "learning java". They should be learning programming, and theory of computing (and algorithms etc), and perhaps use java as the language.
What is the real world difference between "learning java" and "learning programming, theory of computing (and algorithms etc), using java as the language"?
Surely to learn all that shit (using java as the language) you at some point have to, you know, learn java?
Was about to reply to GP similarly, but after rereading it - I think they just expressed the sentiment badly. Note the single sentence they quoted in the reply - about teaching Maven.
Pretty sure they're reading "learn java" as "learn the language + the common build tools + more", instead of just the language, as we are.
Almost all beginners absolutely need to be taught the language first (so they have something concrete to mentally latch on to), not concepts and algorithms. However, Maven is not one of those things - either a GUI with a compile button or "javac" is sufficient at that level.
> Don't teach "industry standard tools" to a uni student learning CS.
This is the critical point. By the time a curriculum is in place around one set of "industry standard" tools, new tools will have come out, in a lot of fields (see, for instance, `npm` and `yarn`.)
> They should be learning programming, and theory of computing (and algorithms etc), and perhaps use java as the language.
I would argue that using Java as the language for introductory CS courses is not a good idea. We have better, simpler languages to teach foundational functional (Racket, Scheme) and imperative (Python) programming.
There's nothing wrong with Java in general, but if your goal is teaching "programming, and theory of computing (and algorithms etc)", there's nothing that Java can do that Python can't that helps with that goal.
In fact, I'd give my optimal progression of languages as starting with Python for basic imperative and OO programming, or Racket for basic functional programming, then switching to the other one, and then looking at Kotlin (or Java if you really must) for advanced OO programming and C (or, in a few years, maybe Rust) for systems programming.
> ...if you’re teaching C++ you should be teaching CMake.
I've been programming very advanced C++ for close to 20 years now.
Never had any reason to use CMake.
Plain old make for when you just want to build something without expending effort and brain cells, or (nowadays) nix if you're putting together an industrial-strength build environment.
CMake is the worst of all worlds, and provides no benefit unless you really want to build Windows exe's with Microsoft's proprietary tooling. (And let's face it, Windows is a legacy dying platform here in 2020.)
Actually, I agree for the Java world here. Thanks to the design of Java and many Java libraries, you cannot write even slightly more complex Java code from memory. Even importing all those nested OOP libraries is complicated without an IDE. For me Java is really a language that is impossible, or too time consuming, to master without a good IDE.
Having worked for many years in the Python world professionally, I find people are still amazed that I'm essentially using more or less a normal text editor. An IDE is optional for many languages, but for some languages it's a strong requirement.
When I was teaching myself to code I started writing Java in Notepad. Not Notepad++, just Notepad.
And it was as painful as you'd expect, put me right off Java (most of my learning was in Python), and didn't really teach me much that I now use as a developer in a JVM shop.
Java is special since the whole compile/link/run loop is relatively slow and there's no REPL. In Nodejs or python you are expected to just run your code. Any error like a bad import/require statement you can correct quickly and re-run. Maybe even add/remove print statements while at it. And it's not just dynamic languages! You can have the same experience in Haskell.
Especially if it is their first experience writing code. Until a student learns to tolerate the discomfort of "the machine isn't doing what I want for some inscrutable reason", it makes no sense to throw them into package-management errors.
> I still found dealing with matters of classpath and javac remarkably complicated.
I started my programming career writing C and C++ using the cli and a text editor.
Some years later I ended doing some Java and to me I found it quite easy as I could see similarities with things I had learned writing C and C++.
I could see that the classpath and jar files were roughly equivalent to the concept of libraries and the libpath found in the C and C++ linker (only much easier to use), and that javac was roughly equivalent to the gcc or g++ command line compilers.
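Roughly, the mapping looks like this (a sketch with made-up file names):
# C/C++: tell the toolchain where libraries live and which ones to link
gcc main.c -L./lib -lwidget -o main
# Java: tell javac/java where classes and jars live
javac -cp lib/widget.jar Main.java
java -cp .:lib/widget.jar Main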
> I'm all for folks learning these skills eventually, but in a first-year CS class I'd really rather students worry about the language and general concepts of writing software.
This is a big part of why I recommend that another language should be used to teach introductory level courses. Java has too much additional complexity. That complexity has benefits in production, but not for first-year undergrads.
I completely agree. IDEs obscure the behavior of tools and languages. It's the last thing a student needs. I've been talking for a while about writing a course that teaches programming in 2 or 3 languages side by side, to basically do the opposite of this. The specific quirks of languages, or the things they obscure with magic, melt away when you look at languages side by side. I wish universities would teach this way.
That's a different argument, and a much stronger one.
"Universities should offer a class on comparative programming languages and language tooling." is much different than "Intro-to-programming classes should not have a strongly-recommended IDE"
I was referring to the second part of his post. Also, this class is not introduction to programming; it is upper-division and considered one of the tougher classes in terms of workload.
The assignments aren't trivial. Take a look at them:
I agree. The intro class should use whatever IDE & config will most enable them to teach the fundamentals.
Understanding the details of packages and javac and other things should be left for a later class which can put them in the context of other languages.
Oh I had a very interesting class in eng. school about programming languages paradigms. About subtle differences between different implementations of the same concept. Ada, C++, Java, C# (could've gone to functional languages but the lesson was painful enough with these 4). Type-erasure, explicit/implicit/partial template instantiation, visibility rules. And then design patterns, in all those languages. How a singleton makes no sense in a language that's not 'full' object and has 'packages' (not the java ones), how GC/RAII/controlled-types change the patterns. It was both a course on 'using language concepts correctly, picking them carefully for a specific need and not 'to avoid writing 3 more lines of code' and also a lesson in 'how can you paint yourself in a corner and prevent yourself from handling the simplest of requirements change'...
Very interesting to see how our mind is shaped by the tools we use.
My small liberal arts school has this as well. It's a great idea but, at least for me, it was hamstrung by the professors being too reliant on their IDEs.
I can see the merit of this, but as an anecdotal counterpoint my first programming class was Fortran 77 using gedit (in the late 2000s). The delayed feedback of whitespace and syntax errors at compile time was painful, slowed down my learning process and distracted from the principles we should have been learning.
I carry this thought process over into my current life. When I start with a new technology/language/etc. I try to start barebones command line, text editor.
I transition quickly into more visual / abstracting tools, but that first foray into the guts really helps ground me with a better understanding of what's actually happening.
As someone who started with C and then made a really hard effort to become an expert C programmer, when later in life I was introduced to Smalltalk and Lisp, I was angry that C had warped my brain and I could only think of programming in the context of a physical machine. I feel it would have been much better to go in the opposite direction.
That sounds wonderful! A real overview of the programming landscape, and definitely a good start assuming they follow my other points as well. I'd love to take that course.
University programs exist to prepare workers for the industry, as a general goal. A subset of students are either already hardcore or will become hardcore developers, cum computer scientists. How should a university 'optimize' their pedagogical approach so that all students are maximally rewarded by their education?
There is no question in my mind that you are correct, in principle. But I wonder if what you propose would in fact result in higher attrition rates. With the current approach, the majority benefit from an IDE that reduces the 'cognitive surface' of the "programming environment", and for those who prefer or gravitate towards the underlying layers, no one will prevent them from firing up e.g. Vim and maven.
That said, yes, don't make students use Eclipse. Make them use IDEA or VS something. /g
It is in my country. We routinely attend jobs fairs put on by their CS departments alongside every other software dev firm, we take on interns in their 2nd CS year, etc. etc.
Hell, they're even learning version control (mainly Git, some SVN) in CS degrees now, which was a nice change to see start coming through in the grads we interview.
Although we're seeing universities now start offering Engineering degrees in software with a far more "real-world" requirement including mandatory internships and an industry sponsored research project.
University programs as I know them prepare the enrolled for a career in science and are typically far removed from concerns of practicability or applicability.
Not hugely, although the current Labour led coalition is trying to make them more widely used. They used to be common place, I'm not really sure what happened.
And even then, we only have them for the trades - plumbers, sparkies, joiners etc.
I can't help but feel like this article is less a condemnation of IDEs in introductory teaching and more a condemnation of java in introductory teaching.
It feels like the use of Eclipse or IntelliJ in intro CS courses is so common in large part because Java has such relatively complex tooling and language conventions. I've never seen someone intro Python with an IDE, because it's extremely easy to use Python without one. The same goes for C, and indeed all of the intro C classes I've seen have gone without an IDE. Java, though... you really could spend an extended part of the course just getting everyone to set up their first file and use the compiler correctly, if it's the first language.
> I can't help but feel like this article is less a condemnation of IDEs in introductory teaching and more a condemnation of java in introductory teaching.
It is both, but I started with the IDEs because they are the chicken of this chicken and egg problem. Java is too complex, so schools reach for IDEs, which just makes things more complicated, et cetera.
If a standard development environment could be devised that hides the incidental complexity of Java without hiding the essential complexity of file systems, etc., I think it would solve the problem just as well as teaching Python.
I don't see how Java tooling is any more complex than any other language's; they all have learning curves. Even Python has complexities around venv, requirements.txt, Pipfile, etc. C's tooling in particular is probably the worst: you have to learn the compiler, the linker, and possibly a third-party tool like CMake or Meson.
Using an IDE with Java is a choice, just like picking PyCharm is for Python.
I agree with you in general, but here I am discussing introductory CS programs. It's very hard for an intro CS student to build and run a Java program without an IDE, even with help from the professor. It's completely trivial for that same student to write and run most of the Python programs they'll be doing in their intro class after having installed _just Python_.
I feel like this kind of makes the point though... while all of this tooling does exist for Python, it's very easy to use Python without any of it at all. So it can be relegated to more of an advanced, 'production-ready' topic, e.g. in a software engineering course. Indeed, I've never seen an intro course in either Python or C that touches on any of this tooling, save one professor who distributed a standard makefile for his intro C course - but I'm honestly not sure why. It usually suffices just fine to have the students call gcc themselves, the projects they work on in the first year aren't big enough to make this cumbersome.
While that's partially true of Java as well, it's much less true. More of the tooling around Java is de facto required.
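For concreteness, here's roughly what the minimal non-IDE workflow looks like in each case (a sketch only; the file names are hypothetical, not from the article):

    # Java: the file name must match the public class name (hypothetical Hello.java)
    javac Hello.java      # produces Hello.class
    java Hello            # run by class name, not file name

    # Python: no compile step, no naming constraint (hypothetical hello.py)
    python3 hello.py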
This can fuck right off. My elitist university decided that if it wasn’t pure C built on a Linux box, then it wasn’t worth using. Consequently, I never learned how much more enjoyable and quick programming could be until I used VS at my first job. Even worse, I was a VS novice, struggling to perform even the most rudimentary of debugging techniques, despite being fairly fluent in GDB/Valgrind.
I had the opposite experience. My university decided that they would standardize student development environments on vanilla Visual Studio, put everything that students would ever want into a single header (I swear it was something like "header.h") that you copy/pasted into your project folder, and that was it. We didn't even use the terminal.
It was tedious as hell to learn everything else in real-world development, like static versus dynamic linking, include/linker paths, make, system libraries, and so on. As in, neither I nor my classmates had any idea those things existed after two semesters of C++ programming courses.
The moral here is that education needs a degree of breadth. Teaching CS students involves three concepts - Computer Science, Writing Code, and Writing Software. The dependency graph between those topics on what knowledge and tools are needed contains cycles.
I'm sorry you had that experience, but you'll note that this isn't what I'm actually suggesting; my suggestion is Python or another dynamic language on any platform so long as it's standardized and accessible, for which I give the example of Ubuntu, but I also mention that Windows with Powershell is an option.
I'm laughing right now because ... I mean, I really thought this was going to be saying "Don't make students use Eclipse (because it's so inferior to the JetBrains tooling they'll use as professionals)". My university had us in the command line plenty for C--that's no reason to pretend that that world has anything to do with how Java is written. It's the same "up by the bootstraps" 70s LARPing fantasy that makes people think that assembly will Teach Computational Thinking Skills that will somehow confer the ability to handle distributed systems. If you want students to be able to come out able to effectively use tools, they have to come up using them.
> The issue is, as Kevlin Henney is fond of saying, “Software is nothing but the details.” When students don’t understand what a file is, or haven’t ever edited text in anything but Microsoft Word and don’t realize they can edit code outside of an IDE, they will not be able to do the crucial work of self-directed learning that is a hallmark of all computer science success.
Not sure why the aggressive comments here, but I agree with this. An IDE for learning programming ends up shielding you from realities that you should probably understand before you use an IDE.
People who don't know what a file is need to take a computer literacy class before they can jump into something like programming.
An IDE is fine to hide programming-related magic for a beginner, but if it needs to hide things like a filesystem, file extensions, navigating through advanced options in simple software like Word and internet browsers, or how to use Google to help diagnose technical issues, that person is in way over their head.
> Most importantly, though, it limits the ability of their peers to learn. If a 300-level software engineering class which budgeted a week to teach basic version control skills has to take a two-day detour to teach the Windows users how to get rid of the CRLFs in their commits, and teach the Mac users to remove the .DS_Store files from their repositories, and get everyone set up in Eclipse EGIT, that’s wasted time. If the professor has to schedule time with students outside of class to demonstrate their code because so many students aren’t able to submit their code in a form that successfully runs on the professor’s computer, that’s wasted time for both professors and students, and it undermines important lessons about portability and good practices.
This might be an argument against a particular IDE setup for a class, but it isn't an argument against IDEs in general. Instead it is an argument for providing a config file to make these problems go away.
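To make that concrete, here's a sketch of the kind of setup a professor could hand out on day one, using standard git mechanisms (the exact settings here are my own guess at a reasonable choice, not something from the article):

    # commit a .gitattributes so git normalizes line endings for everyone
    echo "* text=auto" > .gitattributes
    # keep macOS Finder metadata out of the repository
    echo ".DS_Store" >> .gitignore
    # Windows users: check out CRLF locally, commit LF
    git config --global core.autocrlf true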
> When students have only ever programmed in Java using some bespoke learning library provided by their professor, it will take them much longer than necessary to figure out other languages, other libraries, and other approaches.
The article doesn't support the case that an intro class using an IDE prevents students from learning anything else. The class doesn't set people back--it merely fails to move them forward in that way. I agree that it is important to teach this eventually. But it should be through teaching it deliberately. (EDIT: Someone else posted a link to a class which compares different languages and their tooling side-by-side. That sounds fantastic.)
I have yet to see a class which can do a good job of teaching how package management works to a degree that the student can then confidently debug weird setuptools errors.
The author brings this up as a reason for using an IDE:
> This is valuable in an introductory course, as it avoids wasting class time and lowers the barrier to entry
and then proceeds to argue why other things are more important than that.
I don't like Eclipse or other IDEs, having watched my fellow students get stymied by basic issues that I could resolve with one or two relatively simple instructions on the command line. I think IDEs can be a crutch that programmers ought to learn to do without. Learning what the "real" tools are will pay dividends—quickly too.
That being said, I think it's important that the barrier to entry be low. Java almost necessitates the use of an IDE. There's so much "enterprise" cruft involved in doing the simplest things that, unless you have an IDE to hide it, you're going to confuse beginners.
A better choice of language and tooling can help here. I think starting off with something like Racket would be best. Those who need an IDE (out of laziness, familiarity, whatever) can use DrRacket, and then switch to running their programs from the command line shortly thereafter with next to no transition overhead.
I agree with everyone saying you shouldn't use an IDE as a crutch... but as a seasoned Java dev, it is probably my main exception to the rule as well. I would just recommend IntelliJ over Eclipse for beginners, but it's better than nothing.
> Ultimately, my core belief is this: Students need to know how to use computers before they can program them in a serious way.
Nonsense.
If we went this way, young people would get completely turned off from programming in the first hours.
Show them Python, JavaScript, Scratch. Get them to display something on the screen, anything.
Get that spark.
And once they're hooked, now you can start showing them more details about the wonderful world they have just uncovered.
The author of this article needs to spend a solid five more years thinking about computer science and education because right now, her writing is just dangerously naive.
I think that this is mixing up two different skills.
The first skill is learning to program, and in particular learning to program data structures, or algorithms, or whatever. For this you don't care at all how your code builds, you care about correctness and easy to use testing infrastructure, and an IDE like IntelliJ or Eclipse is going to be almost frictionless for that. Teaching command line necessities together with this provides nothing of value for this skill. If I'm trying to get an algorithm right, I don't care at all about how my code builds, I care that it is easy to hit the "recompile and run tests" button. I also think that Java is a decent choice of language for this.
The second skill is learning how to build, version control, package and distribute software, and for this you really need to learn the common tools used in the industry. Which in addition to the IDE, is use of the Unix/Windows command line, git, compilers, Makefiles and build systems, and so on. Here you really need to care about the details - for example, such a project could be to take a previous project and package it so that it runs on a computer without Java/Python/etc installed.
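As one purely illustrative version of that packaging exercise on the Python side, assuming a script like the `my_program.py` mentioned elsewhere in the thread, something like PyInstaller would do:

    pip install pyinstaller
    pyinstaller --onefile my_program.py   # bundles the interpreter with the script
    ./dist/my_program                     # runs on a machine (same OS) with no Python installed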
I think the problem is that many universities never have a course on this second skillset, and so while for teaching the first skillset they choose appropriate tools (in my opinion), they leave this second skillset behind completely. But I think the debate is more nuanced than IDE = Good or IDE = Bad, I think we should carefully figure out what skills should be taught, and what the best way to teach those skills are, keeping in mind that for a lot of first-year university students this may be their first introduction to using anything other than a web browser, office programs, and video games on a computer.
> I think we should carefully figure out what skills should be taught, and what the best way to teach those skills are, keeping in mind that for a lot of first-year university students this may be their first introduction to using anything other than a web browser, office programs, and video games on a computer.
This is absolutely spot on. I used IDE Good/IDE Bad mostly as a way to get people to think about this, and clearly it worked :)
Yes, I’ve thought about it before, since at my university they started on an entirely-eclipse sort of syllabus, and then overcorrected massively to command line. A bunch of new CS undergrads were driven off because they thought that programming was mostly memorising arcane command line invocations just to get simple code to run, and that just doesn’t seem right to me.
Out of interest, your original article pointed out the potential use of repl.it as an alternative to Eclipse and friends, but isn’t it still the whole build-in-a-box experience?
The "stop using IDE _____ so that students can learn how the computer works" has been around since at least Turbo Pascal and Turbo C. In my own career, my toolchain seems to change radically every 3 years, and the transition time is measured in days to weeks. It's not that deep. Neither is bringing a new grad or new hire up to speed on whatever tools we're using.
I've worked with enough really good developers who were lost when not given an IDE. Most of the time, it takes a few days to teach "here's how to build at the command line" and here's how to use git (followed by the eventual day lost to finding a way to break git completely). The IDE centrism seems especially common in Windows and Java shops (and every Java shop I've worked with was using Windows).
I tend to agree, but I wonder how many in the "IDE bad" camp would agree that in order to obtain your driver's license, you must pass the driver's test on a manual transmission?
Since I got my driver's license over a decade ago, I've had to parallel-park less than ten times. I've probably changed a flat tire more often.
I think you would find that the vast majority of drivers in the US don't parallel park even once per year. Furthermore, an increasing fraction of new cars can park automatically.
The chief value of having a road test, in my mind, is to force most teens to learn to drive in a structured program that introduces rules of the road and safe habits. Parallel parking is rare and not dangerous, so we shouldn't waste the limited hours that teens spend with professional driving educators on it.
> I think you would find that the vast majority of drivers in the US don't parallel park even once per year.
In the UK, I had the exact opposite experience. Immediately after passing my test, I was parallel parking daily, and I've frequently been in situations where parallel parking is the only way to find a parking space. IMO it _would_ have been dangerous had I not learnt how to do it and become confident doing it. I'd likely have been too distracted thinking about what I was trying to do rather than paying sufficient attention to my surroundings. I know I was when I was first learning, and I failed my first driving test because I didn't adequately check my surroundings before starting to reverse.
At least here in Sweden, manual and automatic transmissions have different driver's licenses, and a manual license lets you drive both. Since getting the license is pretty expensive to begin with (with no difference between the two), people tend to prefer getting the manual one.
How much of a difference does learning on a manual actually make? I learnt in the UK, where it's the norm to learn in a manual, and I know after about 5 lessons, it's not really something I thought about. It became habitual very quickly (other than _occasionally_ on steep hills, which took a few lessons longer), and it's something I picked up whilst learning other things anyway.
Why? I would expect manual transmissions to diminish in popularity over time. They're harder to use and don't give any benefits at all, other than a slightly cheaper price and slightly higher performance if the driver is knowledgeable.
An automatic transmission is so much better in every respect, and the performance improvements of a manual hardly come into play in everyday use.
I consider the ability to use manual transmission vehicles, of which there are many, a benefit. Last year I visited my aunt for a week and traveled around the state with her in her car. I couldn't share in the driving, because she has a manual transmission.
I think it might be more fair to say the "IDE bad" camp is comparable to folks that would say you shouldn't be able to press the "auto parallel park" button on your smart car to pass that part of the test.
Well, considering being able to parallel park is a really insignificant part of being a good driver, and the biggest complaint I hear on the internet is people apparently not knowing where the turn signal is on their car, perhaps people should be required to know more about the buttons on their car.
Agreed! I can't think of many professional trades where you aren't required to know certain fundamentals just because there are higher level tools available. A trained cabinetry carpenter, for example, likely knows how to make a fancy cabinet door without a CNC router, even if that's how they mostly do their work. I'd venture a guess that statisticians could calculate variance, t-score, f-score, z-score, etc by hand.
I'm not saying you can't be a productive professional without knowing lower-level fundamentals, though I don't think you can legitimately lay claim to being an expert in the field without them.
I agree that the file system is not emphasized enough anymore, and I've met a fair share of students who struggle with file paths.
Also, in the last section:
> or UC Berkeley (oh, I’m sorry, “Cal”)
A bit off-topic, but as a Berkeley student, I was a bit amused/confused here, what's the sarcasm in the parenthetical about? According to our official branding guidelines[1] both are acceptable, though Cal is mostly used in athletic contexts.
Strongly disagree about the IDE part. At so many places I've been it's been the complete opposite problem. Most developers are using notepad++, or at best vscode without any plugins or custom config.
When you ask them to put a breakpoint somewhere you get a puzzled look back: "what's a breakpoint?".
They don't even know that debuggers exist!! Instead they add printf all over the place, recompile, and rerun. Mind you, these are people whose CVs claim 5+ years of experience.
It's terribly ineffective. University should teach the fastest way to iterate on logic. Being able to pause your code, introspect the data, and then step through it slowly line by line is the best way to really understand and learn the theory of what is going on. Autocomplete is also a fantastic source of inspiration for what else is possible that's not covered in your lab handouts.
How to import libraries and configure your build system will come later by necessity, and will have changed by the time you start your next project. Knowing that debuggers exist and how to use them is more general knowledge, and it's something you have to be shown before you know to miss it (if Henry Ford had asked what people wanted, they would have said faster horses). Build systems and compiler flags can also be very challenging and uninspiring to understand before you even know what a linked list is. As someone else said below, after decades of experience I still have problems understanding the intricacies of CLASSPATH or PYTHONPATH; ask a student to get this right and they'll just blindly copy-paste the first answer from Stack Overflow so they can continue their printf-debugging session and finish their assignment before the deadline.
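For anyone who hasn't hit it, the CLASSPATH fiddliness shows up the moment an assignment depends on a single external jar. A hypothetical example (the jar name and layout are made up):

    # project layout: Main.java plus lib/json.jar
    javac -cp lib/json.jar Main.java
    # at run time the current directory has to be added back in
    java -cp .:lib/json.jar Main      # ':' separator on Unix, ';' on Windows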
"Software Build Processes" is now a field in its own right. There's not just a compiler. There's a build process, a dependency manager, a package manager, a source control system, tools for updating the files those use, etc. It used to be just "make", but now each language has some kind of package management system. Crates, wheels, eggs, containers, boxes inside of boxes - it's complicated.
This is all trade-school stuff, really. It's like learning how to wire electric power, with wire size, type, and color rules, fixtures, grounding, and in the US per the National Electrical Code. There's little academic content there; it's just a mass of detail. Not too hard if it's written down and the documentation is current. Which it won't be at many companies.
An argument against teaching the details in college is that, on the job, students will only need to know the one system they're actually using, and they'll need to know it in great detail. Knowledge about unrelated build systems doesn't transfer much.
Then there's the machine learning parallel universe, with notebooks and YAML.
I disagree that knowledge doesn't transfer much. The commands don't, but the concepts do. I cut my teeth on Maven, then later was able to pick up a JavaScript build system, and lately had to pick up sbt. And I think my knowledge, or at least wisdom, has compounded. Those previous battles seem to have given me an upper hand, or context, that other coworkers sometimes lack.
I basically agree with this. This is why, at the end of the post, I recommend Python in an Ubuntu environment, where the "build process" is a single step (`python my_program.py`), and packages can generally be managed with `apt`, a standard tool representative of a large class of tools that are likely to be used at some point by a large fraction of CS students.
Yes, virtualenv. A few years ago, I said that "version pinning" was not going to end well, because it would reduce the pressure on library developers to maintain backwards compatibility. Now something equivalent to version pinning is everywhere. Moving forward is tough when, three levels down, some package wants an obsolete version of some other package.
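For readers who haven't seen it, this is roughly what the pinning described above looks like in practice on the Python side (the package name is made up):

    # requirements.txt contains exact pins, e.g. somelib==1.2.3
    python3 -m venv .venv               # per-project isolated environment
    . .venv/bin/activate
    pip install -r requirements.txt     # installs exactly the pinned versions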
I used to spend a lot of time thinking about this while helping build software & TA'ing for Berkeley's introductory course (which uses Python & Scheme), and I largely agree with the author. If you're wondering why the author ignores large schools, it's because they have the resources to invest in this. Here's a sample of the efforts that Berkeley made for their introductory course [1]:
* Online tools to run & visualize code execution directly from the online textbook (ex: https://composingprograms.com/pages/16-higher-order-function...)
* Easy submission for students (no git in CS1, no scp, no printing)
* Instructor autograding for all assignments so the course staff could focus on reviewing code for style/other components.
* Automatically backing up student code
* In some assignments, there were automated hints for syntax, styling, and even correctness [2, 3]
* Online collaborative editing for partner assignments [4]
* Full time staff allocated to building teaching infrastructure to "reduce accidental complexities when using code to solve problems"
* Completely hosted environments (via Jupyter Notebooks or Scratch) for courses with a lot of non-majors.
If there's one thing that Eclipse doesn't do, it's insulate anyone. Half of Java Enterprise development involves various concoctions of Eclipse projects.
I'm not sure what the difference is between a student learning about Eclipse idiosyncrasies or .DS_Store files. One topic is as pointless as the other.
Deferring learning of implementation details is virtuous, and on the plus side, IntelliJ IDEs have state-of-the-art static analysis that shows the coder anti-patterns and helps them rewrite the code in a cleaner way, with an explanation.
Autocompletion and integrated documentation do increase discoverability and learning of the language, not the reverse.
I agree with the sentiment of this. I meet a lot of developers who cannot function outside of the IDE (in my case Visual Studio). As a consequence, they struggle when the IDE fails.
In the introductory classes I took, there was no IDE. We telnetted into a server and were taught Pico as an editor (although you were free to use vi or emacs if you wanted). You compiled using g++. You did everything via the command line.
Most of the problems the author is pointing out existed there as well.
The author is misguided in assuming that IDEs are the cause.
I used to do C, C++, Haskell, and PHP in Sublime when I was a student.
But when it comes to languages like Java, I greatly disagree about keeping students away from IDEs. Getting familiar with debuggers, stepping around, that's absolutely a necessary skill--unless you want to go in for stdio- or echo-based debugging (which is what I'd do with the languages above).
I guess the sentiment is more like "Don't make students start programming in IDEs first"
I get it. It shields them from understanding what's happening underneath. But not everybody is interested in or needs to know all the things happening underneath, and that's okay. If they're going to learn the whole stack, they will.
Even for someone who is interested in knowing what happens underneath: to succeed at something, it's important not to expand the scope of what you're trying to do too broadly.
I like the idea of starting in a browser. It is a more familiar way to get people used to the concept of text controlling behavior. The jump to compilation and executables might still be difficult, but it’s way better than students on their second year who only know how to run their programs by clicking a green triangle.
I thought this was going to be the standard Eclipse bashing and then found it was more fundamental than that.
In many ways Eclipse is one of the less problematic IDEs because it does expose a lot of the underlying warts... it's one of the reasons people dislike it (among others, to be clear - memory usage, less fluid ergonomics in many other ways). But one of the reasons I have stuck with it is that, like when you understand Git at a fundamental level it all "makes sense", Eclipse has a more direct relationship to the underlying mechanics than other IDEs I've used.
I like Eclipse, but now that I know my way about, it's much easier to knock out a small Java problem on the command line than to fire up Eclipse, setup a project, etc. Then there's all the magical metadata and whatnot that can get screwed up.
Much less mental overhead and fewer things to worry about on the Linux command line. Plus, I can do more with the command-line tools.
I think students in general will be more versatile if they start with the command line, and then graduate to Eclipse. That way, they'll realize what they lose and gain in an IDE.
Maybe Eclipse is too much IDE. Maybe advanced editors like VS Code hit the sweet spot. The tasks infrastructure there - as bad as it is - will force you to understand details. It also has a terminal.
I do not understand the Java bashing. Python is nice, but it is interpreted; it does not compile, link, or create executables. It misses so much of the compilation chain that it isn't a good language in the sense this post cares about either.
C++ has it all, but, well, the editors are crappy and pointers are hard for a student in their first lesson.
Java might not be the best first language to learn, but if you can't grasp Java as a college student, you won't survive the winter as a professional programmer.
I liked the approach Cambridge took for computer science (at least while I was there). Practical assessments:
* Year 1, term 1: ML, on a Windows machine, using Cambridge ML (for the practicals; the actual course, Foundations of Computer Science, used any Standard ML and recommended Moscow ML)
* Year 1, term 2: Java, on a Linux machine[1], using javac; Eclipse was introduced in another course
* Year 2, term 1: Further Java, on a Linux machine, using Eclipse; Verilog on a Windows machine with a DE2 Board; MIPS assembler using a soft processor (same board); plus one or both of:
* * C/C++, non-compiler specific, no recommended IDE/editor, must run against gcc -std=c99 -Wall --pedantic sourcefile.c or g++ -std=c++98 -Wall --pedantic sourcefile.cc (your choice) with no warnings
* * Prolog, on a Linux machine, using SWI-Prolog
* Year 2, term 2: Group project - group's choice (we used C on an Arm mBed board in whatever IDE that came with and Java in Eclipse)
Everything else (the bulk of the course) was either abstract/theoretical or used your/your supervisors' choice of tools - courses used a varying mix of pseudocode/ML/Java/C where useful, but you didn't necessarily have to use them outside of the lectures and there were no practical exams on these (someone I knew used Haskell for his group project and dissertation, and for testing concepts in supervisions).
The idea was to make you comfortable with the fundamentals and moving between technologies, and not be reliant on any particular set of tools. IMO, the contrasting concepts (Windows vs. Linux; functional vs. object-oriented; high-level vs. low-level; hardware vs. software) did this rather successfully. It didn't necessarily get you completely fluent in any particular language (you were largely expected to do this in your own time if it was a direction you wanted to pursue), but it did prepare you for jumping into pretty much any language/environment and getting up to speed quickly.
(There were also digital electronics and physics practicals in the first year, but they're a bit of a different thing)
[1] It gave you enough bash to run javac and manipulate files. There was a follow-up course the next year that went into a bit more depth on shell, and introduced make/Perl/LaTeX/MATLAB.
Honestly, I wasn't aware that current curriculums used any sort of advanced IDEs.
When I took courses in the late '90s and early '00s, from assembly to C++, Java, and Python, the focus was on minimal text editors and command-line compiling, linking, etc. Though for Python, IDLE was promoted, but that's pretty bare-bones, relatively speaking.
I wasn't aware that a shift away from basics had occurred.
I feel like people upvoted this because it said don't use eclipse but didn't read the article. Now I'm questioning my sanity - given a choice, do professional developers choose to use eclipse?
The question is somewhat loaded given the previous statement based on my own biases but I've never written java professionally so excuse my ignorance.
My uni was nothing like this, and I appreciate that. First year: everything was terminal-based, Vim was taught, javac was used by hand, as were nasm/ld/gcc. Only in the second year were we moved to NetBeans and Eclipse.
Harvard's CS50 does a great job at this. My first intro to CS was through this course, and I'm grateful I did it this way rather than wait till I was taught in the "broken" way at college.
My first-year CS classes went by SICP and were taught in Scheme. Of course, I went to a place that didn't much distinguish between comp sci and comp eng, so what do I know?
Yeah, I found Eclipse a pain to set up for Python the first time I tried. It kept throwing weird Java errors and GUI glitches. Abandoned that plan very fast.
>Using Java or Python in a professional IDE like IntelliJ IDEA, NetBeans, PyCharm, or Eclipse is not a good first introduction to programming for computer science students, whether they’re in the field to become web developers, systems software engineers, or academic computer science researchers. //
So "for computer science students".
Which seems like too major a thing to miss out of the title.
Also, how does webdev fit in here: seems like webdevs should probably be using IDEs from lesson 2. And I say that as someone who started their web content production path writing HTML in pico.
People often forget how few hours go into a typical university paper.
It’s not unlikely that a programmer with a full-time job could put in more hours in just one month than a student would for an entire introduction to programming paper.
It’s unrealistic to expect to be able to cover all aspects of programming in such a short time.
> First, in a Java-focused curriculum, it insulates the student from the javac command line program, and the command line environment itself.
I take exception to this line of thinking. Students should not be using the command line for programming. I believe that forcing students to learn to use command line tools is the reason why so much software has such horrible user experience. They learn that users don't matter unless they've steeped themselves in the arcana of each particular program and have practically become programmers themselves.
> Second, it catches some basic mistakes and allows the student to defer learning about the finnicky language requirements that aren’t deemed core to the curriculum, like imports and file naming requirements.
I had the exact opposite experience. At university, they sat us down at a Unix shell with almost no instruction on the shell or the compiler. We spent so much time learning the finicky shell requirements (with no clear understanding of which were part of the shell rather than the compiler itself or something else) that we were sidelined from understanding what the compiler was telling us or how it worked. You'd run your program and if it gave you an error 11, well, you just read and re-read it until you noticed something off.
Nowadays, with an IDE and a static analyzer, it can show me the exact path of execution that will lead to an invalid memory access before I've even run the code. Students can learn to find problems themselves because it's laid out right in front of their eyes in relatively understandable Java or C or whatever language they're programming in. They don't need to worry about, "Oh, when I compile on this machine, I have to specify the -I argument to the compiler for includes, but on machines I actually use in real life, I don't have to specify anything because it finds all the files I'm actually using because I put them into the IDE myself and can see their relationships."
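A tiny sketch of the include-path dance being described, with made-up file and directory names:

    # on the course machines the headers live in a local include/ directory
    gcc -I include/ -Wall -o assignment main.c
    # on a machine where the same headers are installed system-wide, no flag is needed
    gcc -Wall -o assignment main.c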
> it’s crucial to introduce these inconvenient details eventually.
Why? If I'm going to be downloading my IDE from the Internet or an App Store why the heck should I ever learn fiddly Unix commands? I'm writing GUI-based applications. I don't need to understand any of that crap if I don't want to. It's like saying you can't be a taxi driver if you don't understand how the carburetor works. Really? My job is to drive the car around, not to fix it. I know a good mechanic who can do that if the need arises. If they want to learn it, then all the better, but it shouldn't be a requirement.
> What they can’t do, unless they’ve figured it out on their own, is operate a computer outside of the confines of the IDE they’ve been taught.
On the contrary, I had no problem operating the GUI-based computers of my day. It was the bizarre, often contradictory commands in the Unix shell that baffled me. Their names were often stupid puns, their "help" pages did no such thing, and I wasted years learning a bunch of stuff that ended up not being very helpful in the real world.
> When students have only ever programmed in Java using some bespoke learning library provided by their professor, it will take them much longer than necessary to figure out other languages, other libraries, and other approaches.
That has nothing to do with IDEs. That has to do with a deficient teaching environment, and is the exact same problem I described above, which sounds like what the author is advocating.
> Teaching someone to use git is very difficult if they’ve never been taught that a file is a logical unit composed of bytes and metadata.
No, teaching someone to use git is very difficult because it was designed to be difficult because the self-absorbed creator thinks that people should have to suffer to become good programmers. It's just sadistic.
> Students need to know how to use computers before they can program them in a serious way.
I agree, and what the author is proposing sounds like the opposite of that to me.
> After the first foray into programming, take time to teach students about the UNIX command line.
Ugh. I disagree strongly with this sentiment. CS students need to get away from the mentality that Unix is the be-all-end-all of operating systems and programming environments and the quicker that happens the better in my opinion.
People who learn the tool alone are substantially unhireable.
When you hire a developer you usually need them to also be at least be something of an expert computer user, especially if they're going to be working on a small team without 20 other experts to micromanage them.
Over-reliance on IDEs leaves people as appliance operators who are lost when there isn't a magic button to press.
It isn't just an issue in the job market. "Programmers" who aren't sufficiently advanced computer users can't advance their own projects in academia either-- they're stuck working on precooked assignments. I suspect this is one of the forces that has resulted in academic research being utterly starved for competent software engineering (though the main one remains that competent software engineers can get stable high paying jobs, and academia provides them with neither).
Always good to get students to dig deeper, even if it's for their initial 1st/2nd semester, and especially if they're going to be programming long term. But ofc let them have the freedom to choose afterwards.
Truth. I work with plenty of engineers that have only written Java with Eclipse or Jetbrains. Many have a lot of trouble getting their app to compile locally when asked to do so upon moving the app into containers. Many also had issues even writing software upon trying to use something like Vi or VSCode.
At some point, the training wheels need to come off.
I agree, and it has been going on for a very long time. I have coworkers whose productivity (if you can call it that) would slide to zero if you took away Eclipse. If it isn't implemented in an Eclipse plugin, well, forget it. Setting up the deployment environment is so far outside of their capability as to be a joke. And these people have been out of college for over a decade.
> What they can’t do, unless they’ve figured it out on their own, is operate a computer outside of the confines of the IDE they’ve been taught.
Understanding what a file is and what a text editor is should be prerequisites for entering a computer science department. Someone who doesn't understand these concepts has no business learning to program.
At my "not MIT" school, it was expected that most incoming CS students understood some basic programming before they entered school.
A CS department that tolerates students who don't know what a file is probably has other problems.
> At my "not MIT" school, it was expected that most incoming CS students understood some basic programming before they entered school.
At Cambridge (UK), this is _not_ the case. They merely note that "some knowledge of procedural programming is useful"[1] and "[n]o prior knowledge of programming is required"[2]. The main thing they are looking for, in terms of qualifications, is mathematics, so you'd typically see an A-level in mathematics plus two further ones in further mathematics, the physical sciences, or computing depending on what your school offers.
At interview (for two colleges), I was not asked about programming - the first was purely mathematics based, and the second was primarily maths based with some CS concepts introduced to see how you thought about and were able to manipulate the information given in the interview. It's not something you were necessarily expected to know.
Until this year, the introductory CS modules (making up 25% of the first year's content) were available to natural sciences students, so you would have students from a fairly wide range of backgrounds _successfully_ learning CS and to program (the maths modules, also making up 25%, are still shared, for what it's worth).
That excludes a huge number of students who will have never had the experience to try programming before university.
I never did any programming before my first year, I was planning on going into economics. But then I took an elective in CS and it became my major. I wasn’t alone either, most of the students were not programmers previously.