My outdated Computer Science degree: Was it a waste of time? (calbucci.com)
23 points by rayvega on Feb 18, 2012 | 35 comments



My university's CS curriculum didn't teach us to use any technologies at all, besides extremely basic Unix and C++. If you wanted to build anything outside of GCC then it was up to you. And you still had to figure out how to set up makefiles on your own.

CS isn't about learning technologies. It's about learning the math and science behind computation theory. You don't learn things like Visual Studio 2010 or how to set up an EC2 instance or how to build iPhone apps. You learn how to build a compiler, the data structures behind file systems and operating systems, and networking fundamentals (among many other things, of course) so that you can then learn specific technologies on your own.

If you know why TCP was invented and the core networking stack, and you know what a compiler is and how it does its magic, and when to use a linked list instead of a dictionary, then you have the tools to figure out how to build an iPhone app--or how to use any other computer technology--on your own time.
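To make the linked-list-vs-dictionary point concrete, here's a toy C++ sketch (my own contrived illustration, not from any particular curriculum); the point is that each structure wins on a different operation:

    #include <iostream>
    #include <iterator>
    #include <list>
    #include <string>
    #include <unordered_map>

    int main() {
        // Dictionary (hash map): O(1) average lookup by key.
        std::unordered_map<std::string, int> ports = {{"http", 80}, {"ssh", 22}};
        std::cout << ports["ssh"] << "\n";        // 22

        // Linked list: O(1) insertion at a known position and stable
        // iterators, but O(n) to find anything by value.
        std::list<int> xs = {1, 2, 4};
        xs.insert(std::next(xs.begin(), 2), 3);   // splice 3 in before the 4
        for (int x : xs) std::cout << x << ' ';   // 1 2 3 4
        std::cout << "\n";
    }

Knowing which of those tradeoffs matters for your workload is exactly the kind of judgment the theory gives you.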

Edit: I also want to add that CS does change, and learning networking or something like RDBMS theory might not have been common 10 or 20 years ago. But there are few fields worth learning that don't change in that sort of timespan. Core CS principles do change, but probably not much slower or faster than fields like physics or medicine. You probably wouldn't argue that a physics degree is worthless because 10 years ago they didn't teach the Higgs boson (or whatever, I'm not a physicist :) )


"Computer science is no more about computers than astronomy is about telescopes."

--Dijkstra

I think half of my CS curriculum in college was done as pencil & paper assignments. Logic, algorithmic analysis, and the like. You learn enough "computer stuff" to be able to use it as a vehicle for learning how to think.

We learned enough of Java in CS101 to explain abstract data types, encapsulation, invariants, etc. We learned enough C++ to explore OO design by patterns. We learned enough assembly to explore low level stuff like how a call stack actually works. We learned enough Lisp to explore language concepts. And so forth.

Anything else, well, that was yours to learn in an internship, or on the side with whatever project you wanted to do to explore it. And employability after graduation was pretty much correlated with how willing you were to work harder than what the coursework required.


Sounds like mine too. Also sounds like any decent program. I'm not sure why people expect college to equip you with every skill you'd ever need in a job. College is supposed to give you the ability to learn those skills and point you in the right direction.


Algorithm analysis, graph theory, information theory, higher order logic, number theory, and problem solving (et al) haven't changed much in a while. I realize in the article he points out specific technological stuff but a good CS program uses technology to drive home larger principles and ideas. I don't think CS degrees will be outdated any time soon.


Agreed. It's precisely because his early-'90s education focused on logic and math rather than VB6, Perl, and Gopher that it's still relevant.


The problem is that most people in academic computer science programs really want to learn software engineering. The related problem is that most employers want to hire software engineers rather than computer scientists.


If you could create good software engineers without teaching CS, someone would be getting rich doing that. There are many people who know CS and are not good software engineers, but I have never met a good software engineer who didn't know CS.


"Contrary to other degrees like English, History, Law, Civil Engineering, Biology, Dentistry, Medicine and many others, in Computer Science what you are learning is not necessarily augmenting previous learning, but replacing it."

What a complete load of shit. People in technology love to reinvent the wheel, mostly because they don't know someone has already built that wheel about 8 times. It's very rare to find an idea that is completely new and unique.


Remember, "computer science" is not the same as learning specific technologies. You may not learn the ins and outs a specific codec, but you should understand signal processing and maybe even some coding theory. (This isn't a rebuttal to the author, who clearly understands this.)

Also, at the risk of repeating myself, CS != IT.


Computer Science involves neither Computers nor Science - and that's not a bad thing, at all.

On the contrary, the mathematical, logical, and algorithmic underpinnings generalize to all sorts of issues, which apply at all scales of computation. Algorithms for governance? Traffic planning? Applications in other sciences (e.g. biology, medicine, etc)? Yep - all of these. The concepts are completely agnostic to the computational substrate, and I don't think this is going to become obsolete any time soon.

For what it's worth, I'll unpack my first sentence a little more, which is admittedly inaccurate due to its absoluteness. I did take several CS classes that actually involved computers, but a surprising amount of the learning took place during lecture, and every test I ever took was written on paper.

As for "science," I am really riffing on empiricism. the closest CS gets is writing inductive proofs, but this is very different from the empirical underpinnings of other sciences, which tend to rely on statistical inference instead of induction as the primary tool for discovering "truth." ...which is yet another reason why CS is valuable.


Several years ago I came to the conclusion that Computer Science is basically a narrow discipline of advanced "applied" mathematics, and I have always found that the best way to describe it.


Perhaps in an academic sense, but in terms of actual usage it's one of the biggest, no?


Good job the doctor running my treatment stays up to date with modern techniques instead of bemoaning the age of her qualification.

The degree should teach you the techniques to manage your own learning. It is a piece of paper that certifies you can work by yourself or in groups on non-trivial tasks, with insight and advice from a more experienced peer.

The majority of my lecturers at university were behind on the latest technologies, so how could they teach them?


I graduated from a pretty good CS program in 2007 without having ever written a single line of HTML, CSS, or javascript. I also didn't use a database at any point in my college classes. I know that computer science is supposed to be about theory, but that theory should be taught with relevant technologies.

I learned exactly one thing in college that has helped me as a professional programmer. That is that I enjoy programming. Everything else I learned through internships, freelancing, and side-projects.


I definitely didn't write any HTML, CSS, or javascript in college, but I feel like I use algorithmic analysis stuff from there at least once an hour. Data structure stuff I use a bit less often, because I'm usually just using what I've been given by some framework - but if I have some huge bizarre piece of state that I'm passing around, knowing how to structure it or find what I want in it efficiently comes in handy.

Those things were a bit of a terror to learn in class, whereas the basics of HTML took a day or two, the basics of CSS took a good week or two to completely get my head around, and the worst part about javascript was getting my head around a C-looking language that didn't behave at all like C. All of this was stuff I couldn't help picking up because I had to complete projects for money. Understanding what an O(n log n) algorithm looks like, or how fast something was going to fill up all of my RAM, is not something I would have learned on my own without school, short of years of benchmarking and debugging experience.
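To show what I mean, here's a contrived C++ sketch (my own toy example, not from any class): both functions answer the same question, and only the analysis tells you why one of them dies at scale:

    #include <algorithm>
    #include <iostream>
    #include <vector>

    // O(n^2): compare every pair. Fine for 100 items, hopeless for millions.
    bool has_dup_quadratic(const std::vector<int>& v) {
        for (size_t i = 0; i < v.size(); ++i)
            for (size_t j = i + 1; j < v.size(); ++j)
                if (v[i] == v[j]) return true;
        return false;
    }

    // O(n log n): sort a copy; any duplicates end up adjacent.
    bool has_dup_sorted(std::vector<int> v) {
        std::sort(v.begin(), v.end());
        return std::adjacent_find(v.begin(), v.end()) != v.end();
    }

    int main() {
        std::vector<int> v = {5, 3, 9, 3, 1};
        std::cout << has_dup_quadratic(v) << ' '
                  << has_dup_sorted(v) << "\n";   // 1 1
    }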

edit: I do share the experience of the author of not using any language that I used in college, since college. That's certainly not the fault of the languages, C++ and Java, which are still probably the most widely used outside of the web (well, servlets, but I hated Java). But if it had been a .NET college teaching me how to automate Excel, I might have killed myself, though I have since collected paychecks for doing exactly that.


In a sense I have the same experience as you. I regularly use little bits of the theory we studied in college, except that I only understand that theory because it was re-taught to me by more experienced programmers at my internships/jobs. Nothing I learned in college really stuck because it wasn't taught with any context or application. Computer science was part of the engineering department at my school, so I think it's reasonable for me to expect at least some understanding of how the theory relates to real-world situations.


If you view a computer science degree as teaching you particular technologies (e.g., CORBA, HTTP), then sure, it is inevitable that you'll think of a CS degree as a time-limited paper slip.

This is the very definition of vocational training.

But there is so much more to computer science and technology than vocational training. I hate to say it, but your math classes and algorithm analysis classes? They were some of the most important classes you took. Even classes like compilers are not invalid, even though we don't quite build compilers in the same way the Dragon Book says we should. But some things are the same, and the nature of them does not change as fast as people think. For example, parsing has a number of approaches, but they are all inter-related, and learning about one in depth lets you understand why that approach is not ideal or what kinds of tradeoffs one is making.
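For a flavor of what I mean, the whole idea of recursive descent, one function per grammar rule, fits in a toy C++ sketch like this (contrived, and nothing like a production parser):

    #include <cctype>
    #include <iostream>
    #include <stdexcept>
    #include <string>

    // Grammar: expr   -> term (('+'|'-') term)*
    //          term   -> factor (('*'|'/') factor)*
    //          factor -> NUMBER | '(' expr ')'
    struct Parser {
        std::string src;
        size_t pos = 0;

        char peek() { return pos < src.size() ? src[pos] : '\0'; }
        char next() { return src[pos++]; }

        double factor() {
            if (peek() == '(') {
                next();                              // consume '('
                double v = expr();
                if (next() != ')') throw std::runtime_error("expected ')'");
                return v;
            }
            size_t start = pos;
            while (std::isdigit((unsigned char)peek()) || peek() == '.') next();
            return std::stod(src.substr(start, pos - start));
        }

        double term() {
            double v = factor();
            while (peek() == '*' || peek() == '/') {
                char op = next();
                double rhs = factor();
                v = (op == '*') ? v * rhs : v / rhs;
            }
            return v;
        }

        double expr() {
            double v = term();
            while (peek() == '+' || peek() == '-') {
                char op = next();
                double rhs = term();
                v = (op == '+') ? v + rhs : v - rhs;
            }
            return v;
        }
    };

    int main() {
        Parser p;
        p.src = "2*(3+4)-5";
        std::cout << p.expr() << "\n";               // 9
    }

Once you've built one of these by hand, the tradeoffs against the table-driven approaches in the Dragon Book actually mean something.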

Another example: as much as operating systems have evolved, many aspects of them just... have... not... changed. E.g., IPC in modern Unix systems is pretty much all the same as in 1995 (the first year I coded sockets).
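As a sketch of what I mean (a toy, with error checking omitted), this is the same socket()/bind()/listen()/accept() dance you'd have written in 1995, and it still compiles today:

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main() {
        int srv = socket(AF_INET, SOCK_STREAM, 0);

        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);  // 127.0.0.1
        addr.sin_port = htons(8080);
        bind(srv, (sockaddr*)&addr, sizeof(addr));
        listen(srv, 5);

        int client = accept(srv, nullptr, nullptr);     // blocks for a connection
        const char msg[] = "hello from 1995\n";
        write(client, msg, sizeof(msg) - 1);
        close(client);
        close(srv);
    }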

This is pretty much the general education vs vocational argument all over again.


I recall a friend 20 years ago talking about his dad getting a PhD in vacuum technology. Sometimes I feel like that today. I still don't think my degree was wasted.

- A CS degree in any era signals persistence and an ability to solve non-BS problems that have a distinct right answer.

- Modern technology is so much fun, I am happy to learn it.

- Google and StackOverflow have made learning specific technologies so much easier. No more staying up until 2 with no resources.

- The earlier you get into the field, the more likely you will know things at the bottom - memory management, database tuning, etc.

This isn't meant to imply that CS is easy now. There is a larger body of knowledge to master, with many moving parts. Just that an old degree isn't so worthless. (Neither is a 20-year-old physics degree.)


The theory hasn't gotten outdated. Most of the software world is still built off of concepts from the 1960s and 1970s. If your computer science degree did not teach the concepts behind the tools, then yes, it was a waste of time.


Investing yourself in the topsoil of shiny new technologies instead of sending roots down deep will set you up for rapid obsolescence. My CS program taught the latter, and it's served me well for almost a decade.


Education is for acquiring abilities, not just knowledge. Seeing that you have kept up with the current technology stacks with ease, I'd say that CS degree has served you well.


Hate to break the news to you, but school is not about teaching you every concept past, present, or future. School is about teaching you how to learn, dissect, and digest concepts on your own. The principles are there and you have learned them; it is up to you to take those skills and apply them to your domain.

I went to undergrad in the same time frame and credit my schooling for the fundamentals. Now go build your house on that foundation and never stop building!


It's not actually generally meant to be vocational training... Or, at least, if it is, it shouldn't be called 'computer science'.


It seems that it was a waste of time in this particular case, as he clearly did not understand what CS is about.


"MVC ... only started getting more attention after Ruby on Rails launched around 2004"

Around 2000, MVC was all the rage in J2EE development (Struts, etc.).

It was invented long before that (Smalltalk, late 1970s) and popular in other environments as well.


Is there a comparable struggle between academic teachings and practical application in engineering? Do EEs, ChemEs, or MechEs have these discussions? I would think this hits them too.


>At the time, the Web didn’t exist. It was invented in 1993

Uh, no.


It's reasonable to say that the web caught on outside CERN in 1993, with the release of Mosaic.

(Hands up, anyone else who used to subscribe to the NCSA "What's new on the web" newsletter in 1993 and visit all the new web servers ... before 10am, every morning?)


Yup, and also scrub every new USENET post by 10:10.

That stupid Yanoff List wrecked everything. =)


In 1993, the newsgroups were way too much to read even if that was all a person did, so you must mean using a program to scan the groups for URLs, right?


Google has a fascinating timeline of Usenet news:

(http://www.google.com/googlegroups/archive_announce_20.html)

Even ignoring binary groups a Usenet feed would have been considerable in 1993.

(http://en.wikipedia.org/wiki/Usenet#Usenet_traffic_changes)


Well nobody read every newsgroup, did they?

I had a core list of 4-5 newsgroups that I would check once daily. On a busy day there would be maybe 20-30 new messages per group.

After Eternal September some of those groups went to hundreds daily and got to be too much to cope with in a day.


Quite. I remember stumbling on a text interface to the WWW on ITU's Gopher server. I remember thinking: what a cool idea, they've implemented Bush's Memex. But a text-mode VT100 emulation isn't a terribly good interface.

This must have been '92, or possibly '91.


Is a music degree 'outdated' if one doesn't learn to play the piano parts to Adele hits, or create synth patches like Skrillex?


Would you say any of your CS degree was up to date?



