
"Maybe it’s a crazy idea, but if you’re going to spend all that money for a college education, shouldn’t you expect to learn real-world skills from people who know what they’re doing?"

Maybe it's a crazy idea, but universities are not, and were never intended to be, career training centers. Helping you get a job is just not the mission of these institutions. And it's just as well; it would be highly unfortunate for the academic field of computer science if the undergraduate curriculum focused on how to use current tools like Eclipse or Ruby on Rails or Angular.js. Universities are designed to give you a firm grounding in the fundamental concepts, which should give you the ability to understand whatever tools are in fashion in the industry with relative ease. Besides, if you're not doing any hacking in your free time outside of class anyway, you're probably not cut out for this career.




"Maybe it's a crazy idea, but universities are not, and were never intended to be, career training centers."

Maybe it's a crazy idea, but they are equally not intended to be places where you go and learn nothing of value to anybody.

I mean, I get the whole academic focus thing too, but the described education wasn't focused on academic computer science either. It was simply useless.

Moreover, the idea that academics can be so strictly partitioned from practice is a rather silly one as well, at least in the computer science world. A program that claimed to be turning out really good computer scientists who, mysteriously, are incapable of actually driving a computer to do anything useful to anybody (academic or industry) would not be one that I would be paying much attention to. Even computer science academics are expected to be able to write an actual program every once in a while.

And I certainly won't believe anyone who claims to be really experienced at, say, multithreaded programming because they sat through a couple of classes and did a couple of homework assignments that didn't involve coding anything. It's really easy to think you know something, until you actually try to do it and end up spending 15 hours debugging a terrible race condition. Which turns out to be something really, really simple in the end, and the real problem was that your "academic" knowledge still needed to be hardened with real-world experience.
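To illustrate the kind of bug I mean, here's a minimal sketch in Java (the class and names are mine, purely illustrative): two threads bump a shared counter with no synchronization, and updates quietly get lost.

    // Lost-update race: counter++ is read-modify-write, so the two threads'
    // increments interleave and the final total typically comes up short
    // of the expected 2,000,000.
    public class RaceDemo {
        static int counter = 0; // shared, unsynchronized state

        public static void main(String[] args) throws InterruptedException {
            Runnable work = () -> {
                for (int i = 0; i < 1_000_000; i++) {
                    counter++; // not atomic: load, add, store
                }
            };
            Thread a = new Thread(work);
            Thread b = new Thread(work);
            a.start();
            b.start();
            a.join();
            b.join();
            System.out.println("counter = " + counter);
        }
    }

The fix is something as small as a synchronized block or an AtomicInteger, but you only really internalize why after you've watched it fail.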

And I'd observe that this connects fully back to academia... it's the people who have actually tried things and used things and have some sort of real experience who will be pushing the boundaries forward, not some sort of semi-mystical "pure academics".


The problem is, you are thinking Computer Science == Computer Programming. You are wrong. Computer Science is more math than programming; if you want to learn marketable skills, get a different computer-related degree. Most colleges offer a more practically oriented degree.


"The problem is, you are thinking Computer Science == Computer Programming."

No, I'm not.

I'm saying you can't function as a computer scientist with 0 programming, and trying to train Computer Scientists in general while somehow shielding or protecting them from programming is something I'd look at very disfavorably. Even computer scientists need to program once in a while. A Computer Science program is doing a disservice to its "Computer Scientists" if it somehow manages to fail to teach them anything about programming.

And to be honest, I find it utterly, mindblowingly absurd that anybody could take the alternative seriously. Think about it a bit.

In fact I'd go so far as to say that one of the several major problems facing the University system as a whole right now is this whole idea that theoreticians across all disciplines can somehow be fully severed from all practical considerations, as a result of too many people mindlessly chanting that Universities aren't vocational programs. A theoretician operating without any input from the practical world is a theoretician wasting his or her time, producing theories that will ultimately not even be of academic interest.

(There are very, very specific disciplines in computer science that might be able to get away with 0 programming, BUT the generic undergrad courses would not be able to be that focused. The generic undergrad courses need to be teaching some useful programming stuff.)

"Most colleges offer a more practically oriented degree."

You might want to check that. "Software engineering" or something is actually fairly rare as a distinct major, and while there will often be an "information systems" major of some sort (which it sounds like is what this guy actually took), that isn't so much "more practical" as "a reduced version" of the courses, often reduced to the point that you won't learn "programming" at all.


"You might want to check that."

You are correct. I should have said "Most colleges I considered when starting my degree".

Look, we might have to agree to disagree. What I think, and what the one Computer Science Department Head I spoke to thought, is that Computer Science is math. Period. Just like a degree in Math, it does not have to have a directly practical application to serve a purpose. The fact that we have machines that run these theories is purely a sideline to the theories themselves.

"A theoretician operating without any input from the practical world is a theoretician wasting his or her time, producing theories that will ultimately not even be of academic interest."

I'm sorry, but many, many theories that have become helpful for practical purposes were created by someone hundreds of years earlier who didn't have any "practical considerations". Just because something isn't useful NOW, or even in our lifetime, doesn't mean it isn't useful.

Quite frankly, I think programming (distinct from computer science) should go back to an apprenticeship-based program if we can make that sustainable, and computer science (as in the degree program) should be kept where it is: mostly theoretical, with a minor side of practicality occasionally.

Sure, programmers sometimes need to understand Computer Science theory, but most of what the average programmer needs to know about it does not require a degree, just reading a book or three.


"The fact that we have machines that run these theories is purely a sideline to the theories themselves."

I'm not sure you got my point entirely. If you've got a machine sitting there that can help you, you should use it.

If you're going to write about operating systems, you're an academic fool if you don't also actually take some time to gain some real experience on the very-cheap systems that can help validate that you actually do understand what you're learning about.

If you're going to do AI, you're an academic fool if you don't take the time to implement some of your theories and see how they actually perform. Who cares what someone hypothesizes may work in the field of computer vision if they have no implementation?

And so on and so forth. Even if it is just math, you are an idiot if you pass up the opportunity to implement something to verify your theories.

And the result will be courses, especially at the undergraduate level, that turn out to be useful both for academics, and for programmers who want to learn how to write theoretically grounded code. Which, remarkably, is exactly what we actually have.

You are not benefiting by trying to split academics off from practice. It's a stupid idea and a stupid ideal. It isn't a win for anybody.


And you are a fool if you dismiss people or ideas simply because you don't feel they are practical. Sure, don't try to use them in your business. However, many, many very useful and practical things have come out of these impractical theories and from people who never try to practically prove or apply their theories. It is a win for everybody to have at least some subset of people who do pure theory.

That being said, as already stated, you shouldn't be going into these types of programs if your end goal is to get a job in the industry.


"Computer science" tends to be a pretty ambiguously-defined term, largely because it's a pretty awful term. Everything from AS/400 business programming to combinatorial mathematics could be plausibly lumped under "computer science".

My anecdotal experience seems to match jerf's: I can't think of any undergraduate computer science program that appeared to favor math over at least moderately practical topics. But I've also not seen any undergraduate computer science program that appeared to be job training.

A local community college offers an associate's degree in "computer programming", and their graduates strike me as more obviously employable in the real world than area university computer science graduates. They learn skills that are known to be desired by major area employers, and the program seems to carry more of a "this is all about getting a job" sort of attitude.


I think you're exaggerating how impractical Computer Science curricula are. CSE courses at major universities generally include courses on everything from Systems Programming, to Compiler Construction, to Data Structures and Algorithms, to OOP design, to Database Systems, to Graphics Programming, to Web Programming, etc etc. Each one of these will (at least should!) have several homework assignments and a final project that all involve submitting working code.

I'm not saying that CSE degree programs shouldn't teach you how to write code or require you to write code. I'm just saying that they are not vocational training and therefore should not focus on training students to use whatever specific tools are popular in industry.

Contrast that with community college certificate programs where they offer courses like "C# Programming in Visual Studio" and "iOS Application Development." Those programs' mission is to get you a job, and they also cost much less time and money than a bachelor's degree.


It depends on the style of the college. Computer Science is certainly much more than math. I'd sooner employ a CS major who took foreign languages than one who supplemented his coursework with tons of math classes. Those special/practical degrees that schools offer are just fluff and end up pigeonholing you. If you're going to be a software developer, then computer science is the degree that's most helpful.


I'm not an expert on all colleges in America; it sounds like I came across as thinking that way, so I apologize. Right or not, the Computer Science curriculum at the university I was enrolling in (I ended up dropping out of college due to life issues) was 60% math, 30% computer theory, and 10% programming. I know, because I talked with the Department Chair and actually studied what the degree would do for me before deciding it was my path. I was fully aware I'd need to work on practical projects in my free time to get a job.

Instead, I just got a job by showing off said practical projects with no degree.


Computer science and real-world programming work are a mishmash of concepts and practice, and everything in between. Most of the hot research these days seems to be coming from places like Google rather than universities. Certainly the stuff I hear about.


That's how it works. Generally with theoretical research, especially in math-heavy disciplines like computer science, the distance in time between someone coming up with a theory and someone else finding a practical application for it is large enough that most people don't even put the two together, unless it was a pretty major deal in academic circles when it was theorized. On the other hand, the things we in the industry care about are not going to come from universities (at least not recently from universities) but rather from companies like Google.


You articulated perfectly exactly what I was thinking.

I realize it is a popular idea in society - that college is a career training center - but that is patently false.

It is an unfortunate idea too, because it leads to lots of people wasting thousands of dollars on an education they won't ever use in their career just for the sake of having the piece of paper they get at the end.

> Besides, if you're not doing any hacking in your free time outside of class anyway, you're probably not cut out for this career.

Bingo.


It'd be really interesting to go back and see when people started to shift towards colleges being a training ground and not the typical apprenticeship path that was the real training for generations. The way you learned to be an electrician was by being someone's apprentice, then working your way up the ranks, and this went for nearly all trade industries that I can think of. This statement has come to sound incredibly pompous (and I genuinely don't mean for it to), but college isn't for everyone, and it shouldn't be the only road to a successful life and career.


Germany is still strong on the apprentice model, but even here there's a shift towards universities.

Now companies face an apprentice-shortage and they started actively recruiting university drop-outs to fill the ranks. At least it's not an expensive experiment for those kids, given that tuition costs max out at about 1000 EUR/year.


German universities were meant to teach things that would be useful later on, though. They were not intended to provide the "well-rounded experience" people talk about these days, detached from any usefulness to one's chosen career.


Look at how many job ads, especially in software, require a degree in CS. To be fair, there are more technology-related courses which are more vocational and not pure CS compared to 20 years ago, but CS still seems to be the most common name for a university course teaching computer concepts.


Having a degree means you can be trusted to have a thorough education on the subject, without having to subject you to extensive testing therein.


You forgot the sarcasm tag.

Having a degree means that you have a degree. Sadly, it absolutely does not mean that you have any particular level of education. Trust, but verify: I believe that a job candidate has 10 years' experience and a master's in comp sci, but I'm still going to make them code FizzBuzz before we move on with the rest of the interview.
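(For anyone unfamiliar, FizzBuzz is roughly the following, sketched here in Java: print 1 through 100, except multiples of 3 print "Fizz", multiples of 5 print "Buzz", and multiples of both print "FizzBuzz". It filters out a surprising number of candidates.)

    public class FizzBuzz {
        public static void main(String[] args) {
            for (int i = 1; i <= 100; i++) {
                if (i % 15 == 0)      System.out.println("FizzBuzz");
                else if (i % 3 == 0)  System.out.println("Fizz");
                else if (i % 5 == 0)  System.out.println("Buzz");
                else                  System.out.println(i);
            }
        }
    }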


I think universities could do more to educate the public on this, but it isn't really in their best interests, at least in the short-term. And there is a lot of political pressure for public institutions to be career training centers, since nothing else is really filling that role.


"Besides, if you're not doing any hacking in your free time outside of class anyway, you're probably not cut out for this career."

I really dislike this mentality, and it unfortunately seems to be expressed by a lot of engineers (at least ones with an internet presence). Just because you don't spend all of your time hacking away doesn't mean you aren't cut out for this career, nor does it mean you can't be a great engineer. This kind of mindset is only going to deter potentially good engineers from entering the field.


Can you imagine "if you are studying accounting and want a job as an accountant, you'd better be doing your friends' taxes pro bono in your free time. Otherwise you're just not cut out for this career."

I'm certainly sympathetic to the idea that if you want to be a developer, it's helpful for programming to be an area of passion. But it's not at all obvious to me why that passion must be expressed in a certain way.

I really like math and computer science, and once I finish grad school, the plan is to have a job in applied math or statistics. Maybe even software. But in college, I didn't spend my free time programming, nor doing regressions just for kicks. In four years of university I did one personal project and two hackathons...total time invested, maybe 3 weekends? But let us not assume this means I'm not passionate about CS and math: I was a double major, I was a teaching assistant for 8 courses, I worked through the school as a private tutor for struggling students, I did research with professors, I published...and on my own time, I relaxed.


> Can you imagine "if you are studying art and want a job as an artist, you'd better be doing drawing and uploading to deviantART in your free time. Otherwise you're just not cut out for this career."

Yes. Yes, I can absolutely imagine judging someone in a partially-artistic field on whether they enjoy making art on their own time. That doesn't apply in every field, sure, and I bet chip designers don't generally go home and make little 8088 clones for fun. But programming is still as much of a craft as an engineering discipline, and I'm suspicious of people who claim aspirations in the field without demonstrating that they actually enjoy it.

In interviews, I ask what projects a candidate has written on their own time. It doesn't have to be on GitHub. It doesn't have to be big. It doesn't have to be generally useful. One of the best responses I got was from a guy who'd been developing his own duck hunting journal for the last decade to track location, weather, etc. I'm not surprised to see that he's the lead engineer at his company now.


I absolutely agree that when hiring an artist, you want that person to be excited about art. I suppose in a nutshell, my thought is: all time is your own time. Choosing to pursue a career in <art | programming | etc> is a monumental investment of your own time into <X>. We ought not judge as terribly different those who spend ~50 hours a week of their own time doing X and getting paid for it, and those who spend ~50 hours per week of their own time doing X and getting paid for it and also spend 2 hours per week doing X on the side.

This is particularly true of developers: generally speaking, someone who has the technical ability to be a strong developer could make a good living in finance, accounting, consulting, etc. The very fact that they're applying for a software job rather than something else is a powerful signal that they'd rather spend half their waking hours on software than on something else.

...or am I missing something obvious? - wouldn't be the first time.


I don't disagree with you in general, but we're in the context of people starting new careers. A professional <artist | programmer | etc> will have a portfolio of on-the-job work they've done. A recent grad only has the fact that they've graduated and not anything to really demonstrate their aptitude or abilities. I'd be skeptical of such a student who'd never branched out to do something fun during their entire learning time.

But back to the career professional context. I think maybe I've identified the disconnect. Could it be that asking about personal projects is a way of identifying not just a competent developer but an actual geek? Those are different roles with different requirements. For example, you probably don't want to unleash a geek on your legacy business logic maintenance project. They'll probably be bored and end up breaking stuff in the name of optimization or cleanup. Similarly, you don't want "just" a programmer in your R&D team where you genuinely want and need creative innovation.

So maybe both sides are correct: for some jobs, it's completely appropriate to expect personal projects. For others, it shouldn't be expected at all. What do you think?


If you are serious about getting a job, you damn well better be hacking in your free time if you don't currently have a job in the field. The OP didn't specifically say, but at least for me, if you are in school or unemployed and looking for a job, you'd better be hacking in your free time if you expect me to hire you.


If you're not doing X in your free time outside of class anyway, you're probably not cut out for a career in X.

You should be pursuing a degree because it's a thorough & facilitated process of learning what you would be learning on your own anyway. You should be there because it's the fastest, most efficient way to get to the head of the pack of people who do have a tenacious passion for X and do spend a significant part of their free time on X. Every minute you're not doing something in X, you're not improving therein.

Do you think you're going to have a great career in X just by showing up for class and doing what's assigned? Do you think that will make for a good practitioner of X compared to others who live and breathe the subject?

That you don't spend all your time on X doesn't mean you aren't cut out for this career and can't be great. The lead comment was "if you're not doing any X in your free time", not "if you're not spending all your free time on X". Don't wantonly construe the former as the latter.

But fact is, if you're not going above and beyond, you won't get above and beyond.


Well said. The interesting thing for me is that we have too many young people who aren't actually learning; they are ticking off objectives like some giant RPG quest engine. Go to school, check; go to college, check; get job... uh, where?

When treated as an objective rather than a place to learn, college becomes something to get "through" rather than some place to develop life skills.


"universities are not, and were never intended to be, career training centers"

The professional vocational degrees like engineering, medicine and law are intended exactly for career training. They are usually certified by practitioner guilds, for example.

You are quite right about computer science, but that is not a vocational professional program.

The OP studied engineering.

(edit: reading down a bit, someone spotted that OP studied Management Information Systems, which may or may not be engineering as generally understood.)


I can't speak for law school.

Medical school is absolutely not career training. It simply gives you the knowledge and vocabulary so that you can be trained to be a doctor. The additional training, of course, happens in your residency.

You are utterly unprepared to truly be a physician when you graduate medical school, even if you have earned the right to be called 'doctor.'


An introductory vocational professional program is still a vocational professional program.


"Maybe it's a crazy idea, but universities are not, and were never intended to be, career training centers. Helping you get a job is just not the mission of these institutions."

Very true. If you want job training, go to a technical/vocational school, or enroll in a certification program. Those are both perfectly legitimate things to do, and there should be no shame in either.


The problem highlighted by the article seems to be that companies are no longer so interested in hiring raw material and then providing training in Angular.js or whatever, but expect to hire people who already have those skills.

Anecdotally, I'd say that of my CS class maybe only 5% did any serious programming work outside of class.


All of programming can be self-taught. The problem with finding people who have "skills" already is that they might only be able to use libraries and frameworks to build things, but they can't really solve problems. Can they figure out the solution to a memory management problem when throwing hardware at it isn't an option? Writing an application that doesn't fall down is par for the course these days. So knowing how to use some tools doesn't cut it. The companies hiring these candidates are digging themselves a hole, or they are just satisfied with having shallow technical requirements.

(I've rarely programmed outside of class/work since college. I've never had a problem getting a job. I just don't like sitting in front of a computer all the time.)


> Universities are designed to give you a firm grounding in the fundamental concepts, which should give you the ability to understand whatever tools are in fashion in the industry with relative ease.

One of the things that continually frustrates me about people's attitudes toward higher education (at least in the U.S.) is that they don't recognize this fact. An undergraduate university education should give you a general competence, self-motivation, and curiosity. More particular skills can be learned elsewhere; if you spend your time in college learning Java instead of learning to be a well-rounded, thinking, educated person, you're making an expensive mistake. Particular skills go stale quickly, and, unlike other subjects you could study instead, they often lack intrinsic interest.

Companies used to hire people with general competence and then train them in the particular skills they need on the job. I'm not sure why people now seem to expect students to learn these skills at universities, at their own expense. It seems to me that the burden of this training should be on industry, not on universities and their graduates.


I am very sympathetic to this point of view: my undergraduate degree prepared me very well for nothing in particular.

However, if you go back to the medieval roots of the universities, you find the Parisian model, which prepared one to be a cleric or a teacher of the same, and the Bologna model, where you learned law. Think of the account of the founding of Harvard: "One of the next things we longed for, and looked after was to advance Learning and perpetuate it to Posterity; dreading to leave an illiterate Ministery to the Churches, when our present Ministers shall lie in the Dust".


Agreed. I think a better solution to the article author's problem would be to treat 'programming' education (which is not the same thing as computer science) as a skilled trade like carpentry or plumbing, and intersperse community college courses with a formalized apprenticeship program ... it might take a while to get some traction, but it seems like it would work as a way to train programmers while not gutting our ability to do blue-sky theoretical research in computer science.



