I didn't finish my degree, but I can tell you that "a year or two" is nowhere near enough to "learn CS on (one's) own". On a daily basis, I use the foundations that I did learn in a formal environment much more than any of my self-taught skills. It's a very complex subject that cannot be effectively navigated without some guidance. Otherwise, you may think you know it, but you're just skimming the surface (and likely doing so in a very inefficient way). Now, if you've been self-taught for a decade or more, that's a different story.
I don't agree. Most people simply don't work very hard. For a motivated person, a year of their own time can easily be enough to replicate a bachelor's in CS. When I did my bachelor's, I was a young, dumb kid who partied a lot and wasted a lot of time.
Now, 14 years later and with tons of industry experience, I realize how trivial the material in my CS bachelor's was, and really the big problem is that most students don't work that hard. My work ethic is much, much better now, and I've been learning far more advanced material, far faster than I ever did before, because I come home in the evenings and actually study (using online options) and practice.
When I look back at college-level CS, it's a joke. Also, today's online resources are actually built to teach well; when I was in school, there was no Khan Academy, Pluralsight, Coursera, Udemy, etc. Most college professors are research-oriented at heart and are uniformly terrible teachers. The online MOOCs and so on are focused on actually making things easy to learn, which gives modern learners a massive advantage.
> I realize how trivial the material in my CS bachelor's was, and really the big problem is that most students don't work that hard.
As someone who dropped out, worked in industry for a few years, and is now back as an undergrad (though in math, not CS), one thing that I feel is neglected in these discussions is the time-sucking effect of homework.
When I was self-teaching as a dev, I could learn something, play around with it until I felt I had a good grasp, and move on. A college course ties a lot of work to each concept. I have solved way too many matrices by hand in the last few weeks, for instance.
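For contrast, the kind of row reduction I'm grinding through by hand collapses to a couple of library calls. The 3x3 system below is a made-up example, solved here with NumPy rather than on paper:

```python
import numpy as np

# A made-up 3x3 system Ax = b, the sort usually row-reduced by hand:
#   2x + y -  z =   8
#  -3x - y + 2z = -11
#  -2x + y + 2z =  -3
A = np.array([[2.0, 1.0, -1.0],
              [-3.0, -1.0, 2.0],
              [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])

# LU-based solver does the elimination for us; solution is x=2, y=3, z=-1
x = np.linalg.solve(A, b)
print(x)
```

The point isn't that doing a few by hand is worthless, just that the fiftieth repetition teaches nothing the fifth didn't.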
This, so much this. Homework is an insane waste of time. Some people might need that level of tedious repetition to learn a concept, but generally it just detracts from actual learning.
Funny to see someone else write almost exactly what I feel. I fucked around in college. I almost failed my first CS class, and failed a few others. A few years later I went back with the help of free online classes and found it a lot easier.
There's not even that much CS in a CS degree. Half the classes are gen ed. There's a lot of irrelevant math.
A year or two is enough time to get your foot in the door with web dev. For more advanced stuff, sure, it's an ongoing process.
On the other hand, I question the utility of most university CS curricula. There are usually only one or two classes covering the core "algorithms" knowledge that autodidacts might not have, and given how many people squeak through those classes by copying homework, I'm not convinced that anything really sticks.
I agree on your first point, but I can't speak to the modern curriculum or how it's tackled, since it's been over a decade since I was in uni. I do know that web dev is only a minuscule subset of CS. A typical web dev may be able to tell me which algos are inefficient for a given task, but I can tell them why.
The algos you're talking about (BSc in CS) are trivial. I went to a research-focused college for CS, and at the bachelor's level the algorithms you learn are not very complicated; they're only made hard by the students themselves not actually doing any work OR by terrible teaching. In fact, I'm fairly sure I could teach my 70-year-old dad the various tree-based algorithms, Dijkstra's, A* search, etc., and he doesn't even know how to program. The intuition behind these things isn't hard to grasp.
In fact, I think I could take any random person off the street, and provided they're motivated, teach them algorithms and data structures to BSc level within 3 months. It's not rocket science and any engineer who makes it out to be like "way hard" is just being a bit insecure over their knowledge.
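To make the point concrete: the whole of Dijkstra's fits in roughly a dozen lines of Python. This is a standard heap-based sketch, and the toy graph is made up purely for illustration:

```python
import heapq

def dijkstra(graph, start):
    """Shortest distances from start; graph maps node -> [(neighbor, weight)]."""
    dist = {start: 0}
    heap = [(0, start)]  # (distance-so-far, node)
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale entry; a shorter path was already found
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

# Toy graph: a->b costs 1, a->c costs 4, b->c costs 2
g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```

The only "idea" in it is greedily settling the nearest unsettled node, which is exactly the kind of intuition you can explain to a non-programmer.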
I had some prior experience as a hobbyist, mostly messing with BASIC as a kid, but with no formal structure. I majored in music and, after a bunch of wasted years in dead-end jobs, became a professional programmer after only 3 months of self-study while on unemployment. So depending on your background, passion, and luck, a year or two can be plenty. There are several other stories floating around out there, like the blacksmith who became a Rails web dev after what I think was one year of study.
My experience with a BS in CS was that 60% of the classes provided no career value at all. There are all the general-ed classes meant to make you well rounded, and then there are classes in the core that are either useless (calculus, which very few CS grads will ever use) or that provide history/background about the field that is nice to know, but not generally useful or necessary.
Out of the remaining 40%, the level of overlap and repetition was just absurd. It seemed like each class spent the first half of the semester re-covering things that were covered in prerequisite classes. So maybe 30% of your actual class time is spent on new concepts, skills, etc.
Of those, probably 1/3 involve very basic things, like bubble sorts and the like, which help you build basic problem solving skills and language familiarity, but not much else. Then you get another 1/3 of CS related stuff, like architectures, patterns, SDLC, etc, which is probably the stuff that separates the degree holders from the self-taught in most cases. Then, if you're lucky, you get a couple of classes that are elective and teach something moderately interesting or useful. As I recall, the only classes I had in that category were a class on Computer Graphics (which ended up just being a bit of Java Swing and a bit of math to explain some of it), Web App Engineering (neat class using ASP.NET MVC), and an AI class that really just covered some solvers and search algos, topping out with Genetic Algorithms and Simulated Annealing.
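To give a sense of that first third, bubble sort, the classic "basic problem solving plus language familiarity" exercise, is something like:

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order pairs; stop when a pass makes no swaps."""
    items = list(items)  # work on a copy
    for end in range(len(items) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:
            break  # already sorted; no need for further passes
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```

It's a fine first exercise, but a full semester of variations on it doesn't add much.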
Point being, I could definitely see someone being able to learn all the same CS stuff in a year, if they were self-motivated enough. A few good programming books, a few good CS concepts books, and you'd be covered. Honestly, you could probably cover CS with both more depth and breadth in a year than I got in my degree, if you put a full-time effort into it.
And all that is ignoring the fact that half my class graduated with virtually no ability to actually sit and write code of their own beyond just copy-pasting snippets from online examples until they had something that worked well enough to pass. The only real value of a CS degree, as far as I can tell, is to signal to employers that you can show up and put in at least a minimal effort at something for years without giving up. I guess that makes for a useful enough filter to keep employers using it and keep colleges in business.