Here is my latest life hack. Pick a major at MIT and look at its degree program to form a basic knowledge graph of the field. Then find the courses on https://ocw.mit.edu/ and study on your own. Usually, taking 1-2 classes gives you enough insight into a topic to at least collaborate better with the experts in that area in a team environment.
Erik Demaine is one of my favourite lecturers at MIT. It's always a pleasure to listen to him explain things. My favourite explanation/analogy of his is the nondeterministic Turing machine: essentially, a randomized algorithm that always happens to make the right choice.
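For anyone who hasn't seen the analogy, here's a toy sketch (mine, not Demaine's) of what "always makes the right choice" means. The nondeterministic "guess" can be simulated deterministically by just trying every branch, e.g. for subset sum:

    # Toy illustration of the "lucky guesser" analogy: the machine "guesses"
    # whether to include each number; we simulate that guess by exploring
    # both branches (which is why the simulation takes exponential time).
    def subset_sum(nums, target):
        def accept(i, remaining):
            if remaining == 0:
                return True          # the lucky guess paid off
            if i == len(nums):
                return False
            # nondeterministic choice: take nums[i] or skip it
            return accept(i + 1, remaining - nums[i]) or accept(i + 1, remaining)
        return accept(0, target)

    print(subset_sum([3, 34, 4, 12, 5, 2], 9))  # True (4 + 5)

The nondeterministic machine only ever takes the successful branch; the deterministic simulation pays for that luck by exploring all of them.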
Any undergrad data structures course would do. They are pretty much the same: starting with sorting and lists, then trees, and then graphs, with some introduction to asymptotic complexity. Of course there will be some variations, but the core stays the same. That's also why this Advanced Data Structures course is so valuable. It really goes beyond the popular algorithm books/courses, expanding our understanding of data structures and abstractions in general.
This class, and many of Erik Demaine's other classes, are taught in a "reverse lecture" style: the lecture videos are recorded once and released online, students watch them beforehand, and then they come to the classroom to solve (often open) research problems collaboratively.
I believe the video you were looking at was recorded during the 2010 instance of the class, and then released in 2012 for the next iteration of it.
My high school had a whole push for teachers to try flipped classes. I found them really nice: I could skip through the video when it repeated something I'd already got, or re-watch something that didn't make sense. Despite this, the general sentiment at my school seemed to be that they were terrible. I heard complaints about not being able to ask questions immediately, or feeling disengaged compared to being taught live.
All in all, I wish my college did them; I struggle to sit through lectures.
> Most course material is covered in video lectures recorded in 2010 (already watched by over 350,000 people), which you can conveniently play at faster speed than real time. There may also be some new material presented by the professor and/or guest lecturers, which will be recorded for asynchronous viewing.
This is somewhat off-topic, but I am not going to look into this stuff because, frankly, there's no call for it. Two major problems with advanced anything in IT:
1) business goals basically never require sophisticated software (I am lamenting this; it comes down to most decision-makers' lack of imagination or understanding of what's possible) and, consequently,
2) engineers love introducing unneeded complexity, maybe to exercise the mastery that decision-makers stifle in us. (I'm lamenting this, too. It's irritating and wasteful; if we right-sized work to make full use of people's abilities, they wouldn't burn out or fall back on maladaptive coping strategies like overengineering.)
I find this incredibly shortsighted, and it can hinder one's career. Case in point: consistent hashing was once "advanced", yet now it is a cornerstone and common knowledge in building systems. Paxos was considered advanced and "never required", yet it became a standard building block once Google told the world how Chubby works. Compressed sensing was a research topic only a few years ago, yet now it is used in advanced image processing (and yes, the industry does need "advanced" use cases). If you follow SIGGRAPH, many of the so-called advanced theories are adopted in games and CGI systems. You thought engineers could produce photorealistic waves without understanding large-scale Navier-Stokes equations? You thought we could enjoy realistic clouds, smoke, and fire in games without engineers studying particle systems? You thought engineers do not need to study complex optimization to build airline booking systems? Even for the traditional "IT", you thought SAP didn't require advanced data structures and algorithms for their logistics systems? You thought IBM didn't need algorithms and maths to help their customers optimize their warehouse management systems or help the military track their lost weapons, including tanks?
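(To make the consistent-hashing example concrete, here's a minimal, illustrative ring in Python. It's a toy sketch of my own, not from the course; real systems add virtual nodes, replication, and stronger hash functions.)

    # Toy consistent-hashing ring: each key goes to the first node at or
    # after its hash position on the circle, wrapping around at the end.
    import bisect
    import hashlib

    def _hash(s: str) -> int:
        return int(hashlib.md5(s.encode()).hexdigest(), 16)

    class Ring:
        def __init__(self, nodes):
            points = sorted((_hash(n), n) for n in nodes)
            self._hashes = [h for h, _ in points]
            self._nodes = [n for _, n in points]

        def node_for(self, key: str) -> str:
            i = bisect.bisect(self._hashes, _hash(key)) % len(self._nodes)
            return self._nodes[i]

    ring = Ring(["cache-a", "cache-b", "cache-c"])
    print(ring.node_for("user:42"))

The payoff is that adding or removing one node only moves the keys on the arc next to it, which is exactly how the idea went from "advanced paper" to everyday infrastructure.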
Seriously, the list could go on for days if not weeks. And a personal lesson: I started to learn about deep learning and large-scale machine learning back in 2010, just as they were picking up steam in the Bay Area. Yet I thought that level of knowledge was "too advanced" for a systems engineer, so I kept deferring diving deep into the area. Boy, was I wrong. Hopefully the new generation of engineers will keep their curious eyes more open than I did.
> you thought SAP didn't require advanced data structures and algorithms for their logistics systems? You thought IBM didn't need algorithms and maths
You've read a lot into my words that I didn't say, and you're responding awfully callously to a lament. I am an engineer. I enjoy advancing my craft. I wish I could have stayed curious, instead of burning out running up against business-management boundaries.
I would certainly never say "IBM didn't need algorithms," and it's disingenuous of you to pin such a strawman argument on me. All I am saying is that there's a lamentably low threshold beyond which we have too few opportunities to implement sophisticated or novel technology.
My God, what does it benefit you to accuse me of eschewing all of "maths"? Absurd.
My apologies. I just wanted to give examples showing that those data structures can be useful for ordinary engineers like me. By "you" I meant anyone, not you specifically. I certainly didn't mean to put words in your mouth. I guess that was really bad writing on my part, for which I'm sorry.