Does anyone have the book? I've looked through the ToC on Amazon, and there are a few topics that interest me; it also seems to go more in-depth than these lectures. But as it's from 1997 (and doesn't appear to have been updated), I'm concerned it will be a bit out of date.
I read this cover to cover for my ML course at Imperial College London in the UK. It's not an easy read, but going over the same topics a few times did help me understand the fundamentals better. AbeBooks sometimes has it for around £20 (~$30). The exercises were a bit tricky, as the answers often weren't attainable by simply following the book, so I had to consult other material.
I used this book in my Machine Learning course last spring at Georgia Tech. I wouldn't consider it out of date. It's missing a few topics we covered, such as SVMs, but otherwise it's a good introduction.
It's still a good introduction to the principles: how problems like regression, classification, and reinforcement learning are defined; concepts like overfitting, bias-variance tradeoff, etc.; some general classes of algorithms and how to analyze them.
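For example, the overfitting/bias-variance idea the book walks through is easy to see in a toy sketch; this is just numpy with data I made up, not anything taken from the book:

    # Fit polynomials of increasing degree to noisy samples of sin(3x).
    # Training error keeps falling while test error eventually rises -- overfitting.
    import numpy as np

    rng = np.random.default_rng(0)
    x_train = rng.uniform(-1, 1, 30)
    y_train = np.sin(3 * x_train) + rng.normal(0, 0.3, 30)
    x_test = rng.uniform(-1, 1, 200)
    y_test = np.sin(3 * x_test) + rng.normal(0, 0.3, 200)

    for degree in (1, 3, 15):
        coeffs = np.polyfit(x_train, y_train, degree)   # least-squares polynomial fit
        train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
        test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
        print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")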
The age mainly affects its usefulness as an off-the-shelf guide to applied ML, because some of the currently best-performing general-purpose algorithms aren't mentioned [1]. It also spends quite a bit of time on algorithms now considered mainly of historical interest, like version spaces.
So imo its main value today is as a foundational text, and for that it's quite good. It helps that it's well written and understandable.
[1] A recent empirical analysis found that random forests and support vector machines, neither of which is covered in this book, seem to perform most consistently well on classification tasks: http://jmlr.org/papers/v15/delgado14a.html
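The following isn't from that paper, just a minimal sketch of trying both families with scikit-learn on one of its built-in datasets (the dataset, hyperparameters, and 5-fold CV are my own arbitrary choices), to show how little code that kind of comparison takes today:

    # Cross-validated accuracy of a random forest vs. an RBF-kernel SVM
    # on scikit-learn's built-in breast cancer dataset.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)
    models = {
        "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
        "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    }
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5)   # 5-fold cross-validation
        print(f"{name}: mean accuracy {scores.mean():.3f}")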