Have you ever bought something that warns "assembly required" only to find out that the instructions are missing? That's a pretty accurate analogy to how Mitchell's book left me feeling. With an empty smile, he hands you a bag of techniques and tells you to go to town. Sure, there are hints about when you might apply particular algorithms, but they are abstract and occasionally hidden in the text. It's as if you've been handed a wrench and told that it can turn things. Huzzah.
When describing a field of knowledge, it's important to communicate the "Big Picture." Mitchell does a poor job of this. That is to say, he doesn't do it at all. The lack of a pervasive thread is all the more odd and disconcerting given that his dissertation gave an amazingly coherent description of the process of inductive learning. I suppose I feel a bit cheated because there's nothing so tangible or real to hold the disjoint chapters together. So, without any real historical or philosophical context, we're left with something reminiscent of a first-year calculus book: here's how to differentiate, here's how to integrate, now go figure out what you're supposed to do with those things.
Nevertheless, anyone needing a reference guide to machine learning techniques (think of a shop manual, albeit one that isn't quite up to date) would do well to buy this book. Anyone wanting to understand the field of machine learning should probably check out a bit of the competition. I think you'll find that some folks' kung fu is stronger.