CDN$ 75.51
  • List Price: CDN$ 119.79
  • You Save: CDN$ 44.28 (37%)
In Stock.

Pattern Recognition and Machine Learning Hardcover – Apr 6 2011

Formats and editions (Hardcover):
  • Amazon Price: CDN$ 75.51
  • New from: CDN$ 60.00
  • Used from: CDN$ 79.51

Frequently Bought Together

Pattern Recognition and Machine Learning + Machine Learning: A Probabilistic Perspective + The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition
Price For All Three: CDN$ 253.88


Product Details

  • Hardcover: 740 pages
  • Publisher: Springer; 1st ed. 2006. Corr. 2nd printing 2011 edition (April 6 2011)
  • Language: English
  • ISBN-10: 0387310738
  • ISBN-13: 978-0387310732
  • Product Dimensions: 4.4 x 18.4 x 23.5 cm
  • Shipping Weight: 1.8 Kg
  • Average Customer Review: 3.7 out of 5 stars (3 customer reviews)
  • Amazon Bestsellers Rank: #35,198 in Books

Product Description


From the reviews:

"This beautifully produced book is intended for advanced undergraduates, PhD students, and researchers and practitioners, primarily in the machine learning or allied areas...A strong feature is the use of geometric illustration and intuition...This is an impressive and interesting book that might form the basis of several advanced statistics courses. It would be a good choice for a reading group." John Maindonald for the Journal of Statistical Software

"In this book, aimed at senior undergraduates or beginning graduate students, Bishop provides an authoritative presentation of many of the statistical techniques that have come to be considered part of ‘pattern recognition’ or ‘machine learning’. … This book will serve as an excellent reference. … With its coherent viewpoint, accurate and extensive coverage, and generally good explanations, Bishop’s book is a useful introduction … and a valuable reference for the principal techniques used in these fields." (Radford M. Neal, Technometrics, Vol. 49 (3), August, 2007)

"This book appears in the Information Science and Statistics Series commissioned by the publishers. … The book appears to have been designed for course teaching, but obviously contains material that readers interested in self-study can use. It is certainly structured for easy use. … For course teachers there is ample backing which includes some 400 exercises. … it does contain important material which can be easily followed without the reader being confined to a pre-determined course of study." (W. R. Howard, Kybernetes, Vol. 36 (2), 2007)

"Bishop (Microsoft Research, UK) has prepared a marvelous book that provides a comprehensive, 700-page introduction to the fields of pattern recognition and machine learning. Aimed at advanced undergraduates and first-year graduate students, as well as researchers and practitioners, the book assumes knowledge of multivariate calculus and linear algebra … . Summing Up: Highly recommended. Upper-division undergraduates through professionals." (C. Tappert, CHOICE, Vol. 44 (9), May, 2007)

"The book is structured into 14 main parts and 5 appendices. … The book is aimed at PhD students, researchers and practitioners. It is well-suited for courses on machine learning, statistics, computer science, signal processing, computer vision, data mining, and bio-informatics. Extensive support is provided for course instructors, including more than 400 exercises, lecture slides and a great deal of additional material available at the book’s web site … ." (Ingmar Randvee, Zentralblatt MATH, Vol. 1107 (9), 2007)

"This new textbook by C. M. Bishop is a brilliant extension of his former book ‘Neural Networks for Pattern Recognition’. It is written for graduate students or scientists doing interdisciplinary work in related fields. … In summary, this textbook is an excellent introduction to classical pattern recognition and machine learning (in the sense of parameter estimation). A large number of very instructive illustrations adds to this value." (H. G. Feichtinger, Monatshefte für Mathematik, Vol. 151 (3), 2007)

"Author aims this text at advanced undergraduates, beginning graduate students, and researchers new to machine learning and pattern recognition. … Pattern Recognition and Machine Learning provides excellent intuitive descriptions and appropriate-level technical details on modern pattern recognition and machine learning. It can be used to teach a course or for self-study, as well as for a reference. … I strongly recommend it for the intended audience and note that Neal (2007) also has given this text a strong review to complement its strong sales record." (Thomas Burr, Journal of the American Statistical Association, Vol. 103 (482), June, 2008)

"This accessible monograph seeks to provide a comprehensive introduction to the fields of pattern recognition and machine learning. It presents a unified treatment of well-known statistical pattern recognition techniques. … The book can be used by advanced undergraduates and graduate students … . The illustrative examples and exercises proposed at the end of each chapter are welcome … . The book, which provides several new views, developments and results, is appropriate for both researchers and students who work in machine learning … ." (L. State, ACM Computing Reviews, October, 2008)

"Chris Bishop’s … technical exposition that is at once lucid and mathematically rigorous. … In more than 700 pages of clear, copiously illustrated text, he develops a common statistical framework that encompasses … machine learning. … it is a textbook, with a wide range of exercises, instructions to tutors on where to go for full solutions, and the color illustrations that have become obligatory in undergraduate texts. … its clarity and comprehensiveness will make it a favorite desktop companion for practicing data analysts." (H. Van Dyke Parunak, ACM Computing Reviews, Vol. 49 (3), March, 2008)

From the Back Cover

The dramatic growth in practical applications for machine learning over the last ten years has been accompanied by many important developments in the underlying algorithms and techniques. For example, Bayesian methods have grown from a specialist niche to become mainstream, while graphical models have emerged as a general framework for describing and applying probabilistic techniques. The practical applicability of Bayesian methods has been greatly enhanced by the development of a range of approximate inference algorithms such as variational Bayes and expectation propagation, while new models based on kernels have had a significant impact on both algorithms and applications.

This completely new textbook reflects these recent developments while providing a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners. No previous knowledge of pattern recognition or machine learning concepts is assumed. Familiarity with multivariate calculus and basic linear algebra is required, and some experience in the use of probabilities would be helpful though not essential as the book includes a self-contained introduction to basic probability theory.

The book is suitable for courses on machine learning, statistics, computer science, signal processing, computer vision, data mining, and bioinformatics. Extensive support is provided for course instructors, including more than 400 exercises, graded according to difficulty. Example solutions for a subset of the exercises are available from the book web site, while solutions for the remainder can be obtained by instructors from the publisher. The book is supported by a great deal of additional material, and the reader is encouraged to visit the book web site for the latest information.

Christopher M. Bishop is Deputy Director of Microsoft Research Cambridge, and holds a Chair in Computer Science at the University of Edinburgh. He is a Fellow of Darwin College Cambridge, a Fellow of the Royal Academy of Engineering, and a Fellow of the Royal Society of Edinburgh. His previous textbook "Neural Networks for Pattern Recognition" has been widely adopted.

Coming soon:

  • For students: worked solutions to a subset of exercises, available on a public web site (for exercises marked "www" in the text)
  • For instructors: worked solutions to the remaining exercises, available from the Springer web site
  • Lecture slides to accompany each chapter
  • Data sets available for download


Customer Reviews

3.7 out of 5 stars

Most helpful customer reviews

2 of 2 people found the following review helpful By Martin Kess on Feb. 12 2010
Format: Hardcover
This is the textbook I'm using for an undergraduate machine learning course, and so far it has been very enjoyable. There are plenty of exercises in each chapter, from simple "derive a formula for ..." problems to more in-depth ones, and several of the problems have solutions. One thing I've found really useful is how well cross-referenced the book is. It'll say something like "Recall that, if we assume a squared loss function, then the optimal prediction, for a new value of x, will be given by the conditional mean of the target variable", and in the margin it has, in red, the text "Section 1.5.5", pointing you to where this was first covered. The formatting is also well done; the charts are colourful and get the point across well. I learn much more easily by being shown a picture or graph of what we want to achieve, then having it described, and finally being given the equations for solving it (rather than just being given the equations), and this book does that well. It has taken a fair amount of work to get through, though, so I wouldn't call it an easy textbook by any means (I mean, come on, we're teaching computers how to think; that can't be easy).

My one complaint is that I wish that they had a chapter/appendix with a bit of a stats refresher, because the last stats course I took was over a year ago, and so this textbook took me a little bit to get into for lack of knowing what some of the early terms meant.
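
The cross-referenced fact quoted in the review above is a standard result, worth stating explicitly (the notation here is mine, not quoted from the book): under squared loss, the expected loss is minimized by predicting the conditional mean of the target.

```latex
% Expected squared loss for a predictor y(x):
\mathbb{E}[L] = \iint \bigl(y(\mathbf{x}) - t\bigr)^{2}\, p(\mathbf{x}, t)\,\mathrm{d}\mathbf{x}\,\mathrm{d}t
% which is minimized pointwise in x by the conditional mean of the target:
\quad\Longrightarrow\quad
y^{*}(\mathbf{x}) = \mathbb{E}[\,t \mid \mathbf{x}\,] = \int t\, p(t \mid \mathbf{x})\,\mathrm{d}t
```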
2 of 2 people found the following review helpful By Tony on July 18 2008
Format: Hardcover
This book provides an excellent introduction to a wide range of techniques in machine learning and also serves as a good reference.

However, it does require the reader to be mathematically mature, otherwise it can take some time to read through. It is probably best for someone at the graduate level (who has already taken a couple of graduate courses).
4 of 9 people found the following review helpful By Benton Lam on June 19 2008
Format: Hardcover
I've found this book a disaster for my machine learning course. It claims to use as little math as possible in the introduction, but as anyone who owns the book can tell you, a chapter can easily gather 200+ equations, and students need fairly advanced calculus to have a clue what the math is about.
The book does a poor job of getting its ideas across, and although the professor reuses lots of its graphics, his trimmed-down notes were much more useful for understanding the topic.
Personally I found the book a waste unless you already have some understanding of machine learning in the first place. Beginners need not apply.

Most Helpful Customer Reviews (beta): 109 reviews
161 of 168 people found the following review helpful
Great Insights, but a hard read June 16 2007
By Sidhant
Format: Hardcover Verified Purchase
This new book by Chris Bishop covers most areas of pattern recognition quite exhaustively. The author is an expert, as evidenced by the excellent insights he gives into the complex math behind the machine learning algorithms. I have worked for quite some time with neural networks, have had coursework in linear algebra, probability, and regression analysis, and still found some of the material in the book quite illuminating.

But that said, I must point out that the book is very math-heavy. In spite of my considerable background in neural networks and statistics, I still struggled with the equations. This is certainly not a book that can teach you things from the ground up, and that's why I would give it only 3 stars. I am new to kernels, and I am finding the relevant chapters difficult and confusing. This book won't be very useful if all you want to do is write machine learning code. The intended audience, I guess, is PhD students and researchers working with the mathematical aspects of machine learning. Undergraduates or people with little exposure to machine learning will have a hard time with this book. But that said, time spent struggling with its contents will certainly pay off, though not instantly.
99 of 104 people found the following review helpful
concentrates too much on the easy stuff July 9 2008
By _claudia_
Format: Hardcover
The book is worth a look, but after some of the 5-star reviews I read here, it was quite a disappointment. Yes, the book covers a lot of ground. Yes, the book has lots of nice pictures and easy examples, but that is exactly the problem. There are lots and lots of simple examples to explain the most basic concepts, but when things get complicated the book often sounds as if the text was taken out of a mathematics book. For example: the basics of probability theory are introduced over more than 5 pages with the example of "two coloured boxes each containing fruit". Nothing wrong with that. Then the chapter continues with probability densities, which are covered within 2 pages and contain sentences like "Under a nonlinear change of variable, a probability density transforms differently from a simple function, due to the Jacobian factor". There is no mention of how a simple function actually transforms, what a Jacobian factor is, or why we would be interested in a nonlinear change. Surely some of the introductory pages could have been cut to explain the more difficult issues in depth. Unfortunately, this is not the only place where easy concepts get a lot of attention while the truly important complex concepts are skimmed over. All in all, still worth a read, though do not expect too much.
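
For readers left hanging by the sentence quoted above, the transformation rule in question is standard and compact (stated here for the scalar case; this is my gloss, not text from the book): a simple function just composes with the change of variable, while a density additionally picks up a Jacobian factor so that probability mass is conserved.

```latex
% Change of variables x = g(y):
% a simple function transforms by composition ...
\tilde{f}(y) = f\bigl(g(y)\bigr)
% ... while a probability density also acquires the Jacobian factor:
p_{y}(y) = p_{x}\bigl(g(y)\bigr)\,\Bigl|\frac{\mathrm{d}g}{\mathrm{d}y}\Bigr|
```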
339 of 377 people found the following review helpful
Thorough but vastly unclear Feb. 27 2007
By dc
Format: Hardcover Verified Purchase
I can appreciate others who might think that this is a great book.... but I am a student using it and I have some very different opinions of it.

First, although Mr. Bishop is clearly an expert in machine learning, he is also obviously a HUGE fan of Bayesian statistics. The title of the book is misleading, as it makes no mention of Bayes at all, yet EVERY CHAPTER ends with how all of the chapter's contents come together in a Bayesian method. That's not bad; it's just not clear from the title. The title should be appended with "... using Bayesian Methods".

Second, while it is certainly a textbook, the author clearly has an understanding of the material that seems to undermine his ability to explain it. Though there are mentions of examples, there are, in fact, none. There are many graphics and tiny, trivial indicators, but I can't help but think that every single one of the concepts in the book would have benefited from even a single application. There aren't any. I am led to believe that if you are already aware of many of the methods and techniques, this would be an excellent reference or refresher. As a student starting out, I almost always have no idea what his intentions are.

To make matters worse, he occasionally uses symbols that are flat-out confusing. Why would you use pi for anything other than the constant pi or a product? He does. Why use little k, capital K, and the Greek letter kappa (a K!) in a single series of explanations? He does. He even references articles that he has written... in 2008!!

Every chapter seems to be an exercise in seeing how many equations he can stuff into it. There are 300 in Chapter 2 alone. Over and over again I have the feeling that he is trying to TELL me how to ride a bicycle when it would have been so much easier to at least let me see the view from behind the handlebars with my feet on the pedals. Chapter five on neural nets, for example, is abysmally over-complicated. Would you hand someone a dictionary and ask them to write a poem? ("Hey, all the words you need are in here!") Of course not.

Third, the book mentions that there is a lot of information available on the web site. The only info available on his website is a brief overview of the text, a detailed overview of the text (that's not a typo... he has both), an example chapter, links to where the book can be purchased, and (actually quite useful for creating slides) an archive of all of the figures in the book. There are no answers to problems or explorations of any part of the material. The upcoming book might be amazing and exactly what I am looking for, but it could be months away and another $50 or so to purchase. Hardly ideal. How about putting some of that MATLAB code on your site? *Something* to crystallize the concepts!

Finally, while the intro indicates this might be a good book for computer scientists, it would actually make more sense to call it a math book, or more specifically a statistics book. There are no methods, no algorithms, no bits of pseudo-code, and (again) no applications in the text. Even examples that used hard numbers and/or elements from a real problem, fully explained, would be much appreciated.

Maybe I am being a little critical, and perhaps I ask for too much, but in my mind, if you are writing a book with the goal of TEACHING a subject, it is in your interest to make things clear and illustrative. Instead, the book feels more like a combination of "I am smart. Just read this!" and a reference text.
49 of 56 people found the following review helpful
The book should change its title Sept. 25 2007
By John E
Format: Hardcover Verified Purchase
This book (PRML) should be re-titled "PRML: A Bayesian Approach". Yes, the Bayesian approach is very useful for machine learning, and sometimes the final goal of learning is to maximize some sort of posterior probability. However, if the author is such a huge fan of Bayesian statistics, he should tell prospective readers so in a clear way. Emphasizing the Bayesian aspects too much really hurts the quality of this book as a general-purpose machine learning textbook.

For a better textbook of machine learning, I recommend:
1) The Elements of Statistical Learning (perhaps a little hard for beginners in this field, but at least better than PRML; you can compare their chapters on linear regression to see which one is better).
2) Pattern Classification (focuses on classification, not regression; also not very easy -- anyway, machine learning is not an easy field ^_^).
3) Machine Learning (a little old, but great for beginners).

These three books also cover Bayesian statistics, but in a proper way. If you have some experience in machine learning and an engineering-level math background, just choose 1) or 2). If you are a complete beginner, first take a glance at 3), and then go on to 1) or 2).

Finally, if you want a book that discusses machine learning purely from a Bayesian perspective, PRML is good.
12 of 12 people found the following review helpful
Cannot keep it away! Feb. 7 2014
By K. Pasad
Format: Hardcover
For math-heavy fields there are usually a ton of books, but 1 or 2 stand out in their ability to tell a story using math. Bishop's book ranks among those select few. For context: I read this book after covering some topics from Hastie et al. I am an EE major and occasionally use variants of this material in my daily signal processing work.

IMHO the following make this book so readable as well as very useful:
1. Consistent use of a small vocabulary and a few central ideas: all techniques are boiled down to basic fundamental ideas. The ideas are developed early on, very clearly, and we are told early that the rest of the book will grow from them. In Bishop's case, in chapters one and two, he lays down the fundamentals of maximum likelihood and Bayesian models and linear models, explains inference and decision, and builds upon these few principles.
2. Usage of terminology is consistent, and no surprising new terminology or ideas are added anywhere.
3. The basic ideas are explained again every time they are used. Yes, it takes up a few additional lines and makes the material a bit redundant, but it serves to reinforce the basic ideas on which everything is built. You do not scamper around in endless loops. Everything is right there, clear. You do not need Google.
4. Clearly and often illustrates how the big picture is composed of basic ideas and how the basic ideas manifest themselves in advanced topics.
5. Does the dirty work of solving the math, and does it in a clean way, without excessive prefixes and Greek letters. The little details matter, and IMHO that's what makes the book readable.

Master chapter 2 and you will not be scared of the advanced topics.
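
The maximum-likelihood versus Bayesian contrast that this review keeps returning to can be sketched in a few lines (a toy setup of my own, not code from the book): estimating a Gaussian mean with known variance, where a conjugate Gaussian prior pulls the posterior mean toward the prior.

```python
def gaussian_mean_estimates(data, sigma2=1.0, mu0=0.0, s0sq=1.0):
    """Toy comparison: ML vs. Bayesian estimate of a Gaussian mean.

    Assumes known noise variance sigma2 and a conjugate Gaussian
    prior N(mu0, s0sq) on the mean (illustrative choices, not from the book).
    """
    n = len(data)
    mu_ml = sum(data) / n  # maximum-likelihood estimate: the sample mean
    # Conjugate update: posterior precision is the sum of prior and data
    # precisions; the posterior mean is the precision-weighted blend.
    post_precision = 1.0 / s0sq + n / sigma2
    mu_post = (mu0 / s0sq + sum(data) / sigma2) / post_precision
    return mu_ml, mu_post

mu_ml, mu_post = gaussian_mean_estimates([2.0, 2.0, 2.0, 2.0])
# With four observations at 2.0, the ML estimate is 2.0, while the posterior
# mean (prior centred at 0.0) is pulled toward the prior: 8/5 = 1.6.
```

As more data arrives, the data-precision term dominates and the posterior mean converges to the ML estimate, which is exactly the kind of chapter-2 fundamental the review credits the book with nailing down.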

My thoughts on some negative comments:
1. The book is math-heavy: No. The required math is covered in chapter 2, and everything revolves around it. Suck it up; ML is math.
2. Not enough intuition: There is. A lot of it, but in its own way. You need to master some of the basic math concepts (the book covers them). Sorry.
3. Too much basic stuff repeated: That's what makes the book so useful, continuous reinforcement.
4. Too much theory, not enough practice: Yeah, there isn't any Python code. But a practical text is for advanced users. As a beginner or intermediate, you are better off understanding the fundamentals; otherwise you will fall into the common trap of trying 5 different models on your data and averaging them. If you want code, just go to sklearn.
5. Bayesian-heavy: True, but an understanding of Bayesian models helps you understand what to strive for even if you don't use them.

I would recommend reading Hastie et al. after reading this book. Hastie's book is a very practical book. IMHO, you cannot choose between the two; each solves a different problem. Bishop develops the basics and Hastie takes them into practice. Spend time, read both, and don't fear the math!