
Information Theory, Inference and Learning Algorithms [Hardcover]

David J. C. MacKay
4.5 out of 5 stars (4 customer reviews)
List Price: CDN$ 87.95
Price: CDN$ 70.36 & FREE Shipping. Details
You Save: CDN$ 17.59 (20%)
In Stock.
Ships from and sold by Amazon.ca. Gift-wrap available.
Want it delivered Tuesday, July 15? Choose One-Day Shipping at checkout.

Formats

            Amazon Price   New from   Used from
Hardcover   CDN$ 70.36     --         --

Book Description

Publication date: Oct. 6 2003 | ISBN-10: 0521642981 | ISBN-13: 978-0521642989 | Edition: 1
Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.

Product Description

Review

"...a valuable reference...enjoyable and highly useful."
American Scientist


"...an impressive book, intended as a class text on the subject of the title but having the character and robustness of a focused encyclopedia. The presentation is finely detailed, well documented, and stocked with artistic flourishes."
Mathematical Reviews


"Essential reading for students of electrical engineering and computer science; also a great heads-up for mathematics students concerning the subtlety of many commonsense questions."
Choice


"An utterly original book that shows the connections between such disparate fields as information theory and coding, inference, and statistical physics."
Dave Forney, Massachusetts Institute of Technology


"This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn."
Peter Dayan and Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, University College, London


"An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home."
Bob McEliece, California Institute of Technology


"An excellent textbook in the areas of infomation theory, Bayesian inference and learning alorithms. Undergraduate and post-graduate students will find it extremely useful for gaining insight into these topics."
REDNOVA


"Most of the theories are accompanied by motivations, and explanations with the corresponding examples...the book achieves its goal of being a good textbook on information theory."
ACM SIGACT News


Inside This Book

First Sentence
"In this chapter we discuss how to measure the information content of the outcome of a random experiment."
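That first sentence points at Shannon's measure: an outcome of probability P(x) carries h(x) = log2(1/P(x)) bits, and the entropy of a source is the average of h over its distribution. Here is a minimal sketch of both quantities, assuming nothing beyond the standard library (the function names are mine, for illustration, not from the book):

```python
import math

def information_content(p):
    """Shannon information content h = log2(1/p), in bits, of an outcome with probability p."""
    return math.log2(1.0 / p)

def entropy(probs):
    """Entropy: the expected information content of a distribution, in bits."""
    return sum(p * information_content(p) for p in probs if p > 0)

print(information_content(0.5))  # a fair coin flip carries exactly 1.0 bit
print(entropy([1 / 6] * 6))      # a fair die roll carries ~2.585 bits
```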


Customer Reviews

4.5 out of 5 stars (3 star: 0, 2 star: 0, 1 star: 0)
Most helpful customer reviews
1 of 1 people found the following review helpful
5.0 out of 5 stars Brings theory to life Feb. 28 2004
Format: Hardcover
Fantastically good value, this wide-ranging textbook covers elementary information theory, data compression, and coding theory; machine learning, Bayesian inference, Monte Carlo methods; and state of the art error-correcting coding methods, including low-density parity-check codes, turbo codes, and digital fountain codes. Theory and practical examples are covered side by side. Hundreds of exercises are included, many with worked solutions.
Three things are distinctive about this book.
First, it emphasizes the connections between information theory and machine learning - for example, data compression and Bayesian data modelling are two sides of the same coin (a sketch of this connection follows the review).
Second, since 1993 there has been a revolution in communication theory, with classical algebraic codes being superseded by sparse graph codes; this text covers these recent developments in detail.

Third, the whole book is available for free online viewing at
[...]
I use this book in all my teaching! :-)
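To unpack the "two sides of the same coin" remark in the review above: an ideal compressor built from a probability model assigns each symbol x a codeword of about -log2 P(x) bits, so a better (Bayesian) model of the data directly yields a shorter compressed message. A minimal sketch of that codelength accounting (the function name and toy message are mine, not from the book):

```python
import math

def ideal_code_length(message, model):
    """Total Shannon codelength of `message`: sum of -log2 P(symbol), in bits."""
    return sum(-math.log2(model[s]) for s in message)

msg = "abracadabra"
uniform = {c: 1 / 26 for c in "abcdefghijklmnopqrstuvwxyz"}
fitted = {c: msg.count(c) / len(msg) for c in set(msg)}  # model adapted to the data

print(ideal_code_length(msg, uniform))  # ~51.7 bits under a uniform model
print(ideal_code_length(msg, fitted))   # ~22.4 bits: the better model compresses more
```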
5.0 out of 5 stars An exciting and up-to-date text Feb. 17 2004
Format: Hardcover
Fantastically good value, this wide-ranging textbook covers elementary information theory, data compression, and coding theory; machine learning, Bayesian inference, Monte Carlo methods; and state of the art error-correcting coding methods, including low-density parity-check codes, turbo codes, and digital fountain codes. Theory and practical examples are covered side by side. Hundreds of exercises are included, many with worked solutions.
Two things are distinctive about this book.
First, it emphasizes the connections between information theory and machine learning - for example, data compression and Bayesian data modelling are two sides of the same coin.
Second, since 1993 there has been a revolution in communication theory, with classical algebraic codes being superseded by sparse graph codes; this text covers these recent developments in detail.
I recommend this book to all my students! :-)
By A Customer
Format: Hardcover
This review concerns only the coding theory part.
If you want to know what's presently going on in the field of coding theory, with solid technical foundations, this is the book. Its importance is that it answers why people have been moving in new directions in coding theory, and it provides good information about LDPC codes, turbo codes, and decoding algorithms. People have solved some problems that arise in the coding field without going into the depths of mathematics. Until the early 1990s, research in coding was intensely mathematical; people thought the packing problem was the answer to the coding problem. MacKay, however, argues that the conventional thought was wrong when one tries to attain the Shannon limit. He gives an argument based on the GV bound, sketched after this review (warning: this argument may not be entirely true).
Now the bad part of the book. MacKay bases his entire book on the premise that algebraic codes cannot exceed the GV bound. This is wrong. If you look at the MIT notes of Madhu Sudan (winner of the prestigious Nevanlinna Prize), he says random codes are not always the best. Specifically, he cites an argument that AG codes exceed the GV bound at a faster pace. So the packing problem still has relevance to the coding problem, as it could help attain the Shannon limit faster than random codes. (Warning: Madhu does not state anything about block sizes, but my feeling is that since AG codes exceed the GV bound faster than random codes, one could achieve the Shannon limit with comparatively smaller blocks.) So mathematicians can still hope to contribute to practical coding theory while enriching mathematics.
In spite of this, the book is a must-have for engineers and computer scientists.
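For context on the bound debated above: the Gilbert-Varshamov (GV) bound guarantees that binary codes of rate at least R = 1 - H2(delta) exist at relative minimum distance delta, where H2 is the binary entropy function. A small sketch of that computation (function names are illustrative):

```python
import math

def H2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def gv_rate(delta):
    """Gilbert-Varshamov bound: binary codes with relative minimum distance
    delta (< 1/2) and rate at least 1 - H2(delta) are guaranteed to exist."""
    return 1 - H2(delta)

print(gv_rate(0.11))  # ~0.50: rate-1/2 binary codes with relative distance ~0.11 exist
```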
4.0 out of 5 stars A reservoir of information - Yet few problems Jan. 11 2004
By hehe
Format: Hardcover
This review concerns only the coding theory part.
If you want to know what's presently going on in the field of coding theory, with solid technical foundations, this is the book. Its importance is that it answers why people have been moving in new directions in coding theory. People have solved some problems that arise in the coding field without going into the depths of mathematics. Until the early 1990s, research in coding was intensely mathematical; people thought the packing problem was the answer to the coding problem. MacKay, however, argues that the conventional thought was wrong. He gives an argument based on the GV bound.
Now the bad part of the book. MacKay bases his entire book on the premise that algebraic codes cannot exceed the GV bound. This is wrong. If you look at the MIT notes of Madhu Sudan (winner of the prestigious Nevanlinna Prize), he says random codes are not always the best. Specifically, he cites an argument that AG codes exceed the GV bound at a faster pace. So the packing problem still has relevance to the coding problem, as it could help attain the Shannon limit faster than random codes. (Warning: Madhu does not state anything about block sizes, but my feeling is that since AG codes exceed the GV bound faster than random codes, one could achieve the Shannon limit with comparatively smaller blocks.) So mathematicians can still hope to contribute to practical coding theory while enriching mathematics.
Another weak point is that the book does not say much about newer problems such as multi-access channels, broadcast channels, zero-error information theory, and communication complexity, or about the upcoming challenges and open problems in these areas of information theory. Maybe some bright researcher in the area like MacKay could write a book to give direction to these questions.
In spite of this, the book is a must-have for engineers and computer scientists.
Most Helpful Customer Reviews on Amazon.com (beta)
Amazon.com: 4.3 out of 5 stars (20 reviews)
43 of 44 people found the following review helpful
5.0 out of 5 stars Outstanding book, especially for statisticians Oct. 2 2007
By Alexander C. Zorach - Published on Amazon.com
Format: Hardcover
I find it interesting that most of the people reviewing this book seem to be reviewing it as they would any other information theory textbook. Such a review, whether positive or critical, could not hope to give a complete picture of what this text actually is. There are many books on information theory, but what makes this book unique (and in my opinion what makes it so outstanding) is the way it integrates information theory with statistical inference. The book covers topics including coding theory, Bayesian inference, and neural networks, but it treats them all as different pieces of a unified puzzle, focusing more on the connections between these areas, and the philosophical implications of these connections, and less on delving into depth in one area or another.

This is a learning text, clearly meant to be read and understood. The presentation of topics is greatly expanded and includes much discussion, and although the book is dense, it is rarely concise. The exercises are absolutely essential to understanding the text. Although the author has made some effort to make certain chapters or topics independent, I think that this is one book for which it is best to more or less work straight through. For this reason and others, this book does not make a very good reference: occasionally nonstandard notation or terminology is used.

The biggest strength of this text, in my opinion, is on a philosophical level. It is my opinion, and in my opinion it is a great shame, that the vast majority of statistical theory and practice is highly arbitrary. This book will provide some tools to (at least in some cases) anchor your thinking to something less arbitrary. It's ironic that much of this is done within the Bayesian paradigm, something often viewed (and criticized) as being more arbitrary, not less so. But MacKay's way of thinking is highly compelling. This is a book that will not just teach you subjects and techniques, but will shape the way you think. It is one of the rare books that is able to teach how, why, and when certain techniques are applicable. It prepares one to "think outside the box".

I would recommend this book to anyone studying any of the topics covered by this book, including information theory, coding theory, statistical inference, or neural networks. This book is especially indispensable to a statistician, as there is no other book that I have found that covers information theory with an eye towards its application in statistical inference so well. This book is outstanding for self-study; it would also make a good textbook for a course, provided the course followed the development of the textbook very closely.
30 of 34 people found the following review helpful
5.0 out of 5 stars Good value text on a spread of interesting and useful topics Feb. 19 2005
By Iain - Published on Amazon.com
Format: Hardcover
I am a PhD student in computer science. Over the last year and a half this book has been invaluable (and parts of it a fun diversion).

For a course I help teach, the introductions to probability theory and information theory save a lot of work. They are accessible to students with a variety of backgrounds (they understand them and can read them online). They also lead directly into interesting problems.

While I am not directly studying data compression or error correcting codes, I found these sections compelling. Incredibly clear exposition; exciting challenges. How can we ever be certain of our data after bouncing it across the world and storing it on error-prone media (things I do every day)? How can we do it without >60 hard-disks sitting in our computer? The mathematics uses very clear notation --- functions are sketched when introduced, theorems are presented alongside pictures and explanations of what's really going on.
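The reviewer's question about trusting data without stacks of redundant disks is exactly the book's opening puzzle, and the crudest answer already shows the trade-off: repeat each bit and take a majority vote. A minimal sketch under assumed parameters (a (3,1) repetition code over a binary symmetric channel with a 10% flip probability; all names are mine, and this is the naive baseline the book improves on, not its recommended method):

```python
import random

def encode(bits, n=3):
    """(n,1) repetition code: send each bit n times."""
    return [b for b in bits for _ in range(n)]

def channel(bits, f=0.1, rng=random.Random(0)):
    """Binary symmetric channel: flip each bit independently with probability f."""
    return [b ^ (rng.random() < f) for b in bits]

def decode(bits, n=3):
    """Majority-vote decoding of the repetition code."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

source = [random.Random(1).randint(0, 1) for _ in range(10000)]
decoded = decode(channel(encode(source)))
print(sum(s != d for s, d in zip(source, decoded)) / len(source))
# ~0.028: the bit error rate drops from 10% to about 3%, at the cost of rate 1/3
```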

I should note that a small number (roughly 4 or 5 out of 50) of the chapters on advanced topics are much more terse than the majority of the book. They might not be of interest to all readers, but if they are, they are probably more friendly than finding a journal paper on the same topic.

Most importantly for me, the book is a valuable reference for Bayesian methods, on which MacKay is an authority. Sections IV and V brought me up to speed with several advanced topics I need for my research.
20 of 22 people found the following review helpful
5.0 out of 5 stars A must have... Feb. 28 2005
By Rich Turner - Published on Amazon.com
Format: Hardcover
Uniting information theory and inference in an interactive and entertaining way, this book has been a constant source of inspiration, intuition and insight for me. It is packed full of stuff - its contents appear to grow the more I look - but the layering of the material means the abundance of topics does not confuse.

This is _not_ just a book for the experts. However, you will need to think and interact when reading it. That is, after all, how you learn, and the book helps and guides you in this with many puzzles and problems.
10 of 11 people found the following review helpful
5.0 out of 5 stars A Bayesian View: Excellent Topics, Exposition and Coverage Nov. 20 2008
By Edward Donahue - Published on Amazon.com
Format: Hardcover
I am reviewing David MacKay's 'Information Theory, Inference, and Learning Algorithms', but I haven't yet read it completely. It will be years before I finish it, since it contains the material for several advanced undergraduate or graduate courses. However, it is already on my list of favorite texts and references. It is a book I will keep going back to time after time; but don't take my word for it: according to the back cover, Bob McEliece, the author of a 1977 classic on information theory, recommends you buy two copies, one for the office and one for home. There are topics in this book I am aching to find the time to read, work through, and learn.

It can be used as a textbook or reference book, or to fill in gaps in your knowledge of information theory and related material. MacKay outlines several courses for which it can be used, including his Cambridge course on Information Theory, Pattern Recognition and Neural Networks, a short course on information theory, and a course on Bayesian inference and machine learning. As a reference it covers topics not easily accessible in books, including a variety of modern codes (hash codes, low-density parity-check codes, digital fountain codes, and many others) and Bayesian inference techniques (maximum likelihood, Laplace's method, variational methods, and Monte Carlo methods). It has interesting applications, such as information theory applied to genes and evolution, and to machine learning.

It is well written, with good problems, some helping you understand the theory and others helping you apply it. Many are worked as examples, and some are especially recommended. He works to keep your attention and interest, and knows how to do it: chapter titles include 'Why Have Sex?' and 'Crosswords and Codebreaking'. His web site ( [...] ) is a wondrous collection of resource material, including code supporting a variety of topics in the book. The book is available online to browse, either through Google Books or via a link from his web site, but you need to have it in hand and spend time with it to truly appreciate it.
8 of 9 people found the following review helpful
5.0 out of 5 stars One of the best textbooks I've ever read. March 16 2009
By Bernie Madoff - Published on Amazon.com
Format: Hardcover
Maybe it's just that the topic is so fascinating that a superb book such as this is unavoidable (I doubt it); regardless, MacKay has crafted a paragon of science textbook writing. The formula: lead with an irresistible puzzle and let the reader have a go at it; unfold the solution intuitively; then finish by justifying it theoretically. The reader leaves understanding the application, the method of solution, and the theory: why it exists and what it allows one to do.
Why aren't all textbooks like this?
If you're a self-learner, do buy this book, if only so you can see what a good textbook can be!