Good book - but a few arguments need revision from theorists
By A Customer
This review is from: Information Theory, Inference and Learning Algorithms (Hardcover)This review concerns only the coding theory part.
If you want to know what is presently going on in the field of coding theory, with a solid technical foundation, this is the book. Its importance is that it explains why people have moved in new directions in coding theory, and it provides good information about LDPC codes, turbo codes, and decoding algorithms. Researchers have solved problems arising in coding without going into the depths of mathematics. Until the early 1990s, research in coding was intensely mathematical, and people thought the packing problem was the answer to the coding problem. MacKay, however, argues that this conventional view is wrong when one tries to attain the Shannon limit. He gives an argument based on the GV (Gilbert-Varshamov) bound (warning: this argument may not be entirely true).
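To make the gap concrete, here is a small illustrative sketch (my own, not from the book) comparing the rate promised by the asymptotic binary GV bound against the Shannon capacity of a binary symmetric channel. The choice of crossover probability p = 0.1 is just an example; the point is that a distance-based (packing) argument, which needs relative distance about 2p for unique decoding, guarantees a rate well below capacity:

```python
from math import log2

def h2(p):
    """Binary entropy function H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def gv_rate(delta):
    """Asymptotic binary Gilbert-Varshamov bound: rate R >= 1 - H2(delta)
    is achievable by codes of relative minimum distance delta < 1/2."""
    return max(0.0, 1.0 - h2(delta))

def bsc_capacity(p):
    """Shannon capacity of the binary symmetric channel with
    crossover probability p: C = 1 - H2(p)."""
    return 1.0 - h2(p)

# Example channel (hypothetical choice): BSC with p = 0.1.
p = 0.1
# Unique decoding of a code with relative distance delta corrects up to
# delta/2 errors, so handling error rate p needs delta of roughly 2p.
delta = 2 * p
print(f"Shannon capacity at p={p}: {bsc_capacity(p):.3f}")
print(f"GV-guaranteed rate at delta={delta}: {gv_rate(delta):.3f}")
```

At p = 0.1 the capacity is about 0.53 while the GV guarantee at delta = 0.2 is only about 0.28, which is exactly the kind of shortfall the packing viewpoint runs into.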
Now the bad part of the book. MacKay bases much of his argument on the premise that algebraic codes cannot exceed the GV bound. This is wrong. If you look at Madhu Sudan's notes at MIT (he is a Nevanlinna Prize winner), he says random codes are not always the best. Specifically, he cites a result that AG (algebraic-geometry) codes exceed the GV bound. So the packing problem is still relevant to the coding problem, since it could help attain the Shannon limit at a faster pace than random codes. (Warning: Sudan does not say anything about block sizes, but my feeling is that since AG codes exceed the GV bound, one could approach the Shannon limit with comparatively smaller blocks.) So mathematicians can still hope to contribute to practical coding theory while enriching mathematics.
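The standard form of the claim about AG codes is the Tsfasman-Vladut-Zink (TVZ) bound, which for square prime-power alphabets q >= 49 lies above the q-ary GV curve for mid-range relative distances. A quick sketch (my own numerical check, not from the book or Sudan's notes) for q = 64:

```python
from math import log

def hq(delta, q):
    """q-ary entropy function H_q(delta)."""
    if delta == 0.0:
        return 0.0
    return (delta * log(q - 1, q)
            - delta * log(delta, q)
            - (1 - delta) * log(1 - delta, q))

def gv_rate(delta, q):
    """Asymptotic q-ary Gilbert-Varshamov bound: R >= 1 - H_q(delta)."""
    return 1.0 - hq(delta, q)

def tvz_rate(delta, q):
    """Tsfasman-Vladut-Zink bound, attained by AG codes over
    alphabets of size q a square prime power:
    R >= 1 - delta - 1/(sqrt(q) - 1)."""
    return 1.0 - delta - 1.0 / (q ** 0.5 - 1.0)

# At q = 64 and delta = 0.5 the TVZ line is strictly above the GV curve,
# so AG codes provably beat the random-coding (GV) guarantee here.
q, delta = 64, 0.5
print(f"GV rate:  {gv_rate(delta, q):.4f}")
print(f"TVZ rate: {tvz_rate(delta, q):.4f}")
```

Running this gives TVZ about 0.357 versus GV about 0.335, confirming that over large enough alphabets the algebraic construction outperforms random codes.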
Despite this, the book is a must-have for engineers and computer scientists.