


Speech and Language Processing (2nd Edition) [Hardcover]

Daniel Jurafsky, James H. Martin
3.0 out of 5 stars (1 customer review)
List Price: CDN$ 194.90
Price: CDN$ 158.27 & FREE Shipping. Details
You Save: CDN$ 36.63 (19%)
Usually ships within 2 to 4 weeks.
Ships from and sold by Amazon.ca. Gift-wrap available.

Formats

Format      Amazon Price    New from    Used from
Hardcover   CDN$ 158.27
Paperback   --


Book Description

Publication Date: May 16 2008 | ISBN-10: 0131873210 | ISBN-13: 978-0131873216 | Edition: 2

An explosion of Web-based language techniques, merging of distinct fields, availability of phone-based dialogue systems, and much more make this an exciting time in speech and language processing. The first of its kind to thoroughly cover language technology – at all levels and with all modern technologies – this book takes an empirical approach to the subject, based on applying statistical and other machine-learning algorithms to large corpora. Builds each chapter around one or more worked examples demonstrating the main idea of the chapter, using the examples to illustrate the relative strengths and weaknesses of various approaches. Adds coverage of statistical sequence labeling, information extraction, question answering and summarization, advanced topics in speech recognition, and speech synthesis. Revises coverage of language modeling, formal grammars, statistical parsing, machine translation, and dialog processing. A useful reference for professionals in any of the areas of speech and language processing.




Frequently Bought Together

Speech and Language Processing (2nd Edition) + Natural Language Processing with Python
Price For Both: CDN$ 203.39

One of these items ships sooner than the other.



Product Description

About the Author

Dan Jurafsky is an associate professor in the Department of Linguistics, and, by courtesy, in the Department of Computer Science, at Stanford University. Previously, he was on the faculty of the University of Colorado, Boulder, in the Linguistics and Computer Science departments and the Institute of Cognitive Science. He was born in Yonkers, New York, and received a B.A. in Linguistics in 1983 and a Ph.D. in Computer Science in 1992, both from the University of California at Berkeley. He received the National Science Foundation CAREER award in 1998 and the MacArthur Fellowship in 2002. He has published over 90 papers on a wide range of topics in speech and language processing.

 

James H. Martin is a professor in the Department of Computer Science and in the Department of Linguistics, and a fellow in the Institute of Cognitive Science at the University of Colorado at Boulder. He was born in New York City, received a B.S. in Computer Science from Columbia University in 1981 and a Ph.D. in Computer Science from the University of California at Berkeley in 1988. He has authored over 70 publications in computer science, including the book A Computational Model of Metaphor Interpretation.




Customer Reviews

3.0 out of 5 stars (1 customer review)
Most helpful customer reviews
3.0 out of 5 stars Encyclopedic Treatment of NLP Feb. 23 2013
By John M. Ford TOP 100 REVIEWER
Format:Hardcover
Daniel Jurafsky and James Martin have assembled an incredible mass of information about natural language processing. The authors note that speech and language processing have largely non-overlapping histories that have relatively recently begun to grow together. They have written this book to meet the need for a well-integrated discussion, historical and technical, of both fields.

In twenty-five chapters, the book covers the breadth of computational linguistics with an overall logical organization. Five chapter groupings organize material on Words, Speech, Syntax, Semantics and Pragmatics, and Applications. The four Applications chapters address Information Extraction, Question Answering and Summarization, Dialogue and Conversational Agents, and Machine Translation. The book covers a lot of ground, and a fifty-page bibliography directs readers to vast expanses beyond the book's horizon. The aging content problem present in all such books is addressed through the book's web site and numerous links to other sites, tools, and demonstrations. There is a lot of stuff.

While it is an achievement to assemble such a collection of relevant information, the book could be more useful than it is. An experienced editor could rearrange content into a more readable flow of information and increase the clarity of some of the authors' examples and explanations. As is, the book is a useful reference for researchers and practitioners already working in the field. A more clear presentation would lower the experience requirement and make its store of information available to students and non-specialists as well.
Most Helpful Customer Reviews on Amazon.com (beta)
Amazon.com: 4.1 out of 5 stars  15 reviews
40 of 40 people found the following review helpful
3.0 out of 5 stars Good description of the problems in the field, but look elsewhere for practical solutions April 2 2009
By P. Nadkarni - Published on Amazon.com
Format:Hardcover
The authors have the challenge of covering a vast area, and they do a good job of highlighting the hard problems within individual sub-fields, such as machine translation. The availability of an accompanying Web site is a strong plus, as is the extensive bibliography, which also includes links to freely available software and resources.

Now for the negatives.

While I would still buy and recommend this book, you will need to supplement it with other material; in addition to the accurate "broad and shallow" comment made by another reviewer, I would add that much of the material, as presented, is aimed at the comprehension level of a computer-science PhD and doesn't really meet the definition of a textbook for either undergraduate or graduate students. It is not that the material is intrinsically difficult: one recurring problem in the book is the vast number of forward references, where a topic is introduced very briefly but not explained until 20-50 pages later. In most cases, if you don't understand a passage in the text, I would advise that you keep skimming ahead - you may be rewarded because in several cases, the book covers a particular approach for 2-3 pages before telling you that its underlying assumptions are flawed, and that modern methods for addressing the problem use alternative approaches.

In other cases, the authors try to explain topics that might deserve entire chapters in about ten lines - a poster child is the explanation on page 736 of how Support Vector Machines can be used for multiclass problems. To someone who is familiar with SVMs, this material is unnecessary, while those who are not will not be enlightened by knowing that SVMs are "binary approaches based on the discovery of separating hyperplanes". I understand that this is not a text on machine learning approaches, even though machine-learning approaches have revolutionized NLP, but if the authors are clearly in no position to do justice to a particular topic in limited space, I would have preferred that they do the reader the courtesy of acknowledging the same, and simply point to a useful source, preferably online. (While the Wikipedia entry on SVMs is, as of this writing, incomprehensible to non-Math PhDs, the 2nd Google link, at [...], provides a reasonable overview.)
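
For readers who stall at the same spot, here is a minimal sketch of the one-vs-rest idea behind building a multiclass classifier from binary, hyperplane-based ones: train one binary SVM per class and pick the class with the largest decision score. It assumes scikit-learn and its bundled iris data purely for illustration; it is not code from the book or from the page cited above.

    # One-vs-rest multiclass classification built from binary linear SVMs.
    # scikit-learn and its toy iris dataset are assumed for illustration only.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    classes = np.unique(y)

    # Train one binary SVM per class: class k vs. everything else.
    binary_svms = {}
    for k in classes:
        clf = SVC(kernel="linear")
        clf.fit(X, (y == k).astype(int))
        binary_svms[k] = clf

    # Classify by the hyperplane that gives the largest signed distance.
    scores = np.column_stack(
        [binary_svms[k].decision_function(X) for k in classes])
    predictions = classes[np.argmax(scores, axis=1)]
    print("training accuracy:", (predictions == y).mean())

The other common alternative is a one-vs-one scheme, with one binary classifier per pair of classes and a vote over their decisions.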

On the other hand, in a book that has to cover a vast area in limited space, there is a surprising amount of repetition. The page-long explanation of F-measure, a statistic used to evaluate the accuracy of a method, is repeated in three places almost verbatim, on pg. 455, 479 and 733; the repetition 24 pages apart (in chapters 13 and 14) should be considered astonishing given that the same author in the two-author collaboration clearly wrote both passages.
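
The F-measure itself fits in a few lines; the following sketch states the standard definition (a weighted harmonic mean of precision and recall) from the usual formula, not transcribed from any of the three passages mentioned above.

    def f_measure(precision: float, recall: float, beta: float = 1.0) -> float:
        """Weighted harmonic mean of precision and recall.

        beta > 1 weights recall more heavily, beta < 1 weights precision;
        beta = 1 gives the familiar F1 score.
        """
        if precision == 0.0 and recall == 0.0:
            return 0.0
        b2 = beta * beta
        return (1 + b2) * precision * recall / (b2 * precision + recall)

    # Example: 90 true positives, 10 false positives, 30 false negatives.
    precision = 90 / (90 + 10)           # 0.90
    recall = 90 / (90 + 30)              # 0.75
    print(f_measure(precision, recall))  # about 0.818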

Finally, given the way algorithms are described - some reviewers point to errors in some of the descriptions, but I can't verify this - you would be hard-pressed to complete many of the exercises that follow each chapter, in terms of being able to implement a working program.

A final word of advice to the authors: I really do want to see a Third Edition, but I would recommend that you beta-test your material on a sample of your target audience, and incorporate their feedback. When you write a textbook, you really need to make a serious effort to communicate: if smart undergraduates or grad students tell you certain material is hard to follow, the fault almost certainly lies with you and not them.
4 of 4 people found the following review helpful
5.0 out of 5 stars Great introductions and reference book Aug. 9 2008
By carheg - Published on Amazon.com
Format:Hardcover
I read the first edition of that book and it is terrific. The second edition is much more adapted to current research. Statistical methods in NLP are more detailed and some syntax-based approaches are presented. My specific interest is in machine translation and dialogue systems. Both chapters are extensively rewritten and much more elaborate. I believe this book is perfect for everyone who starts in speech and language processing. With precision, coherent examples and some humor, this book gives a great introduction to this topic as well as material for already experienced readers.
3 of 3 people found the following review helpful
3.0 out of 5 stars Encyclopedic Treatment of NLP April 25 2012
By John M. Ford - Published on Amazon.com
Format:Hardcover|Verified Purchase
Daniel Jurafsky and James Martin have assembled an incredible mass of information about natural language processing. The authors note that speech and language processing have largely non-overlapping histories that have relatively recently begun to grow together. They have written this book to meet the need for a well-integrated discussion, historical and technical, of both fields.

In twenty-five chapters, the book covers the breadth of computational linguistics with an overall logical organization. Five chapter groupings organize material on Words, Speech, Syntax, Semantics and Pragmatics, and Applications. The four Applications chapters address Information Extraction, Question Answering and Summarization, Dialogue and Conversational Agents, and Machine Translation. The book covers a lot of ground, and a fifty-page bibliography directs readers to vast expanses beyond the book's horizon. The aging content problem present in all such books is addressed through the book's web site and numerous links to other sites, tools, and demonstrations. There is a lot of stuff.

While it is an achievement to assemble such a collection of relevant information, the book could be more useful than it is. An experienced editor could rearrange content into a more readable flow of information and increase the clarity of some of the authors' examples and explanations. As is, the book is a useful reference for researchers and practitioners already working in the field. A more clear presentation would lower the experience requirement and make its store of information available to students and non-specialists as well.

Readers looking for an introduction to natural language processing might find Manning and Schütze's Foundations of Statistical Natural Language Processing easier to understand. It is over ten years old, but worth reading for an understanding of basic concepts that are still relevant in the field.
4 of 5 people found the following review helpful
5.0 out of 5 stars Excellent Introduction to NLP June 29 2010
By A student - Published on Amazon.com
Format:Hardcover
I'm in middle of reading this book as an introduction to NLP without a teacher, and I find it very clear, easy to read, and informative. I can't say that I know it covers the field well because I don't know about the field, but it seems to me to be quite thorough. Definitely recommended.
1 of 1 people found the following review helpful
5.0 out of 5 stars Jurafsky and Martin March 14 2014
By Margaret Magnus - Published on Amazon.com
Format:Hardcover|Verified Purchase
I give J&M five stars and they deserve it, and here’s why. If you want to learn to write natural language software, no other single book is as good – at least I’ve not found it. In fact, I bet they invented the genre. Pulling this together is not easy, and they do a creditable job. I know a lot more than I did before I read this book, and I’ve been writing linguistic software for over 30 years. As a linguist writing software (as opposed to the other way around), one can feel just a tad under siege these days. Google advertises that they don’t have a single linguist on staff, and MS is ubiquitously quoted for saying that the quality of their software decreases for every linguist they hire… J&M, I’m happy to say, are above the fray. (What is ‘supervised’ machine learning? Oh yeah, that’s where your input was created by a linguist. Supervised or not, you’re just playing number games on the foundation of a theoretical framework invented by linguists.) They provide a balanced account with historical perspective. I like them. They’re cool.

So on to picking nits... which is way more fun. What I really wanted is to read this book and then be able to sit down and write my own Python implementation of the forward/backward algorithm to train an HMM. I bobbed along through the book, perhaps experiencing a little bit of fuzziness around those probabilities, and came full stop at ‘not quite ksi’ right smack in the middle of my HMM forward/backward section. I’d done a practice run by training a neural net in Andrew Ng’s machine learning course with Coursera. But I stared pretty hard for 3-4 hours at pages 189 and 190. And I mean I get it basically… Alpha and beta represent the accumulated wisdom coming from the front and from the back… And then you take a kind of average to go from not quite ksi to ksi. But there are too many assumptions hidden in P(X,Y|Z)/P(Y|Z). And this is an iterative algorithm, so how do you seed the counts? And I’m very annoyed by the phrase ‘note the different conditioning of O’. Okay, I can see the O is on the wrong side of the line. What does that mean? When I came to the next impasse, I didn’t try as hard. It’s already clear I’ll have to go elsewhere for the silver bullet. (The next impasse, btw was the cepstrum – what do you mean you leave the graph the same and just replace the x-axis with something totally unrelated? I’m no Stanford professor, but what kind of math is that? I’m sure it means something to somebody, but not to me.)
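
For anyone stuck at the same point, here is a minimal NumPy sketch of how ξ is obtained from α and β for a toy discrete-emission HMM. The random, row-normalized matrices are just one common way to seed the counts before the first EM iteration; the variable names and dimensions are illustrative and are not taken from pages 189-190.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy discrete-emission HMM: N hidden states, V observation symbols.
    N, V = 3, 4
    obs = np.array([0, 2, 1, 3, 2, 0])   # one observation sequence
    T = len(obs)

    # One common way to seed the counts: random, row-normalized matrices.
    A = rng.random((N, N))               # transition probabilities
    A /= A.sum(axis=1, keepdims=True)
    B = rng.random((N, V))               # emission probabilities
    B /= B.sum(axis=1, keepdims=True)
    pi = np.full(N, 1.0 / N)             # initial state distribution

    # Forward pass: alpha[t, i] = P(o_1..o_t, q_t = i)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Backward pass: beta[t, i] = P(o_{t+1}..o_T | q_t = i)
    beta = np.zeros((T, N))
    beta[T - 1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    # xi[t, i, j] = P(q_t = i, q_{t+1} = j | O): the "not quite ksi"
    # numerator divided by P(O), which is the sum of the final alphas.
    prob_O = alpha[T - 1].sum()
    xi = np.zeros((T - 1, N, N))
    for t in range(T - 1):
        xi[t] = (alpha[t][:, None] * A
                 * B[:, obs[t + 1]][None, :]
                 * beta[t + 1][None, :]) / prob_O

    # Sanity check: each xi[t] sums to 1 over all state pairs (i, j).
    assert np.allclose(xi.sum(axis=(1, 2)), 1.0)

From here, the M-step of Baum-Welch re-estimates each transition probability by summing ξ over time and normalizing each row.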

And drop the pseudo-code. If you’re deadly serious about teaching me the HMM, then write out a working implementation in full in a real language like C or Python with the variables all initialized so I can copy and paste the code into my debugger and watch what happens to the numbers as I step through. I suspect J&M of compromising the pedagogical value of the book by deliberately withholding information from those brilliant Stanford students of theirs so they have something to quiz them on at the end of the chapter. But this is a mistake. Give us the answers. Give us all the answers. Give us the actual code for the HMM and then explain it. I will read the explanation. I’ll have to read the explanation, because my neck is on the line if my code blows up. There will still be plenty of questions left over for those students.