Elements of Information Theory(English, Hardcover, Cover Thomas M.)
Quick Overview
Elements Of Information Theory is a thought-provoking mix of statistics, mathematics, physics, and information theory.

Summary Of The Book

Elements Of Information Theory is a textbook containing all the essential topics and concepts required by students and teachers to complete a full course on information theory. The second edition has been rigorously edited and reorganized to suit the demands of modern academia. It is intended for undergraduate students with some basic knowledge of the subject. Topics such as entropy, hypothesis testing, channel capacity, data compression, and rate distortion are presented in adequate detail. Every chapter and subtopic begins by laying down the theory and builds up to real-life applications of the concepts covered. The telegraphic summaries and historical notes at the end of each chapter help readers retain what they have read. Given its structure, the book is ideal not only for students of information theory but for anyone connected with the field, such as electrical engineers, statisticians, and students of telecommunications who wish to pursue graduate-level studies. The focus is on giving readers a strong foundation from which any further study of the subject can take off without a hitch. This edition also comes with 200 new problems to bring the mathematical side of the text up to speed with the rest of the book. Elements Of Information Theory has been a classic text on information theory since its publication. It has been well received by the academic world and is regularly used as a core textbook for courses on information theory.

About The Authors

Thomas M. Cover was a professor at Stanford University in the departments of Electrical Engineering and Statistics. He co-edited the book Open Problems In Communication And Computation. He was president of the IEEE Information Theory Society and a fellow of the Institute of Mathematical Statistics. For his work in information theory, he was selected as the Shannon Lecturer in 1990 and was awarded the IEEE Richard W. Hamming Medal. Cover was also the author of over 100 papers published during his time as a professor at Stanford. Joy A. Thomas is a Chief Scientist at Stratify. He received his Ph.D. from Stanford and worked at IBM for over nine years. He was awarded the IEEE Charles LeGeyt Fortescue Fellowship.