Unitext Ser.: Information Theory : Three Theorems of Claude Shannon by Antoine Chambert-Loir (2023, Trade Paperback)

Seller: Late Knight Books (93), 100% positive feedback
Price: $53.54 + $10.00 shipping
Estimated delivery: Fri, Sep 12 - Sat, Sep 20
Returns: No returns, but backed by the eBay Money Back Guarantee
Condition: Brand New
About this product

Product Identifiers

Publisher: Springer International Publishing AG
ISBN-10: 3031215605
ISBN-13: 9783031215605
eBay Product ID (ePID): 26058385943

Product Key Features

Number of Pages: xii, 209 pages
Publication Name: Information Theory : Three Theorems of Claude Shannon
Language: English
Publication Year: 2023
Subject: Information Theory, Computer Science, General, Applied
Type: Textbook
Author: Antoine Chambert-Loir
Subject Area: Mathematics, Computers, Science
Series: Unitext Ser.
Format: Trade Paperback

Dimensions

Item Weight: 15.1 oz
Item Length: 9.3 in
Item Width: 6.1 in

Additional Product Features

Reviews: "This book can be especially useful for those who are just getting to know the basics of information theory." (Eszter Gselmann, zbMATH 1526.94001, 2024)
Dewey Edition: 23
Series Volume Number: 144
Number of Volumes: 1 vol.
Illustrated: Yes
Dewey Decimal: 003.54
Original Language: French
Table of Contents: Elements of Theory of Probability.- Entropy and Mutual Information.- Coding.- Sampling.- Solutions to Exercises.- Bibliography.- Notation.- Index.
Synopsis: This book provides an introduction to information theory, focussing on Shannon's three foundational theorems of 1948-1949. Shannon's first two theorems, based on the notion of entropy in probability theory, specify the extent to which a message can be compressed for fast transmission and how to erase errors associated with poor transmission. The third theorem, using Fourier theory, ensures that a signal can be reconstructed from a sufficiently fine sampling of it. These three theorems constitute the roadmap of the book. The first chapter studies the entropy of a discrete random variable and related notions. The second chapter, on compression and error correction, introduces the concept of coding, proves the existence of optimal codes and good codes (Shannon's first theorem), and shows how information can be transmitted in the presence of noise (Shannon's second theorem). The third chapter proves the sampling theorem (Shannon's third theorem) and looks at its connections with other results, such as the Poisson summation formula. Finally, there is a discussion of the uncertainty principle in information theory. Featuring a good supply of exercises (with solutions), and an introductory chapter covering the prerequisites, this text stems from lectures given to mathematics/computer science students at the beginning graduate level.
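To give a flavour of the book's starting point, the entropy of a discrete random variable mentioned in the synopsis can be computed in a few lines. This is a generic illustrative sketch, not code from the book; the function name `entropy` and the example distributions are our own choices.

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits,
    of a discrete distribution given as a list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly one bit of information per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # ~0.469 bits
```

Shannon's first theorem says this quantity is precisely the compression limit: a source emitting symbols with this distribution cannot be losslessly encoded below H bits per symbol on average.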
LC Classification NumberQA76.9.M35