Entropy and Information Theory

Author:
This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties.

New in this edition:
  • Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
  • Expanded discussion of results from ergodic theory relevant to information theory
  • Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources
  • New material on trading off information and distortion, including the Marton inequality
  • New material on the properties of optimal and asymptotically optimal source codes
  • New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.
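As a minimal sketch (standard textbook notation, not quoted from this book) of the sliding-block codes and B-processes highlighted above: a sliding-block code applies one fixed map f to a window of the source at every time index, and a B-process is the output of such a stationary code driven by a memoryless (i.i.d.) source.

% Standard sliding-block code definition, assuming a source {X_n} and window half-width m
\[
  Y_n = f(X_{n-m}, \ldots, X_n, \ldots, X_{n+m}), \qquad n \in \mathbb{Z}
\]
% The same map f is slid along the input, so the encoder commutes with the time shift,
% which is why such codes are also called stationary codes.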
ISBN
9781441979704
Language
English
Publication date
27.1.2011
Available electronic formats
  • PDF - Adobe DRM
Read the e-book here
  • E-book reader on mobile/tablet
  • E-reader
  • Computer