This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to the Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, as well as information and distortion measures and their properties.
Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and for d-bar continuous channels.
Entropy and Information Theory
Second Edition
Robert M. Gray
Department of Electrical Engineering
Stanford University
Stanford, CA 94305-9510
USA
[email protected]
ISBN 978-1-4419-7969-8
e-ISBN 978-1-4419-7970-4
DOI 10.1007/978-1-4419-7970-4
Springer New York Dordrecht Heidelberg London
Library of Congress Control Number: 2011920808

© Springer Science+Business Media, LLC 2011

All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, LLC, 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden.

The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)
to Tim, Lori, Julia, Peter, Gus, Amy, and Alice and in memory of Tino
Preface
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels, with a strong emphasis on source coding and stationary codes. The eventual goal is a general development of Shannon's mathematical theory of communication for single-user systems, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems, especially the notions of sources, channels, codes, entropy, information, and the entropy ergodic theorem. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and relative entropy (discrimination, Kullback-Leibler information, informational divergence), along with the limiting normalized versions of these quantities such as entropy rate and information rate. In addition to information