1 327 kr
Special-order item. Ships within 7-10 business days. Free shipping on orders over 249 kr.
Description
Product information
- Publication date: 2006-09-08
- Dimensions: 163 x 242 x 40 mm
- Weight: 1,120 g
- Format: Hardcover
- Language: English
- Number of pages: 784
- Edition: 2
- Publisher: John Wiley & Sons Inc
- ISBN: 9780471241959
More about the authors
THOMAS M. COVER, PhD, is Professor in the departments of electrical engineering and statistics at Stanford University. A recipient of the 1991 IEEE Claude E. Shannon Award, Dr. Cover is a past president of the IEEE Information Theory Society, a Fellow of the IEEE and the Institute of Mathematical Statistics, and a member of the National Academy of Engineering and the American Academy of Arts and Sciences. He has authored more than 100 technical papers and is coeditor of Open Problems in Communication and Computation.

JOY A. THOMAS, PhD, is the Chief Scientist at Stratify, Inc., a Silicon Valley start-up specializing in organizing unstructured information. After receiving his PhD at Stanford, Dr. Thomas spent more than nine years at the IBM T. J. Watson Research Center in Yorktown Heights, New York. He is a recipient of the IEEE Charles LeGeyt Fortescue Fellowship.
Media reviews
"As expected, the quality of exposition continues to be a high point of the book. Clear explanations, nice graphical illustrations, and illuminating mathematical derivations make the book particularly useful as a textbook on information theory." (Journal of the American Statistical Association, March 2008) "This book is recommended reading, both as a textbook and as a reference." (Computing Reviews.com, December 28, 2006)
Table of contents
- Preface to the Second Edition
- Preface to the First Edition
- Acknowledgments for the Second Edition
- Acknowledgments for the First Edition
- 1 Introduction and Preview
- 1.1 Preview of the Book
- 2 Entropy, Relative Entropy, and Mutual Information
- 2.1 Entropy
- 2.2 Joint Entropy and Conditional Entropy
- 2.3 Relative Entropy and Mutual Information
- 2.4 Relationship Between Entropy and Mutual Information
- 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information
- 2.6 Jensen's Inequality and Its Consequences
- 2.7 Log Sum Inequality and Its Applications
- 2.8 Data-Processing Inequality
- 2.9 Sufficient Statistics
- 2.10 Fano's Inequality
- Summary; Problems; Historical Notes
- 3 Asymptotic Equipartition Property
- 3.1 Asymptotic Equipartition Property Theorem
- 3.2 Consequences of the AEP: Data Compression
- 3.3 High-Probability Sets and the Typical Set
- Summary; Problems; Historical Notes
- 4 Entropy Rates of a Stochastic Process
- 4.1 Markov Chains
- 4.2 Entropy Rate
- 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph
- 4.4 Second Law of Thermodynamics
- 4.5 Functions of Markov Chains
- Summary; Problems; Historical Notes
- 5 Data Compression
- 5.1 Examples of Codes
- 5.2 Kraft Inequality
- 5.3 Optimal Codes
- 5.4 Bounds on the Optimal Code Length
- 5.5 Kraft Inequality for Uniquely Decodable Codes
- 5.6 Huffman Codes
- 5.7 Some Comments on Huffman Codes
- 5.8 Optimality of Huffman Codes
- 5.9 Shannon–Fano–Elias Coding
- 5.10 Competitive Optimality of the Shannon Code
- 5.11 Generation of Discrete Distributions from Fair Coins
- Summary; Problems; Historical Notes
- 6 Gambling and Data Compression
- 6.1 The Horse Race
- 6.2 Gambling and Side Information
- 6.3 Dependent Horse Races and Entropy Rate
- 6.4 The Entropy of English
- 6.5 Data Compression and Gambling
- 6.6 Gambling Estimate of the Entropy of English
- Summary; Problems; Historical Notes
- 7 Channel Capacity
- 7.1 Examples of Channel Capacity
- 7.1.1 Noiseless Binary Channel
- 7.1.2 Noisy Channel with Nonoverlapping Outputs
- 7.1.3 Noisy Typewriter
- 7.1.4 Binary Symmetric Channel
- 7.1.5 Binary Erasure Channel
- 7.2 Symmetric Channels
- 7.3 Properties of Channel Capacity
- 7.4 Preview of the Channel Coding Theorem
- 7.5 Definitions
- 7.6 Jointly Typical Sequences
- 7.7 Channel Coding Theorem
- 7.8 Zero-Error Codes
- 7.9 Fano's Inequality and the Converse to the Coding Theorem
- 7.10 Equality in the Converse to the Channel Coding Theorem
- 7.11 Hamming Codes
- 7.12 Feedback Capacity
- 7.13 Source–Channel Separation Theorem
- Summary; Problems; Historical Notes
- 8 Differential Entropy
- 8.1 Definitions
- 8.2 AEP for Continuous Random Variables
- 8.3 Relation of Differential Entropy to Discrete Entropy
- 8.4 Joint and Conditional Differential Entropy
- 8.5 Relative Entropy and Mutual Information
- 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information
- Summary; Problems; Historical Notes
- 9 Gaussian Channel
- 9.1 Gaussian Channel: Definitions
- 9.2 Converse to the Coding Theorem for Gaussian Channels
- 9.3 Bandlimited Channels
- 9.4 Parallel Gaussian Channels
- 9.5 Channels with Colored Gaussian Noise
- 9.6 Gaussian Channels with Feedback
- Summary; Problems; Historical Notes
- 10 Rate Distortion Theory
- 10.1 Quantization
- 10.2 Definitions
- 10.3 Calculation of the Rate Distortion Function
- 10.3.1 Binary Source
- 10.3.2 Gaussian Source
- 10.3.3 Simultaneous Description of Independent Gaussian Random Variables
- 10.4 Converse to the Rate Distortion Theorem
- 10.5 Achievability of the Rate Distortion Function
- 10.6 Strongly Typical Sequences and Rate Distortion
- 10.7 Characterization of the Rate Distortion Function
- 10.8 Computation of Channel Capacity and the Rate Distortion Function
- Summary; Problems; Historical Notes
- 11 Information Theory and Statistics
- 11.1 Method of Types
- 11.2 Law of Large Numbers
- 11.3 Universal Source Coding
- 11.4 Large Deviation Theory
- 11.5 Examples of Sanov's Theorem
- 11.6 Conditional Limit Theorem
- 11.7 Hypothesis Testing
- 11.8 Chernoff–Stein Lemma
- 11.9 Chernoff Information
- 11.10 Fisher Information and the Cramér–Rao Inequality
- Summary; Problems; Historical Notes
- 12 Maximum Entropy
- 12.1 Maximum Entropy Distributions
- 12.2 Examples
- 12.3 Anomalous Maximum Entropy Problem
- 12.4 Spectrum Estimation
- 12.5 Entropy Rates of a Gaussian Process
- 12.6 Burg's Maximum Entropy Theorem
- Summary; Problems; Historical Notes
- 13 Universal Source Coding
- 13.1 Universal Codes and Channel Capacity
- 13.2 Universal Coding for Binary Sequences
- 13.3 Arithmetic Coding
- 13.4 Lempel–Ziv Coding
- 13.4.1 Sliding Window Lempel–Ziv Algorithm
- 13.4.2 Tree-Structured Lempel–Ziv Algorithms
- 13.5 Optimality of Lempel–Ziv Algorithms
- 13.5.1 Sliding Window Lempel–Ziv Algorithms
- 13.5.2 Optimality of Tree-Structured Lempel–Ziv Compression
- Summary; Problems; Historical Notes
- 14 Kolmogorov Complexity
- 14.1 Models of Computation
- 14.2 Kolmogorov Complexity: Definitions and Examples
- 14.3 Kolmogorov Complexity and Entropy
- 14.4 Kolmogorov Complexity of Integers
- 14.5 Algorithmically Random and Incompressible Sequences
- 14.6 Universal Probability
- 14.7 Kolmogorov Complexity
- 14.8 Ω
- 14.9 Universal Gambling
- 14.10 Occam's Razor
- 14.11 Kolmogorov Complexity and Universal Probability
- 14.12 Kolmogorov Sufficient Statistic
- 14.13 Minimum Description Length Principle
- Summary; Problems; Historical Notes
- 15 Network Information Theory
- 15.1 Gaussian Multiple-User Channels
- 15.1.1 Single-User Gaussian Channel
- 15.1.2 Gaussian Multiple-Access Channel with m Users
- 15.1.3 Gaussian Broadcast Channel
- 15.1.4 Gaussian Relay Channel
- 15.1.5 Gaussian Interference Channel
- 15.1.6 Gaussian Two-Way Channel
- 15.2 Jointly Typical Sequences
- 15.3 Multiple-Access Channel
- 15.3.1 Achievability of the Capacity Region for the Multiple-Access Channel
- 15.3.2 Comments on the Capacity Region for the Multiple-Access Channel
- 15.3.3 Convexity of the Capacity Region of the Multiple-Access Channel
- 15.3.4 Converse for the Multiple-Access Channel
- 15.3.5 m-User Multiple-Access Channels
- 15.3.6 Gaussian Multiple-Access Channels
- 15.4 Encoding of Correlated Sources
- 15.4.1 Achievability of the Slepian–Wolf Theorem
- 15.4.2 Converse for the Slepian–Wolf Theorem
- 15.4.3 Slepian–Wolf Theorem for Many Sources
- 15.4.4 Interpretation of Slepian–Wolf Coding
- 15.5 Duality Between Slepian–Wolf Encoding and Multiple-Access Channels
- 15.6 Broadcast Channel
- 15.6.1 Definitions for a Broadcast Channel
- 15.6.2 Degraded Broadcast Channels
- 15.6.3 Capacity Region for the Degraded Broadcast Channel
- 15.7 Relay Channel
- 15.8 Source Coding with Side Information
- 15.9 Rate Distortion with Side Information
- 15.10 General Multiterminal Networks
- Summary; Problems; Historical Notes
- 16 Information Theory and Portfolio Theory
- 16.1 The Stock Market: Some Definitions
- 16.2 Kuhn–Tucker Characterization of the Log-Optimal Portfolio
- 16.3 Asymptotic Optimality of the Log-Optimal Portfolio
- 16.4 Side Information and the Growth Rate
- 16.5 Investment in Stationary Markets
- 16.6 Competitive Optimality of the Log-Optimal Portfolio
- 16.7 Universal Portfolios
- 16.7.1 Finite-Horizon Universal Portfolios
- 16.7.2 Horizon-Free Universal Portfolios
- 16.8 Shannon–McMillan–Breiman Theorem (General AEP)
- Summary; Problems; Historical Notes
- 17 Inequalities in Information Theory
- 17.1 Basic Inequalities of Information Theory
- 17.2 Differential Entropy
- 17.3 Bounds on Entropy and Relative Entropy
- 17.4 Inequalities for Types
- 17.5 Combinatorial Bounds on Entropy
- 17.6 Entropy Rates of Subsets
- 17.7 Entropy and Fisher Information
- 17.8 Entropy Power Inequality and Brunn–Minkowski Inequality
- 17.9 Inequalities for Determinants
- 17.10 Inequalities for Ratios of Determinants
- Summary; Problems; Historical Notes
- Bibliography
- List of Symbols
- Index
You may also be interested in
Optimering : metoder, modeller och teori för linjära, olinjära och kombinatoriska problem
Kaj Holmberg
892 kr
- Signed!
SIGNED - Där färgen får styra : En berättelse i akvarell
Maximilian Svensson, Linda Newnham
319 kr