Information and communication theory / Stefan Höst

By: Höst, Stefan [author]
Language: English
Series: IEEE Series on Digital & Mobile Communication
Publisher: Piscataway, NJ: Wiley-IEEE Press, 2019
Description: 1 online resource
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9781119433828
Genre/Form: Electronic books.
Online resources: Full text available at Wiley Online Library
Contents:
Preface ix
Chapter 1 Introduction 1
Chapter 2 Probability Theory 5
2.1 Probabilities 5
2.2 Random Variable 7
2.3 Expectation and Variance 9
2.4 The Law of Large Numbers 17
2.5 Jensen's Inequality 21
2.6 Random Processes 25
2.7 Markov Process 28
Problems 33
Chapter 3 Information Measures 37
3.1 Information 37
3.2 Entropy 41
3.3 Mutual Information 48
3.4 Entropy of Sequences 58
Problems 63
Chapter 4 Optimal Source Coding 69
4.1 Source Coding 69
4.2 Kraft Inequality 71
4.3 Optimal Codeword Length 80
4.4 Huffman Coding 84
4.5 Arithmetic Coding 95
Problems 101
Chapter 5 Adaptive Source Coding 105
5.1 The Problem with Unknown Source Statistics 105
5.2 Adaptive Huffman Coding 106
5.3 The Lempel-Ziv Algorithms 112
5.4 Applications of Source Coding 125
Problems 129
Chapter 6 Asymptotic Equipartition Property and Channel Capacity 133
6.1 Asymptotic Equipartition Property 133
6.2 Source Coding Theorem 138
6.3 Channel Coding 141
6.4 Channel Coding Theorem 144
6.5 Derivation of Channel Capacity for DMC 155
Problems 164
Chapter 7 Channel Coding 169
7.1 Error-Correcting Block Codes 170
7.2 Convolutional Code 188
7.3 Error-Detecting Codes 203
Problems 210
Chapter 8 Information Measures For Continuous Variables 213
8.1 Differential Entropy and Mutual Information 213
8.2 Gaussian Distribution 224
Problems 232
Chapter 9 Gaussian Channel 237
9.1 Gaussian Channel 237
9.2 Parallel Gaussian Channels 244
9.3 Fundamental Shannon Limit 256
Problems 260
Chapter 10 Discrete Input Gaussian Channel 265
10.1 M-PAM Signaling 265
10.2 A Note on Dimensionality 271
10.3 Shaping Gain 276
10.4 SNR Gap 281
Problems 285
Chapter 11 Information Theory and Distortion 289
11.1 Rate-Distortion Function 289
11.2 Limit for Fix Pb 300
11.3 Quantization 302
11.4 Transform Coding 306
Problems 319
Appendix A Probability Distributions 323
A.1 Discrete Distributions 323
A.2 Continuous Distributions 327
Appendix B Sampling Theorem 337
B.1 The Sampling Theorem 337
Bibliography 343
Index 347
Summary:
An important text that offers an in-depth guide to how information theory sets the boundaries for data communication.

In an accessible and practical style, Information and Communication Theory explores the topic of information theory and includes concrete tools that are appropriate for real-life communication systems. The text investigates the connection between theoretical and practical applications through a wide variety of topics, including an introduction to the basics of probability theory, information, (lossless) source coding, typical sequences as a central concept, channel coding, continuous random variables, Gaussian channels, discrete input continuous channels, and a brief look at rate distortion theory.

The author explains the fundamental theory together with typical compression algorithms and how they are used in practice. He reviews source coding and how much a source can be compressed, and explains algorithms such as the LZ family, with applications to formats such as zip and png. In addition to exploring the channel coding theorem, the book includes illustrative examples of codes. This comprehensive text:

Provides an adaptive version of Huffman coding that estimates the source distribution
Contains a series of problems that deepen understanding of the material presented in the text
Covers a variety of topics including optimal source coding, channel coding, modulation, and much more
Includes appendices that explore probability distributions and the sampling theorem

Written for graduate and undergraduate students studying information theory, as well as master's students and professional engineers, Information and Communication Theory offers an introduction to how information theory sets the boundaries for data communication.
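The compression limit the summary refers to is the source entropy: no uniquely decodable binary code can have an average codeword length below H(X) = -Σ p(x) log₂ p(x), and the Huffman codes of Chapter 4 come within one bit of it. The following minimal Python sketch, which is an illustration and not taken from the book, computes the entropy of a distribution and compares it with the average length of a standard binary Huffman code:

import heapq
from math import log2

def entropy(probs):
    """Entropy in bits per symbol: H(X) = -sum p log2 p."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given distribution."""
    # Heap items: (probability, unique tiebreaker, symbol indices in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:  # every merge adds one bit to each member's codeword
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H(X) = {H:.3f} bits, Huffman average length = {L:.3f} bits")
# For dyadic probabilities like these, the code meets the bound exactly: L = H = 1.75.

For non-dyadic distributions the Huffman average length lands strictly between H(X) and H(X) + 1, which is why the book moves on to arithmetic and adaptive schemes for sources whose statistics are awkward or unknown.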
Holdings:
Item type: EBOOK
Current location: COLLEGE LIBRARY, LIC Gateway
Home library: COLLEGE LIBRARY
Call number: 001.218 H7971 2019
Status: Available
Barcode: CL-50357
Total holds: 0

Author Bio
STEFAN HÖST is a Senior Lecturer in the Department of Electrical and Information Technology at Lund University, where he is active in the Broadband Communication research group. As a teacher, he has been responsible for several master's-level courses in communication technology. His main research area is access networks viewed from the physical layer. He is a frequent contributor to IEEE journals and conferences.


