Information Theory

Course Outline


I. Introduction to Information Theory
A. Overview and Significance of Information Theory
B. Basic Concepts: Information, Entropy, Redundancy, Noise
C. Understanding Bits, Bytes, and Binary Data


II. Quantifying Information
A. Measure of Information: Self-Information, Mutual Information
B. Information Sources and Source Coding
C. Entropy and its Properties


III. Source Coding Theorem
A. Understanding the Source Coding Theorem
B. Data Compression: Lossless and Lossy Compression
C. Huffman Coding and its Applications


IV. Channel Capacity and Coding
A. Definition and Determination of Channel Capacity
B. Error Detection and Correction: Parity Check, Hamming Code
C. Channel Coding Theorem and its Implications


V. Noiseless Coding
A. Shannon's First Theorem
B. Kraft-McMillan Inequality
C. Optimal Codes: Prefix Codes, Suffix Codes, Uniquely Decodable Codes


VI. Noisy-Channel Coding
A. Shannon's Second Theorem
B. Error-Detecting and Error-Correcting Codes
C. Convolutional Codes and Reed-Solomon Codes


VII. Information Theory in Communication
A. Information Theoretical Analysis of Communication Systems
B. Digital Communication and Information Theory
C. Modulation and Demodulation: ASK, FSK, PSK


VIII. Advanced Concepts in Information Theory
A. Network Information Theory
B. Multi-User Information Theory
C. Quantum Information Theory


IX. Information Theory and Cryptography
A. Basic Cryptographic Concepts: Encryption, Decryption, Keys, Cipher
B. Information Theoretic Security: One-Time Pad, Perfect Secrecy
C. Limitations of Information Theory in Cryptography


X. Applications of Information Theory
A. Information Theory in Data Compression
B. Information Theory in Machine Learning and AI
C. Information Theory in Bioinformatics


XI. Review and Advanced Topics Discussion


Textbook: "Elements of Information Theory" by Thomas M. Cover and Joy A. Thomas.

1. Introduction to Information Theory


We will discuss the overview and significance of information theory, introduce some basic concepts, and dive into understanding bits, bytes, and binary data.


A. Overview and Significance of Information Theory


Information theory is a branch of applied mathematics and electrical engineering concerned with the quantification, storage, and communication of information. It was established by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication," and it is widely used in telecommunications, computer science, and data science.


The theory tells us how to encode data so that it can be stored and transmitted efficiently and accurately, and it establishes the fundamental limits on signal processing operations such as compressing data and reliably storing and communicating it.


B. Basic Concepts: Information, Entropy, Redundancy, Noise


  • Information: Information in this context refers to the data that is being stored or transmitted. It could be in any form, such as text, numbers, images, etc.

  • Entropy: In information theory, entropy is a measure of the average uncertainty in a random variable — equivalently, the average amount of information produced per outcome. For a discrete random variable X with outcome probabilities p(x), the entropy is H(X) = −Σ p(x) log₂ p(x), measured in bits. The concept of entropy is central to information theory.

  • Redundancy: Redundancy is the part of a message that is not strictly needed to convey its information content — for example, deliberately repeated or added symbols. In communication, redundancy allows a receiver to detect and often correct errors without needing to ask the sender for a retransmission.

  • Noise: In the context of information theory, noise refers to anything that disrupts or reduces the clarity of a message. It could be literal audible noise, electromagnetic interference, or even errors in data transmission.
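The entropy definition above can be made concrete with a short sketch (plain Python, standard library only; the example distributions are illustrative, not from the text) that computes H(X) = −Σ p(x) log₂ p(x) for a discrete distribution:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution.

    `probs` lists probabilities summing to 1; outcomes with p == 0
    contribute nothing, since p * log2(p) tends to 0 as p tends to 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # → ~0.469

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # → 0.0
```

Note how entropy peaks for the uniform distribution and drops to zero as the outcome becomes certain — this is the sense in which entropy measures uncertainty.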


C. Understanding Bits, Bytes, and Binary Data


Bits: A bit is the most basic unit of information in computing and digital communications; the name is a portmanteau of "binary digit." A bit represents a logical state with one of two possible values, most commonly written as "1" or "0".


Bytes: A byte is a unit of digital information that most commonly consists of eight bits. Historically, the byte was the number of bits used to encode a single character of text, and it remains the smallest addressable unit of memory in many computer systems.


Binary Data: "Binary" means composed of two parts; binary data is data represented using only two possible states, conventionally written 0 and 1. In computing and telecommunications, binary codes are used to encode data, such as character strings, into bit strings.
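To tie bits, bytes, and binary data together, the snippet below (plain Python; the sample text is an arbitrary illustration) shows how a text character becomes one byte and how that byte decomposes into eight bits:

```python
# A single text character encoded as one byte (ASCII, which UTF-8 preserves).
data = "A".encode("utf-8")        # b'A' — a sequence of one byte
byte_value = data[0]              # the integer value of that byte: 65

# A byte is eight bits; format it as a binary string to see them.
bits = format(byte_value, "08b")  # '01000001'
print(byte_value, bits)

# Binary data in general: any message becomes a bit string.
message = "Hi".encode("utf-8")
bit_string = "".join(format(b, "08b") for b in message)
print(bit_string)                 # '0100100001101001'
```

This is exactly the representation that the source and channel codes later in the course operate on.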
