Common methods for communicating over such channels employ a feedback channel from receiver to sender that is used to control the retransmission of erased packets. For example, the receiver might send back messages that identify the missing packets, which are then retransmitted. Alternatively, the receiver might send back messages that acknowledge each received packet; the sender keeps track of which packets have been acknowledged and retransmits the others until all packets have been acknowledged.
These simple retransmission protocols have the advantage that they will work regardless of the erasure probability f, but purists who have learned their Shannon theory will feel that these retransmission protocols are wasteful. If the erasure probability f is large, the number of feedback messages sent by the first protocol will be large. Under the second protocol, it's likely that the receiver will end up receiving multiple redundant copies of some packets, and heavy use is made of the feedback channel. According to Shannon, there is no need for the feedback channel: the capacity of the forward channel is (1 – f)l bits, whether or not we have feedback.
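To make these claims concrete, here is a minimal simulation sketch (in Python; the function name and parameters are illustrative, not from the source) of the acknowledgement-based protocol over a memoryless erasure channel with idealised, instant feedback. For K packets, the forward-channel usage concentrates near K/(1 - f) transmissions, consistent with the capacity statement, while the feedback channel still carries one acknowledgement per packet:

```python
import random

def simulate_ack_protocol(num_packets, f, seed=0):
    """Acknowledgement-based retransmission over an erasure channel.

    Each forward transmission is erased independently with probability f;
    the sender repeats a packet until it is acknowledged (feedback is
    assumed instant and error-free). Returns the number of forward
    transmissions and feedback messages used."""
    rng = random.Random(seed)
    forward = feedback = 0
    for _ in range(num_packets):
        delivered = False
        while not delivered:
            forward += 1
            if rng.random() >= f:   # this copy got through
                feedback += 1       # receiver acknowledges it
                delivered = True
    return forward, feedback

K, f = 100_000, 0.3
fwd, fb = simulate_ack_protocol(K, f)
print(f"forward uses : {fwd} (capacity bound K/(1-f) = {K/(1-f):.0f})")
print(f"feedback msgs: {fb}")
```

Note that with idealised feedback the simulated protocol already meets the forward-channel bound in expectation; the inefficiency described above lies in the feedback traffic itself, and in the redundant copies that arise once acknowledgements are delayed or lost.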
Efficient coding. Shannon coding for the noiseless channel.
Enhancements to soft K-means
Entropy. Basic properties of entropy.
Exact inference for continuous hypothesis spaces
Explain the differences between the levels of communication problems.
Finding the lowest-cost path
Functional diagram of an information transmission system and the purpose of its components.
Further applications of arithmetic coding
Gaussian distribution
Generalized parity-check matrices
Give the definition of a random process that is stationary in the strict sense and in the wide sense.
Hash codes
How much can we compress?
How to measure the information content of a random variable?
Inferring the input to a real channel
Inferring the mean and variance of a Gaussian distribution
Information content defined in terms of lossy compression
Information content of independent random variables
Information conveyed by a channel
Information types
Introduction to convolutional codes
Joint entropy
Jointly-typical sequences
K-means clustering
Lempel–Ziv coding
Low-Density Parity-Check Codes
Maximum Likelihood and Clustering
Maximum likelihood for a mixture of Gaussians
Maximum likelihood for one Gaussian
Message classification
Message Passing
More on trellises
More than two variables
Noise and distortion in information transmission channels.
Noisy channels
Optimal source coding with symbol codes: Huffman coding
Other roles for hash codes
Parity-check matrices of convolutional codes and turbo codes
Periods of information circulation
Pictorial demonstration of Gallager codes
Planning for collisions
Probabilities and Inference
Relative entropy and mutual information
Repeat–Accumulate Codes
Review of probability and information
Simple language models
Soft K-means clustering
Solving the decoding problems on a trellis
Models of discrete message sources.
Symbol codes
The binary entropy function
The burglar alarm
The capacity of a continuous information transmission channel.
The decoder
The differential entropy and its properties.
The entropy of a discrete source. Total and partial entropy.
The Gaussian channel
The general problem
The information-retrieval problem
The junction tree algorithm
The main types of signals used in the transmission of information.
The min–sum algorithm
The noisy-channel coding theorem
The Noisy-Channel Coding Theorem
What set of objects makes up an information transmission system?
The sum–product algorithm
Turbo codes
Typicality
Units of information content
What are the capabilities of practical error-correcting codes?
What are the different forms of signal representation models?
What are the main problems of information theory?
What are the main stages of information processing?
What is an information system?
What is meant by a message and a signal?
What is meant by a centered random process?
What is the difference between a communication line and a communication channel?
What is the difficulty of an exact mathematical description of a random process?
What is the essence of the fundamental differences in interpreting the concept of information?