2 Shannon Fano Encoding (Algorithm, Procedure & Example) Explained in Digital Communication
Summary
TLDR: In this video, the speaker explains the Shannon-Fano encoding technique through a worked example. The process lists symbols with their probabilities in descending order, repeatedly divides them into two subsets of nearly equal total probability, and assigns binary digits to each subset. The speaker then walks through calculating the entropy (H), the average codeword length (H̄), the efficiency, and the redundancy. For this example the entropy and the average codeword length both come to 2.75 bits per symbol, showing how Shannon-Fano encoding minimizes redundancy and maximizes code efficiency.
Takeaways
- Shannon-Fano encoding follows these steps: list the symbols in descending order of probability, divide them into two subsets whose total probabilities are as nearly equal as possible, assign a bit to each subset, and repeat within each subset until every symbol has its own codeword.
- In the given example, the eight symbols S1 through S8 are assigned the probabilities 1/4, 1/4, 1/8, 1/8, 1/16, 1/16, 1/16, and 1/16.
- The first step in Shannon-Fano is to list the symbols in descending order of probability; in this example the list was already given in descending order.
- The message set is divided into two subsets, X and Y, of nearly equal total probability, with bit 0 assigned to subset X and bit 1 to subset Y.
- This process continues recursively, dividing each subset further until each symbol stands alone and its accumulated bits form its codeword.
- The length of each codeword is known once all the subsets have been divided; in this example some codewords are 2 bits long, others 3 or 4 bits.
- The codeword lengths are as follows: S1 = 2, S2 = 2, S3 = 3, S4 = 3, S5 = 4, S6 = 4, S7 = 4, S8 = 4.
- The entropy (H) of the source is computed with the formula H = Σ P_i · log2(1/P_i), which yields 2.75 bits per symbol.
- The average codeword length (H̄) is computed by multiplying the probability of each symbol by the length of its codeword and summing; it also equals 2.75 bits per symbol.
- Since the entropy and the average codeword length are equal, the efficiency is 1 and the redundancy is 0 for this example (a Python sketch reproducing the full calculation follows this list).
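To make the procedure concrete, here is a minimal Python sketch, not from the video: the function names (shannon_fano, divide) and the split heuristic are illustrative assumptions. It sorts-by-probability input as given, performs the recursive nearly-equal-probability split, assigns bits, and computes H, H̄, efficiency, and redundancy for the example above.

```python
from math import log2

def shannon_fano(symbols):
    """symbols: list of (name, probability), already sorted by descending
    probability. Returns a dict mapping each name to its binary codeword."""
    codes = {name: "" for name, _ in symbols}

    def divide(group, prefix):
        if len(group) <= 1:
            if group:
                codes[group[0][0]] = prefix   # single symbol: codeword complete
            return
        total = sum(p for _, p in group)
        # Find the split point that makes the two subset probabilities
        # as nearly equal as possible.
        running, split, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(total - 2 * running)   # |left total - right total|
            if diff < best_diff:
                best_diff, split = diff, i
        divide(group[:split], prefix + "0")   # subset X gets bit 0
        divide(group[split:], prefix + "1")   # subset Y gets bit 1

    divide(symbols, "")
    return codes

symbols = [("S1", 1/4), ("S2", 1/4), ("S3", 1/8), ("S4", 1/8),
           ("S5", 1/16), ("S6", 1/16), ("S7", 1/16), ("S8", 1/16)]
codes = shannon_fano(symbols)

H = sum(p * log2(1 / p) for _, p in symbols)          # entropy
L = sum(p * len(codes[s]) for s, p in symbols)        # average codeword length
print(codes)                  # S1 -> '00', S3 -> '100', S5 -> '1100', ...
print(H, L, H / L, 1 - H / L) # 2.75, 2.75, efficiency 1.0, redundancy 0.0
```

Running this reproduces the codeword lengths listed above (2, 2, 3, 3, 4, 4, 4, 4) and the values H = H̄ = 2.75 bits per symbol.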
Q & A
What is the first step in the Shannon-Fano encoding process?
-The first step is to list all the symbols and their corresponding probabilities in descending order.
How do you divide the symbols in Shannon-Fano encoding?
-Symbols are divided into two subsets (X and Y) such that the total probabilities of the two subsets are as equal as possible. Subset X is assigned bit 0, and subset Y is assigned bit 1.
Why is the order of symbols important in the Shannon-Fano encoding algorithm?
-The order of symbols must be in descending probability to ensure that the most frequent symbols are given shorter codewords, leading to more efficient encoding.
What happens when a subset contains more than one symbol?
-Any subset that still contains more than one symbol is recursively divided into two smaller subsets, each assigned a further bit, until every symbol has a unique code.
What is the purpose of calculating the efficiency in Shannon-Fano encoding?
-The efficiency measures how well the encoding performs. It is the ratio of the source entropy (H) to the average codeword length (H̄); the closer it is to 1, the better the encoding.
How is entropy (H) calculated in Shannon-Fano encoding?
-Entropy is calculated using the formula H = Σ P_i · log2(1/P_i), where P_i is the probability of each symbol.
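Plugging the example's probabilities into this formula (a worked check, following directly from the numbers in the takeaways):

H = 2 · (1/4) · log2(4) + 2 · (1/8) · log2(8) + 4 · (1/16) · log2(16) = 1 + 0.75 + 1 = 2.75 bits per symbol.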
What is the significance of the average codeword length (H')?
-The average codeword length (H̄) is the weighted average of the lengths of all the codewords, where the weights are the probabilities of the corresponding symbols: H̄ = Σ P_i · L_i.
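For the example, with the codeword lengths listed in the takeaways (another worked check):

H̄ = 2 · (1/4 · 2) + 2 · (1/8 · 3) + 4 · (1/16 · 4) = 1 + 0.75 + 1 = 2.75 bits per symbol.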
What is the redundancy in Shannon-Fano encoding, and how is it calculated?
-Redundancy is the difference between the maximum possible efficiency (1) and the actual efficiency of the encoding. It is calculated as: Redundancy = 1 - Efficiency.
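Applied to the example: efficiency = H / H̄ = 2.75 / 2.75 = 1, so redundancy = 1 − 1 = 0.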
Why was there no redundancy in the example provided?
-In the example, the efficiency was calculated to be 1, meaning the encoding is perfectly efficient and there is no redundancy in the code.
How does the Shannon-Fano algorithm compare to other encoding techniques like Huffman coding?
-Shannon-Fano and Huffman coding are both entropy-based techniques that assign shorter codes to more frequent symbols. However, Huffman coding always produces an optimal prefix code, one with the minimum possible average codeword length for the given probabilities, while Shannon-Fano can produce a slightly longer code.
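To illustrate the difference, here is a short Python sketch, not from the video: huffman_lengths and the chosen probability set are illustrative assumptions. It computes only the codeword lengths Huffman coding would assign, using Python's heapq, on a classic five-symbol probability set where the two algorithms disagree.

```python
import heapq
import itertools

def huffman_lengths(probs):
    """Return the Huffman codeword length for each probability (same order)."""
    counter = itertools.count()  # tie-breaker so equal floats never compare lists
    # Each heap entry: (probability, tiebreak, list of symbol indices in subtree)
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1               # every merged symbol gains one bit of depth
        heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
    return lengths

probs = [0.35, 0.17, 0.17, 0.16, 0.15]
huff = huffman_lengths(probs)             # lengths [1, 3, 3, 3, 3]
print(sum(p * l for p, l in zip(probs, huff)))   # ~2.30 bits/symbol
# Shannon-Fano with the nearly-equal split used earlier gives lengths
# [2, 2, 2, 3, 3] on this set, i.e. 2.31 bits/symbol -- slightly worse.
```

This is why Huffman coding is guaranteed optimal while Shannon-Fano is only near-optimal: on the example in the video the two happen to coincide, but on sets like this one they do not.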