DSA Studio
Huffman Coding Checkpoint
Tree construction, canonical codes, and optimality.
1. Huffman coding is optimal for:
Known symbol probabilities
Any unknown distribution
Only uniform distributions
Only binary alphabets
2. If only one symbol appears, its Huffman code length should be:
0
1
2
Depends on frequency
3. Canonical Huffman codes are determined by:
Code lengths and symbol order
Tree shape
Random tie-breaking
Symbol frequencies only
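To make question 3 concrete: a minimal sketch of canonical code assignment, in which codes are derived purely from code lengths plus a fixed symbol order (the function name and input shape are illustrative, not from any particular library).

```python
def canonical_codes(lengths):
    """Assign canonical codes from a {symbol: code_length} map.

    The tree shape is irrelevant: sorting by (length, symbol) and
    counting upward reproduces the same codebook every time, which
    is why a decoder only needs the lengths.
    """
    # Canonical ordering: shorter codes first, ties broken by symbol.
    ordered = sorted(lengths.items(), key=lambda kv: (kv[1], kv[0]))
    codes = {}
    code = 0
    prev_len = 0
    for sym, length in ordered:
        code <<= (length - prev_len)  # append zeros when length grows
        codes[sym] = format(code, "0{}b".format(length))
        code += 1
        prev_len = length
    return codes
```

For lengths `{'a': 1, 'b': 2, 'c': 2}` this yields `{'a': '0', 'b': '10', 'c': '11'}`, a prefix-free codebook recoverable from the lengths alone.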
4. Tie-breaking in Huffman matters because:
It affects determinism and reproducibility
It changes entropy
It violates Kraft's inequality
It changes symbol probabilities
5. Huffman codes are:
Prefix-free
Suffix-free
Fixed-length
Non-decodable
6. Canonical codes are useful because:
They reduce header size and allow fast decoding
They avoid the need for frequencies
They always beat entropy
They remove the need for bit I/O
7. True/False: Huffman coding can produce an expected code length smaller than the source entropy.
True
False
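Several questions above (optimality, the single-symbol edge case, deterministic tie-breaking) can be exercised with one short sketch. This is an assumed illustration, not a reference implementation: it computes only code lengths, and breaks frequency ties with an insertion counter so runs are reproducible.

```python
import heapq

def huffman_lengths(freqs):
    """Code lengths via Huffman's greedy merge on a min-heap.

    Ties between equal frequencies are broken by an insertion
    counter, making the output deterministic across runs.
    """
    if len(freqs) == 1:
        # Edge case: a lone symbol still needs 1 bit to be decodable.
        return {next(iter(freqs)): 1}
    counter = 0
    heap = []
    for sym, f in freqs.items():
        heap.append((f, counter, {sym: 0}))  # depth 0 for a leaf
        counter += 1
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees deepens every symbol in them by 1.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]
```

With `{'a': 5, 'b': 2, 'c': 1, 'd': 1}` the lengths come out as `a:1, b:2, c:3, d:3`; the frequency-weighted average length is at least the entropy of the distribution, matching the expected answer to question 7 (False).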