Compression Foundations Checkpoint
Entropy, expected length, Kraft inequality, and modeling.
1. Shannon entropy is defined as:
a) -sum p_i log2(p_i)
b) sum p_i log2(p_i)
c) -sum log2(p_i)
d) sum p_i
2. For two equally likely symbols, entropy is:
a) 0
b) 1
c) 2
d) log2(3)
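A quick check for questions 1 and 2: the sketch below (Python; the helper name entropy_bits is illustrative, not from the quiz) computes -sum p_i log2(p_i) directly and confirms that two equally likely symbols carry 1 bit.

```python
import math

def entropy_bits(probs):
    # Shannon entropy H = -sum p_i * log2(p_i); zero-probability
    # terms contribute nothing (p * log2 p -> 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # 1.0 bit: two equally likely symbols
print(entropy_bits([1.0]))       # 0.0 bits: a certain symbol (question 8)
```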
3. Kraft's inequality for prefix-free codes requires:
a) sum 2^(-l_i) <= 1
b) sum 2^(-l_i) >= 1
c) sum l_i <= 1
d) sum l_i >= 1
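For question 3, a minimal sketch of the Kraft check, assuming binary codewords whose lengths are given in bits (the name kraft_ok is illustrative):

```python
def kraft_ok(lengths):
    # A prefix-free binary code with codeword lengths l_i exists
    # iff sum_i 2^(-l_i) <= 1 (Kraft's inequality).
    return sum(2.0 ** -l for l in lengths) <= 1.0

print(kraft_ok([1, 2, 2]))  # True:  1/2 + 1/4 + 1/4 = 1
print(kraft_ok([1, 1, 2]))  # False: 1/2 + 1/2 + 1/4 > 1
```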
4. Expected code length is:
a) sum p_i * l_i
b) sum l_i / p_i
c) -sum p_i log2(l_i)
d) sum l_i
5. Shannon's bound relating entropy H to the expected length L of an optimal prefix code is:
a) H <= L < H + 1
b) L <= H
c) L = H + 2
d) H < L always
6. Redundancy is:
a) L - H
b) H - L
c) H * L
d) L / H
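Questions 4 through 6 reduce to one computation. The sketch below uses an assumed example distribution and matching prefix-code lengths (not from the quiz) to show L = sum p_i * l_i, the bound H <= L < H + 1, and redundancy L - H:

```python
import math

probs   = [0.5, 0.25, 0.25]  # assumed example distribution
lengths = [1, 2, 2]          # lengths of a prefix code for it (e.g. 0, 10, 11)

H = -sum(p * math.log2(p) for p in probs)       # entropy: 1.5 bits
L = sum(p * l for p, l in zip(probs, lengths))  # expected length: 1.5 bits
print(H, L, L - H)     # redundancy is 0 here because probabilities are dyadic
assert H <= L < H + 1  # Shannon's bound for an optimal prefix code
```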
7. Cross-entropy measures:
a) Expected length under a mismatched model
b) The minimum possible entropy
c) Only the variance of symbols
d) A property of prefix trees
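For question 7: cross-entropy H(p, q) = -sum p_i log2(q_i) is the expected bits per symbol when data from the true distribution p is coded with ideal lengths -log2(q_i) for a model q. A minimal sketch, assuming p and q share a support:

```python
import math

def cross_entropy_bits(p, q):
    # Expected code length when the data follows p but the code
    # is matched to the model q (ideal lengths -log2 q_i).
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(cross_entropy_bits(p, p))  # 1.0: matched model recovers the entropy
print(cross_entropy_bits(p, q))  # ~1.74: the mismatch costs extra bits
```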
8. True/False: If a single symbol has probability 1, the entropy is 0.
9. Four equally likely symbols have entropy:
a) 1
b) 2
c) 3
d) 4
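Worked check for question 9: for n equally likely symbols the definition collapses to H = log2(n), so four symbols give log2(4) = 2 bits.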
10. Greedy match selection can be suboptimal because:
a) Encoding cost of the match might outweigh its length
b) Entropy is undefined for strings
c) Prefix codes cannot represent matches
d) LZ77 forbids overlaps
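Question 10 in numbers: under an assumed toy cost model (the bit costs below are hypothetical, not from any specific LZ77 format), the longest available match can cost more bits than a shorter parse covering the same input, so length-greedy selection is not always optimal:

```python
# Hypothetical token costs, for illustration only:
LITERAL     = 9   # flag bit + 8 data bits
SHORT_MATCH = 12  # match with a nearby (cheaply coded) offset
LONG_MATCH  = 24  # match with a distant (expensively coded) offset

# Greedy takes the longest match: one 5-byte match at a far offset.
greedy = LONG_MATCH                  # 24 bits for 5 bytes
# Alternative parse: a literal plus a 4-byte nearby match, same 5 bytes.
alternative = LITERAL + SHORT_MATCH  # 21 bits for the same 5 bytes
print(greedy, alternative)           # 24 > 21: greedy loses here
```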