Lecture 7: Noisy Channel Coding (II): The Capacity of a Noisy Channel

Published on Nov 05, 2012 · 13,878 Views

Chapter list

00:00  Inference & Information Measure For Noisy Channels
00:10  Last Time
00:28  Three Doors (1)
00:36  Three Doors (2)
00:45  Three Doors (3)
00:53  Three Doors (4)
01:09  Three Doors (5)
01:26  Solving Three Doors (1)
02:49  Solving Three Doors (2)
04:42  Solving Three Doors (3)
06:09  Solving Three Doors (4)
07:03  Solving Three Doors (5)
07:38  Inference For Channels
08:19  Information Measures For Noisy Channels (1)
08:43  Binary Symmetric Channel (1)
09:10  Binary Symmetric Channel (2)
09:40  Binary Erasure Channel (1)
10:32  Binary Erasure Channel (2)
10:45  Binary Erasure Channel (3)
10:47  Z Channel (1)
11:19  Z Channel (2)
11:43  Noisy Typewriter (1)
12:44  Information Measures For Noisy Channels (2)
13:00  Binary Symmetric Channel (3)
13:53  Binary Symmetric Channel (4)
14:04  Inference Binary Symmetric Channel (1)
15:34  Inference Binary Symmetric Channel (2)
17:52  Inference Binary Symmetric Channel (3)
20:42  Mutual Information (1)
21:55  Mutual Information (2)
23:09  Mutual Information (3)
24:56  Mutual Information (4)
27:16  Mutual Information (5)
28:18  Mutual Information (6)
28:35  Mutual Information (7)
29:24  Capacity (1)
30:29  Capacity (2)
30:37  Capacity (3)
31:16  Capacity (4)
33:16  Capacity (5)
34:28  Capacity (6)
35:15  Capacity (7)
35:51  Capacity (8)
36:11  Capacity (9)
37:45  Capacity (10)
38:07  Ternary Confusion Channel (1)
38:21  Ternary Confusion Channel (2)
39:01  Ternary Confusion Channel (3)
41:27  Ternary Confusion Channel (4)
41:55  Ternary Confusion Channel (5)
43:11  Ternary Confusion Channel (6)
45:01  Ternary Confusion Channel (7)
45:35  Noisy Typewriter (2)
46:05  Homework
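
The chapters above build up to the channel capacity, defined as the maximum of the mutual information I(X;Y) over input distributions p(x). As a companion to the Binary Symmetric Channel, Binary Erasure Channel, Z Channel, and Capacity chapters, here is a minimal sketch (not part of the lecture materials; the noise level f = 0.1 and the function names are illustrative assumptions) that evaluates I(X;Y) for a given binary input distribution and sweeps that distribution over a grid to approximate each channel's capacity.

```python
import numpy as np

def mutual_information(p_x, Q):
    """I(X;Y) in bits, given input distribution p_x and channel matrix
    Q[y, x] = P(y | x)."""
    p_xy = Q * p_x                      # joint P(x, y), shape (|Y|, |X|)
    p_y = p_xy.sum(axis=1)              # output marginal P(y)
    denom = np.outer(p_y, p_x)          # product of marginals P(y) P(x)
    mask = p_xy > 0                     # skip zero-probability terms
    terms = np.zeros_like(p_xy)
    terms[mask] = p_xy[mask] * np.log2(p_xy[mask] / denom[mask])
    return terms.sum()

def capacity_binary_input(Q, grid=10001):
    """Approximate C = max over p(x) of I(X;Y) for a two-input channel
    by sweeping P(x=1) over a uniform grid."""
    return max(mutual_information(np.array([1.0 - p1, p1]), Q)
               for p1 in np.linspace(0.0, 1.0, grid))

f = 0.1  # illustrative noise level, not a value fixed by the lecture

bsc = np.array([[1 - f, f],             # binary symmetric channel
                [f, 1 - f]])
bec = np.array([[1 - f, 0.0],           # binary erasure channel;
                [f, f],                 # middle row is the erasure symbol
                [0.0, 1 - f]])
zch = np.array([[1.0, f],               # Z channel: only the 1 is corrupted
                [0.0, 1 - f]])

print(f"BSC capacity ~ {capacity_binary_input(bsc):.4f} bits")
print(f"BEC capacity ~ {capacity_binary_input(bec):.4f} bits")
print(f"Z   capacity ~ {capacity_binary_input(zch):.4f} bits")
```

The grid sweep follows the definition C = max over p(x) of I(X;Y) directly rather than any closed-form result; with f = 0.1 it should land near 1 - H2(0.1) ≈ 0.531 bits for the binary symmetric channel, 1 - f = 0.9 bits for the binary erasure channel, and roughly 0.763 bits for the Z channel.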