
Zoneout: Regularizing RNNs by Randomly Preserving Hidden Activations

Published on Aug 23, 2016 · 5892 views


Chapter list

Zoneout - Regularizing RNNs by randomly preserving hidden activations (00:00)
Zoneout - 1 (00:03)
Zoneout - 2 (00:12)
Structure of the talk - 1 (00:34)
Structure of the talk - 2 (00:42)
Structure of the talk - 3 (00:46)
Structure of the talk - 4 (00:48)
Basic idea (00:51)
Recurrent neural networks (01:21)
1-layer RNN (01:35)
1-layer RNN with zoneout (01:37)
1-layer LSTM (01:49)
1-layer LSTM with zoneout - 1 (01:53)
1-layer LSTM with zoneout - 2 (01:57)
Implementing zoneout - 1 (02:11; a code sketch follows this list)
Implementing zoneout - 2 (02:36)
Zoneout trains a pseudo-ensemble (03:20)
Zoneout as per-unit stochastic depth - 1 (03:48)
Zoneout as per-unit stochastic depth - 2 (04:22)
Other related work (04:39)
Zoneout helps propagate gradients (05:46)
Permuted sequential MNIST - 1 (07:04)
Permuted sequential MNIST - 2 (07:17)
Permuted sequential MNIST - 3 (07:40)
Character-level Penn Treebank - 1 (07:43)
Character-level Penn Treebank - 2 (08:22)
Character-level Penn Treebank - 3 (08:37)
Character-level Penn Treebank - 4 (08:40)
Character-level Penn Treebank - 5 (08:46)
Character-level Penn Treebank - 6 (09:05)
Word-level Penn Treebank - 1 (09:27)
Word-level Penn Treebank - 2 (09:37)
Thank you! (10:45)
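
The "Implementing zoneout" chapters (02:11, 02:36) cover the core update, in which each hidden unit stochastically keeps its previous value instead of taking the newly computed one. Below is a minimal NumPy sketch of that update for a vanilla RNN hidden state; the function and variable names (zoneout_rnn_step, W_xh, W_hh, z) are illustrative and not taken from the talk.

import numpy as np

def zoneout_rnn_step(h_prev, x_t, W_xh, W_hh, b, z=0.15, training=True, rng=None):
    # Ordinary tanh RNN candidate state.
    h_tilde = np.tanh(x_t @ W_xh + h_prev @ W_hh + b)
    if training:
        rng = rng if rng is not None else np.random.default_rng()
        # Per-unit Bernoulli mask: 1 means "zone out", i.e. keep the previous value.
        d = rng.binomial(1, z, size=h_prev.shape)
        return d * h_prev + (1 - d) * h_tilde
    # At test time, replace the mask by its expectation: a deterministic
    # convex combination of the old and new states, analogous to dropout's
    # inference-time scaling.
    return z * h_prev + (1 - z) * h_tilde

For an LSTM, the same stochastic "keep the previous value" rule is applied to the cell and hidden states, typically with separate zoneout probabilities for each.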