Visual features II

Published on 2015-09-13 · 4331 views

Presentation

Visual features II (00:01)
What next? (24:42:53)
Vision beyond object recognition (42:36:54)
Random dot stereograms (60:50:48)
Some things are hard to infer from still images (72:21:49)
There are things images cannot teach you (102:23:59)
Learn relations by concatenating images? - 1 (124:02:01)
Learn relations by concatenating images? - 2 (183:47:59)
wTx ? (210:39:37)
Families of manifolds (240:02:38)
Bi-linear models - 1 (273:28:56)
Bi-linear models - 2 (300:29:16)
Example: Gated Boltzmann machine (304:50:46)
Example: Gated autoencoder (313:51:54)
Multiplicative interactions (317:23:13)
Factored Gated Autoencoder (350:23:47)
Toy examples (414:16:53)
Learned filters wx - 1 (416:19:20)
Learned filters wx - 2 (419:38:13)
Rotation filters - 1 (431:44:38)
Rotation filters - 2 (436:16:40)
Filters learned from split-screen shifts - 2 (445:55:56)
Natural video filters - 1 (449:01:02)
Filters learned from split-screen shifts - 1 (463:44:16)
Natural video filters - 2 (465:04:32)
Understanding gating - 2 (546:11:11)
Understanding gating - 3 (574:15:20)
Orthogonal transformations decompose into rotations (590:24:02)
To detect the rotation angle, compute a 2-d inner product - 2 (610:10:55)
To detect the rotation angle, compute a 2-d inner product - 4 (640:45:38)
Understanding gating - 1 (692:59:40)
To detect the rotation angle, compute a 2-d inner product - 3 (699:16:25)
The aperture problem - 1 (704:22:10)
The aperture problem - 2 (711:54:33)
The aperture problem - 4 (724:23:58)
The aperture problem - 3 (743:12:33)
The aperture problem - 5 (753:38:36)
The aperture problem - 6 (754:13:26)
To detect the rotation angle, compute a 2-d inner product - 1 (778:36:31)
The aperture problem - 7 (788:09:53)
To detect the rotation angle, pool over 2-d inner products (840:31:17)
Action recognition 2011 (855:40:54)
Other applications (883:06:54)
Vanishing gradients (964:07:37)
Orthogonal weights create “dynamic memory” - 1 (1010:02:27)
Orthogonal weights create “dynamic memory” - 2 (1024:12:19)
Orthogonal weights create “dynamic memory” - 3 (1029:02:16)
Why memory needs gating (1056:23:22)
Predictive training (1134:12:45)
sine waves - 1 (1165:08:55)
sine waves - 2 (1165:28:39)
sine waves - 3 (1168:10:10)
sine waves - 4 (1168:11:35)
The model learns rotational derivatives (1169:46:56)
Learning higher-order derivatives (acceleration) - 1 (1170:47:52)
Learning higher-order derivatives (acceleration) - 2 (1171:02:00)
snap, crackle, pop (1171:15:33)
Annealed teacher forcing (1178:16:32)
chirps (1178:29:30)
Harmonics (1180:05:28)
NORB videos (1180:15:08)
Multi-step prediction helps (1181:00:46)
Recognizing accelerations (1208:33:52)
bouncing balls (Mnih et al.), (Sutskever et al.) (1209:46:27)
Learned filters (1209:48:45)
Gating units (1256:04:27)
bouncing ball with occlusion (1279:38:57)
Vanishing gradients - 1 (1281:04:12)
A 2-d subspace - 1 (1302:33:45)
Vanishing gradients - 2 (1307:12:16)
Autoencoders learn negative biases (1362:06:32)
Do autoencoders orthogonalize weights? (1375:55:12)
A 2-d subspace - 2 (1379:33:47)
Zero-bias ReLUs are hard to beat (1381:16:33)
The energy function of a ReLU autoencoder - 1 (1382:05:32)
The energy function of a ReLU autoencoder - 2 (1382:31:15)
Truncated rectified unit (Trec) (1382:44:36)
Truncated linear unit (TLin) (1400:21:57)
ZAE features from tiny images (Torralba et al.) (1400:50:57)
Perm-invariant CIFAR-10 (1414:03:14)
Perm-invariant CIFAR-10 patches (1428:29:06)
Rotation filters - 1 (1428:48:49)
Rotation filters - 2 (1429:04:02)
Deep fully-connected CIFAR-10 - 1 (1430:49:40)
Deep fully-connected CIFAR-10 - 2 (1430:59:32)
Deep fully-connected CIFAR-10 - 3 (1435:59:41)