Limitations of kernel and multiple kernel learning

Published on Oct 17, 2011 · 5686 Views

Many classes of low Vapnik-Chervonenkis dimension (and hence statistically learnable) cannot be represented as linear classes in such a way that they can be learnt with large-margin approaches. We review these results.

Chapter list

Limitations of Kernel and Multiple Kernel Learning (00:00)
Outline (01:03)
Relation to SIMBAD (03:00)
Context (05:26)
Impossibility of representing classes of functions (06:48)
What about Support Vector Machines? (09:11)
Random projections (10:37)
Concrete example - 1 (12:51)
Concrete example - 2 (14:02)
Converting learning to convex optimisation (16:12)
Main Rademacher theorem (17:17)
Empirical Rademacher theorem (20:11)
McDiarmid’s inequality (21:04)
Application to large margin classification (22:09)
Application to Boosting (23:01)
Rademacher complexity of convex hulls (24:12)
Rademacher complexity of convex hulls cont. (24:54)
Final Boosting bound (27:09)
Linear programme (28:18)
Linear programming boosting (28:51)
Alternative version (29:40)
Column generation (31:35)
LP Boost (33:04)
Implementing VC class through LP Boost (33:54)
Between Boosting and SVMs (37:31)
Multiple kernel learning - 1 (39:00)
Multiple kernel learning via LP Boost (41:42)
Multiple kernel learning - 2 (42:35)
Bounding MKL - 1 (43:13)
Bounding MKL - 2 (44:36)
Concluding remarks (47:29)