Deep Natural Language Understanding

Published on Aug 23, 2016 · 20,505 views

In this lecture, I start with the claim that natural language understanding can largely be approached as building a better language model, and explain three widely adopted approaches to language modelling.
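
The "How likely is this sentence?" chapters in the list below build up the core idea behind all three approaches: a language model assigns a probability to a sentence via the chain rule, which n-gram models truncate with a Markov assumption. As a rough illustration of that idea (not code from the lecture; the corpus, function names, and unsmoothed maximum-likelihood counts are all invented for the example), a minimal count-based bigram model can be sketched as:

from collections import Counter

def train_bigram_lm(sentences):
    # Count contexts (unigrams) and pairs (bigrams) over <s>/</s>-padded sentences.
    unigrams, bigrams = Counter(), Counter()
    for tokens in sentences:
        padded = ["<s>"] + tokens + ["</s>"]
        unigrams.update(padded[:-1])
        bigrams.update(zip(padded[:-1], padded[1:]))
    return unigrams, bigrams

def sentence_probability(tokens, unigrams, bigrams):
    # Chain rule with a first-order Markov assumption:
    # p(w_1, ..., w_T) ~ prod_t p(w_t | w_{t-1}), estimated by relative counts.
    padded = ["<s>"] + tokens + ["</s>"]
    prob = 1.0
    for prev, cur in zip(padded[:-1], padded[1:]):
        if unigrams[prev] == 0:
            return 0.0  # unseen context; a real model would smooth instead
        prob *= bigrams[(prev, cur)] / unigrams[prev]
    return prob

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_bigram_lm(corpus)
print(sentence_probability(["the", "cat", "sat"], uni, bi))  # 0.5

The neural and recurrent models covered later in the lecture replace these table lookups with learned, distributed representations, but they estimate exactly the same chain-rule factors.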


Chapter list

00:00 Natural Language Understanding
00:14 Language Understanding? Modelling?
00:33 Language Understanding - 1
01:13 Language Understanding - 2
02:02 Language Understanding - 3
02:39 Language Understanding - 4
03:04 Language Modelling
03:08 How likely is this sentence? - 1
03:42 How likely is this sentence? - 2
04:47 How likely is this sentence? - 3
06:06 Statistical LM
08:05 n-gram Language Modelling
08:11 How likely is this sentence? - 4
09:36 How likely is this sentence? - 5
12:19 How likely is this sentence? - 6
13:34 How likely is this sentence? - 7
15:15 How likely is this sentence? - 8
17:06 Neural Language Modelling
17:12 Language Modelling - 1
18:00 Language Modelling - 2
22:33 Language Modelling - 4
30:22 Q&A - 1
30:43 Language Modelling - 5
32:24 Language Modelling - 3
34:15 Non-Markovian Language Modelling
34:19 Language Modelling - 6
35:56 Language Modelling - 7
37:21 Language Modelling - 8
37:53 Language Modelling - 9
38:46 RNN Language Modelling
40:38 Language Modelling - 10
44:01 Language Modelling - 12
44:02 Language Modelling - 11
45:08 Language Modelling - 13
45:43 Language Modelling - 14
46:00 Language Modelling - 15
46:35 Language Modelling - 16
46:56 Training RNN-LM
47:03 Language Modelling - 17
48:10 Language Modelling - 18
49:07 Language Modelling - 19
50:43 Language Modelling - 20
52:06 Q&A - 2
52:11 Gated Recurrent Units - 1
52:16 Gated Recurrent Units - 2
54:08 Gated Recurrent Units - 3
55:04 Gated Recurrent Units - 4
55:46 Gated Recurrent Units - 5
56:21 Gated Recurrent Units - 6
57:25 Gated Recurrent Units - 7
59:34 Gated Recurrent Units - 9
01:00:03 Gated Recurrent Units - 10
01:00:25 Gated Recurrent Units - 11
01:01:05 Q&A - 3
01:01:30 Gated Recurrent Units - 8
01:02:22 Machine Translation
01:03:13 Neural Machine Translation - 1
01:04:28 Neural Machine Translation - 2
01:05:44 Untitled
01:07:40 Neural Machine Translation - 4
01:08:13 Neural Machine Translation - 5
01:08:13 Neural Machine Translation - 6
01:08:14 Neural Machine Translation - 7
01:08:14 Neural Machine Translation - 8
01:08:15 Deep Natural Language Processing
01:08:27 Deep Natural Language Processing (1) Character-level Modelling - 1
01:08:36 Deep Natural Language Processing (1) Character-level Modelling - 2
01:09:08 But, there are still too many explicit structures here…
01:09:11 Why the hell are we using a sequence of words?
01:09:13 There are legitimate reasons… sorta…
01:09:54 But, are they really legit reasons?
01:10:17 But, are they really legit reasons? - 2
01:10:37 But, are they really legit reasons? - 3
01:11:04 There are legitimate reasons… sorta…
01:11:34 Problems with treating each and every token separately
01:12:28 Obviously I’m not the first one to ask this question…
01:12:42 Deep Natural Language Processing (1) Character-level Modelling - 3
01:13:25 Addresses
01:13:29 So, we decided to answer this question ourselves… - 1
01:13:29 So, we decided to answer this question ourselves… - 2
01:14:08 Table
01:14:10 The decoder implicitly learned word-like units automatically
01:15:12 What have we learned?
01:15:13 Deep Natural Language Processing (2) Multilingual Modelling
01:15:44 Multilingual Translation
01:16:09 Multilingual Translation: Benefits
01:16:12 Dong et al. (ACL 2015)
01:16:13 Luong et al. (ICLR, Nov 2015)
01:16:14 Challenges
01:16:17 Multi-Way, Multilingual Translation - 1
01:16:29 Multi-Way, Multilingual Translation - 2
01:16:41 Settings
01:17:14 Deep Natural Language Processing (3) Larger-Context Modelling
01:17:43 Context matters
01:18:23 Larger-Context Language Modelling - 1
01:18:42 Larger-Context Language Modelling - 2
01:18:43 Larger-Context Language Modelling - 3
01:19:11 Larger-Context Language Modelling - 4
01:19:56 Larger-Context Machine Translation
01:19:57 Dialogue-level Machine Translation
01:19:58 World-Context Machine Translation
01:20:29 Thank You!
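
For the "Gated Recurrent Units" chapters above, the recurrence the lecture builds up to fits in a few lines. This is a generic sketch of a single GRU step following the common update-gate/reset-gate formulation (not the lecture's own code; biases are omitted and the weights and dimensions are invented for illustration):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    # One GRU step: the gates decide how much of the previous state to keep.
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h_prev)               # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde         # interpolate old and new

rng = np.random.default_rng(0)
d_in, d_h = 4, 8
params = [rng.normal(scale=0.1, size=s)
          for s in [(d_h, d_in), (d_h, d_h)] * 3]
h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):                # a toy 5-step input sequence
    h = gru_step(x, h, params)
print(h.shape)  # (8,)

Because the update gate can hold z near zero, the state h can pass through many steps almost unchanged, which is what lets the non-Markovian RNN language models discussed in the lecture carry long-range context.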