
Deep Natural Language Understanding
Published on 2016-08-23 · 20506 Views
In this lecture, I start with the claim that natural language understanding can largely be approached as building a better language model, and then explain three widely adopted approaches to language modelling.
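For readers skimming the outline below, a minimal sketch of the idea behind that claim: all three approaches covered (count-based n-gram, neural, and recurrent) score a sentence through the same chain-rule factorisation, p(x_1..x_T) = prod_t p(x_t | x_<t), and differ only in how the conditional is estimated. The toy corpus, dimensions, and variable names here are illustrative assumptions, not material taken from the lecture.

```python
from collections import Counter
import math

# Tiny illustrative corpus (an assumption for this sketch).
corpus = [
    ["<s>", "the", "cat", "sat", "</s>"],
    ["<s>", "the", "dog", "sat", "</s>"],
    ["<s>", "the", "cat", "ran", "</s>"],
]

# 1) Count-based n-gram LM (here n=2): the conditional is a smoothed ratio of counts.
unigrams = Counter(w for sent in corpus for w in sent)
bigrams = Counter(pair for sent in corpus for pair in zip(sent, sent[1:]))
vocab = sorted(unigrams)

def p_bigram(word, prev):
    # Add-one (Laplace) smoothing so unseen bigrams still get non-zero probability.
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + len(vocab))

def sentence_logprob(sent):
    # Chain rule: log p(sentence) = sum_t log p(x_t | x_{t-1}).
    return sum(math.log(p_bigram(w, prev)) for prev, w in zip(sent, sent[1:]))

print(sentence_logprob(["<s>", "the", "dog", "ran", "</s>"]))

# 2) Neural / recurrent LMs keep the same factorisation but replace the count
#    table with a learned function of the full history: h_t summarises x_<t and
#    p(x_t | x_<t) = softmax(h_t W_o). One untrained forward step, with made-up
#    dimensions, just to show the shape of the computation:
import numpy as np

rng = np.random.default_rng(0)
V, d = len(vocab), 8
E = rng.normal(size=(V, d))      # word embeddings
W_h = rng.normal(size=(d, d))    # recurrent weights
W_o = rng.normal(size=(d, V))    # output projection
h = np.zeros(d)
for w in ["<s>", "the", "dog"]:
    h = np.tanh(E[vocab.index(w)] + W_h @ h)   # update the history summary
probs = np.exp(h @ W_o)
probs /= probs.sum()                            # p(next word | "<s> the dog")
print(dict(zip(vocab, probs.round(3))))
```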
Presentation
Natural Language Understanding (00:00)
Language Understanding? Modelling? (04:02:52)
Language Understanding - 1 (09:23:42)
Language Understanding - 2 (20:23:37)
Language Understanding - 3 (33:55:15)
Language Understanding - 4 (44:20:41)
Language Modelling (51:14:55)
How likely is this sentence? - 1 (52:25:11)
How likely is this sentence? - 2 (61:43:15)
How likely is this sentence? - 3 (79:50:59)
Statistical LM (101:44:34)
n-gram Language Modelling (134:52:44)
How likely is this sentence? - 4 (136:32:10)
How likely is this sentence? - 5 (160:04:40)
How likely is this sentence? - 6 (205:16:43)
How likely is this sentence? - 7 (226:07:53)
How likely is this sentence? - 8 (254:19:05)
Neural Language Modelling (285:01:00)
Language Modelling - 1 (286:40:58)
Language Modelling - 2 (300:13:25)
Language Modelling - 4 (376:05:18)
Q&A - 1 (506:16:18)
Language Modelling - 5 (512:05:28)
Language Modelling - 3 (540:11:02)
Non-Markovian Language Modelling (570:59:58)
Language Modelling - 6 (571:59:09)
Language Modelling - 7 (598:53:59)
Language Modelling - 8 (622:40:37)
Language Modelling - 9 (631:32:59)
RNN Language Modelling (646:09:30)
Language Modelling - 10 (677:21:10)
Language Modelling - 12 (733:42:02)
Language Modelling - 11 (734:07:51)
Language Modelling - 13 (752:26:05)
Language Modelling - 14 (761:58:39)
Language Modelling - 15 (766:54:20)
Language Modelling - 16 (776:25:17)
Training RNN-LM (782:25:22)
Language Modelling - 17 (784:15:41)
Language Modelling - 18 (802:53:44)
Language Modelling - 19 (818:39:26)
Language Modelling - 20 (845:20:17)
Q&A - 2 (868:21:56)
Gated Recurrent Units - 1 (869:44:04)
Gated Recurrent Units - 2 (871:13:59)
Gated Recurrent Units - 3 (902:23:28)
Gated Recurrent Units - 4 (917:55:38)
Gated Recurrent Units - 5 (929:29:26)
Gated Recurrent Units - 6 (939:23:24)
Gated Recurrent Units - 7 (957:06:36)
Gated Recurrent Units - 9 (992:58:37)
Gated Recurrent Units - 10 (1000:50:41)
Gated Recurrent Units - 11 (1007:05:34)
Q&A - 3 (1018:07:15)
Gated Recurrent Units - 8 (1025:05:12)
Machine Translation (1039:31:05)
Neural Machine Translation - 1 (1053:37:14)
Neural Machine Translation - 2 (1074:34:00)
Untitled (1095:36:18)
Neural Machine Translation - 4 (1127:55:26)
Neural Machine Translation - 5 (1136:56:43)
Neural Machine Translation - 6 (1137:07:09)
Neural Machine Translation - 7 (1137:16:45)
Neural Machine Translation - 8 (1137:25:55)
Deep Natural Language Processing (1137:35:57)
Deep Natural Language Processing (1) Character-level Modelling - 1 (1141:02:29)
Deep Natural Language Processing (1) Character-level Modelling - 2 (1143:27:41)
But, there are still too many explicit structures here… (1152:28:02)
Why the hell are we using a sequence of words? (1153:08:02)
There are legitimate reasons… sorta… (1153:39:09)
But, are they really legit reasons? (1165:05:49)
But, are they really legit reasons? - 2 (1171:27:30)
But, are they really legit reasons? - 3 (1177:07:49)
There are legitimate reasons… sorta… (1184:34:03)
Problems with treating each and every token separately (1192:54:18)
Obviously I’m not the first one to ask this question… (1207:55:40)
Deep Natural Language Processing (1) Character-level Modelling - 3 (1211:45:20)
Addresses (1223:41:53)
So, we decided to answer this question ourselves… - 1 (1224:46:57)
So, we decided to answer this question ourselves… - 2 (1224:59:21)
Table (1235:35:23)
The decoder implicitly learned word-like units automatically (1236:21:37)
What have we learned? (1253:23:54)
Deep Natural Language Processing (2) Multilingual Modelling (1253:40:12)
Multilingual Translation (1262:29:56)
Multilingual Translation: Benefits (1269:25:27)
Dong et al. (ACL 2015) (1270:07:23)
Luong et al. (ICLR, Nov 2015) (1270:25:20)
Challenges (1270:40:16)
Multi-Way, Multilingual Translation - 1 (1271:27:11)
Multi-Way, Multilingual Translation - 2 (1274:52:30)
Settings (1278:17:53)
Deep Natural Language Processing (3) Larger-Context Modelling (1287:17:15)
Context matters (1295:19:30)
Larger-Context Language Modelling - 1 (1306:37:21)
Larger-Context Language Modelling - 2 (1311:48:33)
Larger-Context Language Modelling - 3 (1311:57:06)
Larger-Context Language Modelling - 4 (1319:48:25)
Larger-Context Machine Translation (1332:22:38)
Dialogue-level Machine Translation (1332:39:02)
World-Context Machine Translation (1332:55:35)
Thank You! (1341:39:42)