Sample Complexity Bounds for Differentially Private Learning
Published on 2011-08-02 · 3820 views
We study the problem of privacy-preserving classification: learning a classifier from sensitive data while still preserving the privacy of the individuals in the training set. In particular, we …
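The talk itself is not transcribed here, but as context for the abstract, the standard route to differentially private learning over a finite hypothesis class is the exponential mechanism: output a hypothesis with probability that grows exponentially in its training accuracy. A minimal sketch (the toy threshold classifiers and all names are illustrative, not from the talk):

```python
import math
import random

def exponential_mechanism(data, hypotheses, epsilon):
    """Select a hypothesis with probability proportional to
    exp(epsilon * score / (2 * sensitivity)).

    score(h) = number of training examples h classifies correctly.
    Changing one example changes the score by at most 1, so the
    sensitivity is 1 and the selection is epsilon-differentially private.
    """
    scores = [sum(1 for x, y in data if h(x) == y) for h in hypotheses]
    weights = [math.exp(epsilon * s / 2.0) for s in scores]
    r = random.random() * sum(weights)
    acc = 0.0
    for h, w in zip(hypotheses, weights):
        acc += w
        if r <= acc:
            return h
    return hypotheses[-1]  # guard against floating-point rounding

# Toy example: 1-D points with binary labels, threshold classifiers.
data = [(0.1, 0), (0.2, 0), (0.7, 1), (0.9, 1)]
hypotheses = [lambda x, t=t: int(x >= t) for t in (0.0, 0.25, 0.5, 0.75, 1.0)]
h = exponential_mechanism(data, hypotheses, epsilon=2.0)
```

Because the output is randomized, accuracy is only guaranteed in expectation; the sample complexity question the talk addresses is how many examples are needed before this randomization stops hurting the learner.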
Presentation
Sample complexity bounds for differentially private learning
Part 1. Learning and privacy model
Data analytics with sensitive information (parts 1–5)
Example: genome-wide association studies
Privacy-preserving machine learning
Goal 1: Differential privacy (parts 1–2)
Goal 2: Learning
What was known (parts 1–2)
Part 2. Sample complexity bounds for differentially-private learning
Our results
No distribution-independent sample complexity upper bound (parts 1–3)
Some hope for differentially-private learning (parts 1–2)
Upper bounds based on prior knowledge of unlabeled data distribution (parts 1–6)
Recap & future work
Thank you