Privacy in Statistical Databases
1 Privacy in Statistical Databases
Individuals contribute rows $x_1, x_2, \ldots, x_n$ to a server/agency running an algorithm $A$ that receives queries and returns answers.
Users: government, researchers, businesses (or a malicious adversary).
What information can be released? Two conflicting goals.
Utility: users can extract global statistics.
Privacy: individual information stays hidden.
2 Approach: Indistinguishability
Compare two runs of $A$ (with its local random coins) on databases $x$ and $x'$: each run receives queries and produces answers, forming a transcript.
$x'$ is a neighbor of $x$ if they differ in one row.
Neighboring databases induce close distributions on transcripts.
Definition: $A$ is $\varepsilon$-indistinguishable if, for all neighbors $x, x'$ and for all subsets $S$ of transcripts,
$\Pr[A(x) \in S] \le (1+\varepsilon)\Pr[A(x') \in S]$.
5 Approach: Indistinguishability (remarks)
$\varepsilon$ has to be non-negligible here.
The distance measure on distributions matters.
This is a condition on the algorithm (a random process) $A$: saying "this output is safe" does not take into account how it was computed.
6 Differential Privacy
Another interpretation [DM]: you learn the same things about me regardless of whether I am in the database.
Suppose you know I am the height of the median Canadian. You could learn my height from the database! But it didn't matter whether or not my data was part of it. Has my privacy been compromised? No!
7 Bayesian Interpretation
For any given prior distribution, one obtains (almost) the same posterior regardless of whether person $i$'s data is included.
Proof: by Bayes' rule,
$\mathrm{posterior}(x) = \dfrac{\Pr[A(x) = \text{transcript}] \cdot \mathrm{prior}(x)}{\sum_y \Pr[A(y) = \text{transcript}] \cdot \mathrm{prior}(y)}$.
Indistinguishability implies the $\Pr[A(\cdot) = \text{transcript}]$ terms are almost the same regardless of whether person $i$'s data is included.
8 Other Properties
Composition: if two institutions independently release information about me using $\varepsilon$-indistinguishable algorithms $A_1$ and $A_2$, then the joint process $(A_1, A_2)(x)$ is $2\varepsilon$-indistinguishable.
Group privacy: $\varepsilon$-indistinguishability provides $k\varepsilon$-indistinguishability for information about groups of size $k$.
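The composition property can be checked concretely for Laplace-noised releases. A minimal sketch (my own illustration, assuming two sensitivity-1 functions and the multiplicative $e^{\varepsilon}$ form of the bound): on neighboring databases each output density shifts by at most the sensitivity, so the joint transcript density ratio is at most $e^{2\varepsilon}$.

```python
import math

def lap_density(y, b):
    """Density of the Laplace distribution with scale b at y."""
    return math.exp(-abs(y) / b) / (2.0 * b)

# Two independent eps-indistinguishable releases A1, A2 (Laplace mechanisms on
# sensitivity-1 functions, an illustrative assumption). On neighbors, each
# output density shifts by at most 1, so the joint ratio is <= e^(2*eps).
eps = 0.3
y1, y2 = 0.7, -1.2          # one fixed joint transcript
shift1, shift2 = 1.0, -1.0  # worst-case shifts f_i(x) - f_i(x')
b = 1.0 / eps
ratio = (lap_density(y1, b) * lap_density(y2, b)) / (
    lap_density(y1 - shift1, b) * lap_density(y2 - shift2, b))
assert ratio <= math.exp(2 * eps) + 1e-9
```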
9 Examples of Differential Privacy
Output perturbation: sum queries [BDMN]; global sensitivity [DMNS]; local sensitivity [NRS].
Sample-Aggregate [NRS]: randomly break the database into smaller datasets, compute something on each set, aggregate the results for privacy.
Exponential sampling [MT]: a general, abstract transformation on probability distributions.
More in development... (Sofya Raskhodnikova may guest lecture on this; Eric Shou's presentation.)
10 Example: Output Perturbation
Individuals contribute rows $x_1, \ldots, x_n$ to the server/agency; the user asks "tell me $f(x)$" and receives $f(x) + \text{noise}$.
Two versions:
Interactive: the user asks for a specific function $f$.
Non-interactive: the server decides ahead of time on $f_1, f_2, f_3, \ldots$
Intuition: $f(x)$ can be released accurately when $f$ is insensitive to individual entries $x_1, x_2, \ldots, x_n$.
11 Global Sensitivity
Intuition: $f(x)$ can be released accurately when $f$ is insensitive to individual entries $x_1, x_2, \ldots, x_n$.
Global sensitivity: $GS_f = \max_{\text{neighbors } x, x'} \|f(x) - f(x')\|_1$.
Example: $GS_{\mathrm{average}} = \frac{1}{n}$.
Theorem: if $A(x) = f(x) + \mathrm{Lap}\!\left(\frac{GS_f}{\varepsilon}\right)$, then $A$ is $\varepsilon$-indistinguishable.
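The theorem suggests a direct implementation. A minimal sketch (function and variable names are my own; the Laplace sample is drawn as a difference of two exponentials, which has exactly the Lap distribution):

```python
import random

def sample_laplace(scale):
    """Lap(scale): the difference of two i.i.d. exponentials with mean `scale`."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def laplace_mechanism(f, x, gs_f, eps):
    """Release f(x) + Lap(GS_f / eps); eps-indistinguishable by the theorem."""
    return f(x) + sample_laplace(gs_f / eps)

# Example: the average of n values in [0, 1] has global sensitivity 1/n.
data = [0.2, 0.5, 0.9, 0.4]
noisy_avg = laplace_mechanism(lambda xs: sum(xs) / len(xs), data,
                              1.0 / len(data), eps=1.0)
```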
12 Examples of low global sensitivity
Example: the sample mean (average). Suppose $x_i \in [0, 1]$ and $\mathrm{mean}(x) = \frac{1}{n}\sum_{i=1}^n x_i$.
Consider two neighboring datasets
$x = (x_1, \ldots, x_{i-1}, x_i, x_{i+1}, \ldots, x_n)$ and $x' = (x_1, \ldots, x_{i-1}, x_i', x_{i+1}, \ldots, x_n)$.
How does $f(x)$ vary? $|f(x) - f(x')| = \left|\frac{x_i}{n} - \frac{x_i'}{n}\right| \le \frac{1}{n}$.
This holds for any neighboring datasets, so $GS_{\mathrm{mean}} = \frac{1}{n}$ if $x_i \in [0, 1]$.
13 Examples of low global sensitivity
Add noise $\mathrm{Lap}\!\left(\frac{1}{\varepsilon n}\right)$ to $\mathrm{mean}(x)$ in order to guarantee privacy.
Comparison: to estimate a frequency (e.g., the proportion of diabetics) in the underlying population, one gets inherent error $\Theta\!\left(\frac{1}{\sqrt{n}}\right)$. Why? There is a theoretical lower bound on the variance of any estimator for the expectation of an underlying distribution.
(Noise added for privacy) << (inherent error): $T(x) = \mathrm{mean}(x) + \mathrm{Lap}\!\left(\frac{1}{n\varepsilon}\right)$ is an efficient estimator for the expectation.
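The "privacy noise is dwarfed by sampling error" claim is a quick arithmetic check. A small sketch (the specific $n$ and $\varepsilon$ are my own illustration):

```python
import math

# Std of Lap(b) is b*sqrt(2); with b = 1/(eps*n) the privacy noise is O(1/(eps*n)),
# while the inherent sampling error of a frequency estimate is Theta(1/sqrt(n)).
n, eps = 10_000, 0.5
privacy_noise_std = math.sqrt(2) / (eps * n)
inherent_error = 1.0 / math.sqrt(n)
assert privacy_noise_std < inherent_error / 10   # noise negligible by comparison
```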
14 Global Sensitivity: Noise Distribution
Theorem: if $A(x) = f(x) + \mathrm{Lap}\!\left(\frac{GS_f}{\varepsilon}\right)$, then $A$ is $\varepsilon$-indistinguishable.
The Laplace distribution $\mathrm{Lap}(\lambda)$ has density $h(y) \propto e^{-|y|/\lambda}$.
Sliding property of $\mathrm{Lap}\!\left(\frac{GS_f}{\varepsilon}\right)$: $\frac{h(y)}{h(y+\delta)} \le e^{\varepsilon |\delta| / GS_f}$ for all $y, \delta$.
Proof idea: $A(x)$ and $A(x')$ are shifted copies of the same Laplace density, with shift $\delta = f(x) - f(x') \le GS_f$, so the sliding property bounds their ratio by $e^{\varepsilon}$.
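The sliding property follows from the triangle inequality on $|y|$; a minimal numeric check over a grid of values (an illustration, not from the slides):

```python
import math

def lap_density(y, b):
    """Density of Lap(b) at y."""
    return math.exp(-abs(y) / b) / (2.0 * b)

# Sliding property: h(y)/h(y+delta) <= exp(eps*|delta|/GS_f) for Lap(GS_f/eps).
gs_f, eps = 1.0, 0.5
b = gs_f / eps
for y in (-3.0, -0.1, 0.0, 2.5):
    for delta in (-1.0, -0.2, 0.3, 1.0):
        bound = math.exp(eps * abs(delta) / gs_f)
        assert lap_density(y, b) / lap_density(y + delta, b) <= bound + 1e-12
```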
17 Examples of low global sensitivity
Many natural functions have low GS, e.g.:
Histograms and contingency tables (on board last time).
Covariance matrix (on board).
Distance to a property.
Functions that can be approximated from a random sample [BDMN].
Many data-mining and learning algorithms access the data via a sequence of low-sensitivity questions, e.g. perceptron, some EM algorithms, SQ learning algorithms. This encompasses a huge variety of common algorithms. My favorite: the k-means clustering algorithm.
Problem: more queries means more noise.
19 High Global Sensitivity: Median
Example 1: the median of $x_1, \ldots, x_n \in [0, 1]$, odd $n$.
Let $x$ have $\frac{n+1}{2}$ zeros followed by $\frac{n-1}{2}$ ones, and let its neighbor $x'$ have $\frac{n-1}{2}$ zeros followed by $\frac{n+1}{2}$ ones. Then $\mathrm{median}(x) = 0$ while $\mathrm{median}(x') = 1$, so $GS_{\mathrm{median}} = 1$.
Noise magnitude $\frac{1}{\varepsilon}$: too much noise!
But for most neighboring databases $x, x'$, $|\mathrm{median}(x) - \mathrm{median}(x')|$ is small. Can we add less noise on good instances?
20 High Global Sensitivity: Cluster Centers
Database entries: points in a metric space.
The global sensitivity of cluster centers is roughly the diameter of the space.
But intuitively, if the clustering is good, the cluster centers should be insensitive.
23 Global versus Local
$f : D^n \to \mathbb{R}^d$; adding noise turns the values $f(x), f(x'), f(y), f(y')$ into distributions $A(x), A(x'), \ldots$ on $\mathbb{R}^d$.
Global sensitivity is a worst case over inputs.
Local sensitivity: $LS_f(x) = \max_{x' \text{ neighbor of } x} \|f(x) - f(x')\|_1$.
Reminder: $GS_f = \max_x LS_f(x)$.
Goal: add less noise when local sensitivity is lower.
24 Local Sensitivity
Local sensitivity: $LS_f(x) = \max_{x' \text{ neighbor of } x} \|f(x) - f(x')\|_1$. Reminder: $GS_f = \max_x LS_f(x)$.
Example: the median for $0 \le x_1 \le \cdots \le x_n \le 1$, odd $n$, with median index $m = \frac{n+1}{2}$. Setting $x_n = 0$ moves the median down to $x_{m-1}$; setting $x_1 = 1$ moves it up to $x_{m+1}$. So
$LS_{\mathrm{median}}(x) = \max(x_m - x_{m-1},\; x_{m+1} - x_m)$.
Goal: release $f(x)$ with less noise when $LS_f(x)$ is lower.
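The median's local sensitivity is directly computable. A small sketch (the boundary convention $x_0 = 0$, $x_{n+1} = 1$ is my assumption for entries at the ends of the sorted order):

```python
def local_sensitivity_median(xs):
    """LS_median(x) = max(x_m - x_{m-1}, x_{m+1} - x_m) for data in [0, 1], odd n.
    The boundary convention x_0 = 0 and x_{n+1} = 1 is an assumption here."""
    assert len(xs) % 2 == 1
    ys = [0.0] + sorted(xs) + [1.0]
    m = (len(xs) + 1) // 2          # 1-based median index, valid into the padded list
    return max(ys[m] - ys[m - 1], ys[m + 1] - ys[m])

# The median 0.5 can move down to 0.2 or up to 0.6 when one entry changes:
assert abs(local_sensitivity_median([0.1, 0.2, 0.5, 0.6, 0.9]) - 0.3) < 1e-12
```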
25 Instance-Based Noise: First Attempt
Can we use noise magnitude $LS_f(x)$ instead of $GS_f$? Problem: the noise magnitude itself might reveal information.
Example (median): let $x$ have $\frac{n+3}{2}$ zeros followed by ones, and let its neighbor $x'$ have $\frac{n+1}{2}$ zeros followed by ones. Then $\mathrm{median}(x) = \mathrm{median}(x') = 0$, but $LS_{\mathrm{median}}(x) = 0$ while $LS_{\mathrm{median}}(x') = 1$.
With noise proportional to $LS$: $\Pr[A(x) = 0] = 1$ whereas $\Pr[A(x') = 0] = 0$, so $A$ is not $\varepsilon$-indistinguishable.
Lesson: the noise magnitude must itself be an insensitive function.
26 Smooth Bounds on Sensitivity
Design a sensitivity function $S(x)$. $S(x)$ is an $\varepsilon$-smooth upper bound on $LS_f(x)$ if:
for all $x$: $S(x) \ge LS_f(x)$;
for all neighbors $x, x'$: $S(x) \le e^{\varepsilon} S(x')$.
Theorem: if $A(x) = f(x) + \mathrm{noise}\!\left(\frac{S(x)}{\varepsilon}\right)$, then $A$ is $\varepsilon'$-indistinguishable.
Example: $GS_f$ is always a smooth bound on $LS_f(x)$.
28 Smooth Bounds on Sensitivity
Smooth sensitivity: $S^*_f(x) = \max_y LS_f(y)\, e^{-\varepsilon\, \mathrm{dist}(x,y)}$.
Lemma: for every $\varepsilon$-smooth bound $S$: $S^*_f(x) \le S(x)$ for all $x$.
Intuition: little noise when far from sensitive instances. Near points $y$ of high local sensitivity the smooth sensitivity is high, but it decays exponentially with distance, so in regions of $D^n$ where local sensitivity is low, smooth sensitivity is low as well.
32 Algorithmic Questions
Applying this framework requires computing smooth bounds on sensitivity.
When can we compute smooth bounds efficiently?
How can we avoid this for complicated functions?
What computational tasks can we solve privately? E.g., learning and statistical inference are not exactly function evaluation.
33 Computing Smooth Sensitivity
Recall: smooth sensitivity $S^*_f(x) = \max_y LS_f(y)\, e^{-\varepsilon\, \mathrm{dist}(x,y)}$.
Observation: $S^*_f(x) = \max_{k=0,1,\ldots,n} e^{-k\varepsilon}\, LS^{(k)}_f(x)$, where $LS^{(k)}_f(x) = \max_{y:\, \mathrm{dist}(x,y) \le k} LS_f(y)$.
Example (median): $LS^{(k)}_{\mathrm{median}}(x) = \max_{t=0,1,\ldots,k+1} \left(x_{m+t} - x_{m+t-k-1}\right)$.
$S^*_{\mathrm{median}}(x)$ is computable in $O(n^2)$ time.
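The observation above gives a direct $O(n^2)$ algorithm. A sketch, using the $LS^{(k)}$ formula for the median as reconstructed above, with out-of-range entries clamped to the domain $[0, 1]$ (an assumed boundary convention):

```python
import math

def smooth_sensitivity_median(xs, eps):
    """S*_median(x) = max_k exp(-k*eps) * LS^(k)(x), with
    LS^(k)(x) = max_{0 <= t <= k+1} (x_{m+t} - x_{m+t-k-1}).
    Entries outside 1..n are clamped to [0, 1] (an assumed convention)."""
    n = len(xs)
    assert n % 2 == 1
    ys = sorted(xs)
    m = (n + 1) // 2                     # 1-based median index

    def val(i):                          # x_i, clamped outside 1..n
        if i < 1:
            return 0.0
        if i > n:
            return 1.0
        return ys[i - 1]

    best = 0.0
    for k in range(n + 1):               # O(n) values of k, O(n) work each: O(n^2)
        ls_k = max(val(m + t) - val(m + t - k - 1) for t in range(k + 2))
        best = max(best, math.exp(-k * eps) * ls_k)
    return best
```

Note that the $k = 0$ term recovers the local sensitivity $\max(x_m - x_{m-1}, x_{m+1} - x_m)$, so $S^*$ always upper-bounds $LS$.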
39 Computing Smooth Sensitivity
If the input is drawn i.i.d. from any distribution $p$ on $[0, \Lambda]$, then with high probability
$S^*_{\mathrm{median}}(x) = O\!\left(\frac{1}{n\varepsilon} + \Lambda e^{-\varepsilon n}\right)$. Compare to $S^*_{\mathrm{mean}}(x) \approx \frac{\Lambda}{n}$.
So the median has a much better dependence on $\Lambda$, the diameter of the inputs' domain.
The median is also more robust to outliers in the data.
40 Sample-and-Aggregate Methodology
Intuition: replace $f$ with a less sensitive function $\bar f$:
$\bar f(x) = g\big(f(\mathrm{sample}_1), f(\mathrm{sample}_2), \ldots, f(\mathrm{sample}_s)\big)$.
Split $x$ into random subsets $(x_{i_1}, \ldots, x_{i_t}), (x_{j_1}, \ldots, x_{j_t}), \ldots, (x_{k_1}, \ldots, x_{k_t})$; apply $f$ to each; combine the results with an aggregation function $g$; add noise calibrated to the sensitivity of $\bar f$; output.
Informally: whatever information you can get from a random subset of the data points, you can get with strong guarantees of privacy.
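The pipeline above can be sketched end to end. A simplified illustration (my own): the noise here is scaled to a caller-supplied sensitivity bound for the aggregator, whereas the actual framework calibrates noise to the (smooth) sensitivity of $\bar f$.

```python
import random
import statistics

def sample_and_aggregate(data, f, num_blocks, eps, agg_sensitivity):
    """Sketch: random blocks -> f on each -> median aggregation -> Laplace noise.
    `agg_sensitivity` is a caller-supplied sensitivity bound (a simplifying
    assumption; the real framework calibrates noise more carefully)."""
    idx = list(range(len(data)))
    random.shuffle(idx)                           # random partition into blocks
    blocks = [idx[i::num_blocks] for i in range(num_blocks)]
    estimates = [f([data[i] for i in b]) for b in blocks]
    agg = statistics.median(estimates)            # an insensitive aggregation g
    scale = agg_sensitivity / eps
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)  # Lap(scale)
    return agg + noise
```

Here `f` could be, e.g., a cluster-center computation run on each block, with the median smoothing out any one block's outlier.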
43 Lower Bounds
Lower bounds on privacy in general. Dinur-Nissim: answering many questions accurately leads to a catastrophic breach of privacy:
many = $\Omega(n)$; accurately = with error $o(n^{1/2})$; catastrophic breach = the adversary computes almost all values in the original database exactly.
We will see these bounds in more detail as an optional topic.
Lower bounds on differential privacy: we can show stronger lower bounds on the power of non-interactive DP protocols (future lecture).
600.463 Introduction to Algorithms / Algorithms I Lecturer: Michael Dinitz Topic: Intro to Learning Theory Date: 12/8/16 25.1 Introduction Today we re going to talk about machine learning, but from an
More informationL11: Pattern recognition principles
L11: Pattern recognition principles Bayesian decision theory Statistical classifiers Dimensionality reduction Clustering This lecture is partly based on [Huang, Acero and Hon, 2001, ch. 4] Introduction
More informationNon-Parametric Bayes
Non-Parametric Bayes Mark Schmidt UBC Machine Learning Reading Group January 2016 Current Hot Topics in Machine Learning Bayesian learning includes: Gaussian processes. Approximate inference. Bayesian
More informationIntroduction to Bayesian Learning. Machine Learning Fall 2018
Introduction to Bayesian Learning Machine Learning Fall 2018 1 What we have seen so far What does it mean to learn? Mistake-driven learning Learning by counting (and bounding) number of mistakes PAC learnability
More informationIntroduction: MLE, MAP, Bayesian reasoning (28/8/13)
STA561: Probabilistic machine learning Introduction: MLE, MAP, Bayesian reasoning (28/8/13) Lecturer: Barbara Engelhardt Scribes: K. Ulrich, J. Subramanian, N. Raval, J. O Hollaren 1 Classifiers In this
More informationLecture 18 - Secret Sharing, Visual Cryptography, Distributed Signatures
Lecture 18 - Secret Sharing, Visual Cryptography, Distributed Signatures Boaz Barak November 27, 2007 Quick review of homework 7 Existence of a CPA-secure public key encryption scheme such that oracle
More information[Title removed for anonymity]
[Title removed for anonymity] Graham Cormode graham@research.att.com Magda Procopiuc(AT&T) Divesh Srivastava(AT&T) Thanh Tran (UMass Amherst) 1 Introduction Privacy is a common theme in public discourse
More informationStatistical Techniques in Robotics (16-831, F12) Lecture#21 (Monday November 12) Gaussian Processes
Statistical Techniques in Robotics (16-831, F12) Lecture#21 (Monday November 12) Gaussian Processes Lecturer: Drew Bagnell Scribe: Venkatraman Narayanan 1, M. Koval and P. Parashar 1 Applications of Gaussian
More informationDifferentially Private ANOVA Testing
Differentially Private ANOVA Testing Zachary Campbell campbza@reed.edu Andrew Bray abray@reed.edu Anna Ritz Biology Department aritz@reed.edu Adam Groce agroce@reed.edu arxiv:1711335v2 [cs.cr] 20 Feb 2018
More informationAcknowledgements. Today s Lecture. Attacking k-anonymity. Ahem Homogeneity. Data Privacy in Biomedicine: Lecture 18
Acknowledgeents Data Privacy in Bioedicine Lecture 18: Fro Identifier to Attribute Disclosure and Beyond Bradley Malin, PhD (b.alin@vanderbilt.edu) Professor of Bioedical Inforatics, Biostatistics, and
More informationADVANCED MACHINE LEARNING ADVANCED MACHINE LEARNING. Non-linear regression techniques Part - II
1 Non-linear regression techniques Part - II Regression Algorithms in this Course Support Vector Machine Relevance Vector Machine Support vector regression Boosting random projections Relevance vector
More informationNotes on the Exponential Mechanism. (Differential privacy)
Notes on the Exponential Mechanism. (Differential privacy) Boston University CS 558. Sharon Goldberg. February 22, 2012 1 Description of the mechanism Let D be the domain of input datasets. Let R be the
More informationIntroduction to Machine Learning. Lecture 2
Introduction to Machine Learning Lecturer: Eran Halperin Lecture 2 Fall Semester Scribe: Yishay Mansour Some of the material was not presented in class (and is marked with a side line) and is given for
More informationCommunication-efficient and Differentially-private Distributed SGD
1/36 Communication-efficient and Differentially-private Distributed SGD Ananda Theertha Suresh with Naman Agarwal, Felix X. Yu Sanjiv Kumar, H. Brendan McMahan Google Research November 16, 2018 2/36 Outline
More informationLecture 5, CPA Secure Encryption from PRFs
CS 4501-6501 Topics in Cryptography 16 Feb 2018 Lecture 5, CPA Secure Encryption from PRFs Lecturer: Mohammad Mahmoody Scribe: J. Fu, D. Anderson, W. Chao, and Y. Yu 1 Review Ralling: CPA Security and
More informationMaryam Shoaran Alex Thomo Jens Weber. University of Victoria, Canada
Maryam Shoaran Alex Thomo Jens Weber University of Victoria, Canada Introduction Challenge: Evidence of Participation Sample Aggregates Zero-Knowledge Privacy Analysis of Utility of ZKP Conclusions 12/17/2015
More informationLecture 7 Introduction to Statistical Decision Theory
Lecture 7 Introduction to Statistical Decision Theory I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw December 20, 2016 1 / 55 I-Hsiang Wang IT Lecture 7
More informationPufferfish Privacy Mechanisms for Correlated Data. Shuang Song, Yizhen Wang, Kamalika Chaudhuri University of California, San Diego
Pufferfish Privacy Mechanisms for Correlated Data Shuang Song, Yizhen Wang, Kamalika Chaudhuri University of California, San Diego Sensitive Data with Correlation } Data from social networks, } eg. Spreading
More informationRAPPOR: Randomized Aggregatable Privacy- Preserving Ordinal Response
RAPPOR: Randomized Aggregatable Privacy- Preserving Ordinal Response Úlfar Erlingsson, Vasyl Pihur, Aleksandra Korolova Google & USC Presented By: Pat Pannuto RAPPOR, What is is good for? (Absolutely something!)
More informationToward Privacy in Public Databases (Extended Abstract)
Toward Privacy in Public Databases (Extended Abstract) Shuchi Chawla Carnegie Mellon University shuchi@cs.cmu.edu Adam Smith Massachusetts Institute of Technology asmith@theory.lcs.mit.edu Cynthia Dwork
More informationStatistical Data Analysis Stat 3: p-values, parameter estimation
Statistical Data Analysis Stat 3: p-values, parameter estimation London Postgraduate Lectures on Particle Physics; University of London MSci course PH4515 Glen Cowan Physics Department Royal Holloway,
More information10-701/15-781, Machine Learning: Homework 4
10-701/15-781, Machine Learning: Homewor 4 Aarti Singh Carnegie Mellon University ˆ The assignment is due at 10:30 am beginning of class on Mon, Nov 15, 2010. ˆ Separate you answers into five parts, one
More informationLecture 9 and 10: Malicious Security - GMW Compiler and Cut and Choose, OT Extension
CS 294 Secure Computation February 16 and 18, 2016 Lecture 9 and 10: Malicious Security - GMW Compiler and Cut and Choose, OT Extension Instructor: Sanjam Garg Scribe: Alex Irpan 1 Overview Garbled circuits
More informationarxiv: v1 [cs.cr] 28 Apr 2015
Differentially Private Release and Learning of Threshold Functions Mark Bun Kobbi Nissim Uri Stemmer Salil Vadhan April 28, 2015 arxiv:1504.07553v1 [cs.cr] 28 Apr 2015 Abstract We prove new upper and lower
More informationImproved Direct Product Theorems for Randomized Query Complexity
Improved Direct Product Theorems for Randomized Query Complexity Andrew Drucker Nov. 16, 2010 Andrew Drucker, Improved Direct Product Theorems for Randomized Query Complexity 1/28 Big picture Usually,
More information1 Probabilities. 1.1 Basics 1 PROBABILITIES
1 PROBABILITIES 1 Probabilities Probability is a tricky word usually meaning the likelyhood of something occuring or how frequent something is. Obviously, if something happens frequently, then its probability
More informationAnalysis of Fast Input Selection: Application in Time Series Prediction
Analysis of Fast Input Selection: Application in Time Series Prediction Jarkko Tikka, Amaury Lendasse, and Jaakko Hollmén Helsinki University of Technology, Laboratory of Computer and Information Science,
More informationSome Asymptotic Bayesian Inference (background to Chapter 2 of Tanner s book)
Some Asymptotic Bayesian Inference (background to Chapter 2 of Tanner s book) Principal Topics The approach to certainty with increasing evidence. The approach to consensus for several agents, with increasing
More informationDifferential Privacy and Verification. Marco Gaboardi University at Buffalo, SUNY
Differential Privacy and Verification Marco Gaboardi University at Buffalo, SUNY Given a program P, is it differentially private? P Verification Tool yes? no? Proof P Verification Tool yes? no? Given a
More informationTesting and Reconstruction of Lipschitz Functions with Applications to Data Privacy
Testing and Reconstruction of Lipschitz Functions with Applications to Data Privacy Madhav Jha Pennsylvania State University mxj201@cse.psu.edu Sofya Raskhodnikova Pennsylvania State University sofya@cse.psu.edu
More information
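The ε-indistinguishability definition above is most commonly achieved with the Laplace mechanism: answer a counting query with its true value plus Laplace noise of scale 1/ε (a counting query has sensitivity 1, since neighboring databases change the count by at most 1). The sketch below is illustrative, not from the slides; the function names and the inverse-CDF sampler are our own choices, using only the standard library.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) by inverse transform sampling."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def noisy_count(database, predicate, epsilon: float) -> float:
    """ε-differentially private count of rows satisfying `predicate`.

    A counting query has sensitivity 1: neighboring databases
    (differing in one row) change the true count by at most 1,
    so Laplace noise of scale 1/ε suffices for the guarantee
    Pr[A(x) in S] <= e^ε * Pr[A(x') in S].
    """
    true_count = sum(1 for row in database if predicate(row))
    return true_count + laplace_noise(1.0 / epsilon)
```

Averaged over many runs the noisy answer concentrates around the true count, which is the utility side of the trade-off: each individual answer is perturbed, but global statistics remain extractable.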