The Optimal Mechanism in Differential Privacy


1 The Optimal Mechanism in Differential Privacy. Quan Geng. Advisor: Prof. Pramod Viswanath. PhD Preliminary Exam, ECE, UIUC, 3/29/2013.

2 Outline: background on differential privacy; problem formulation; main result on the optimal mechanism; extensions to other settings; conclusion and future work.

3 Motivation. Vast amounts of personal information are collected, and analysis of these data produces many useful applications.

4 Motivation. (diagram: Big Data → release data/info. → Analyst → Applications) How can we release the data while protecting each individual's privacy?

5 Motivation: Netflix Prize (2006). Netflix hosted a contest to improve its movie recommendations and released a dataset of 100 million ratings from 500 thousand users on 18 thousand movies. (Image credit: Arvind Narayanan)

6 Motivation: Netflix Prize (2006). The released dataset (100 million ratings, 500 thousand users, 18 thousand movies) was anonymized to protect customer privacy. (Image credit: Arvind Narayanan)

7 Motivation: Netflix Prize dataset release. (figure on the de-anonymization of the released dataset [Narayanan, Shmatikov 2008]; image credit: Arvind Narayanan)

8 Motivation. How can we protect privacy in a way that is resilient to attacks with arbitrary side information?

9 Motivation. How can we protect privacy in a way that is resilient to attacks with arbitrary side information? Answer: use a randomized release mechanism.

10 Motivation. How can we protect privacy in a way that is resilient to attacks with arbitrary side information? Answer: use a randomized release mechanism. How much randomness is needed? A completely random release has no utility; a deterministic release has no privacy.

11 Motivation. How can we protect privacy in a way that is resilient to attacks with arbitrary side information? Answer: use a randomized release mechanism. How much randomness is needed? A completely random release has no utility; a deterministic release has no privacy. Differential privacy [Dwork et al. 06] is one way to quantify the level of randomness and privacy.

12 Background on Differential Privacy. (diagram: users → collect data → database → Database Curator → released info. → Analyst, who sends queries) Database curator: how to answer a query, providing useful information to the analyst, while still protecting the privacy of each user.

13 Background on Differential Privacy. Notation: dataset D, query function q(D), randomized release mechanism K(D). Example: D holds ages A: 20, B: 35, C: 43, D: 30; the query q asks how many people are older than 32, so q(D) = 2. A randomized mechanism might release K(D) = 1 w.p. 1/8, K(D) = 2 w.p. 1/2, K(D) = 3 w.p. 1/4, K(D) = 4 w.p. 1/8.
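To make the toy example on this slide concrete, here is a minimal Python sketch; the dataset, query, and release probabilities are the ones shown above, while the function names are just illustrative:

```python
import numpy as np

rng = np.random.default_rng()

ages = {"A": 20, "B": 35, "C": 43, "D": 30}   # dataset D from the slide

def q(dataset):
    """Count query: how many people are older than 32?"""
    return sum(1 for age in dataset.values() if age > 32)

def K(dataset):
    """Toy randomized release mechanism from the slide:
    outputs 1, 2, 3, 4 with probabilities 1/8, 1/2, 1/4, 1/8."""
    return int(rng.choice([1, 2, 3, 4], p=[1/8, 1/2, 1/4, 1/8]))

print(q(ages))   # 2 (B and C are older than 32)
print(K(ages))   # a randomized release of the count
```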

14 Background on Differential Privacy. Two datasets are neighbors if they differ in only one element. Example: D1 with ages A: 20, B: 35, C: 43, D: 30 and D2 with ages A: 20, B: 35, D: 30 are neighbors (they differ only in C's record).

15 Background on Differential Privacy. Neighboring datasets D1 and D2 are released through K as K(D1) and K(D2). Differential privacy: the presence or absence of any individual record in the dataset should not affect the released information significantly.

16 Background on Differential Privacy. A randomized mechanism K gives ε-differential privacy if, for any two neighboring datasets D1, D2 and all S ⊆ Range(K), Pr[K(D1) ∈ S] ≤ e^ε · Pr[K(D2) ∈ S] (this makes hypothesis testing between D1 and D2 hard). ε quantifies the level of privacy: ε → 0 means high privacy, ε → +∞ means low privacy. Choosing ε is a social question (it can be 0.01, 0.1, 1, 10, ...).

17 Background on Differential Privacy. Q: How much perturbation is needed to achieve ε-DP? A: It depends on how different q(D1) and q(D2) can be.

18 Background on Differential Privacy. Q: How much perturbation is needed to achieve ε-DP? A: It depends on how different q(D1) and q(D2) can be. If q(D1) = q(D2), no perturbation is needed.

19 Background on Differential Privacy. Q: How much perturbation is needed to achieve ε-DP? A: It depends on how different q(D1) and q(D2) can be. If |q(D1) − q(D2)| is small, a small perturbation suffices. (figure: output densities f_D1(x) and f_D2(x) centered near q(D1) and q(D2))

20 Background on Differential Privacy. Q: How much perturbation is needed to achieve ε-DP? A: It depends on how different q(D1) and q(D2) can be. If |q(D1) − q(D2)| is large, a large perturbation is needed. (figure: widely separated output densities f_D1(x) and f_D2(x))

21 Background on Differential Privacy. Global sensitivity Δ measures how much q can change between neighboring datasets: Δ := max over neighboring (D1, D2) of |q(D1) − q(D2)|. Example: for a count query, Δ = 1.

22 Background on Differential Privacy. Laplace mechanism: K(D) = q(D) + Lap(Δ/ε), where Lap(Δ/ε) is a random variable with p.d.f. f(x) = (1/(2λ)) e^(−|x|/λ) and λ = Δ/ε. (figure: shifted Laplace densities f_D1(x) and f_D2(x)) This is the basic tool in DP.
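A minimal sketch of the Laplace mechanism applied to the count query above (Δ = 1 for a count query, as noted on the previous slide; the helper name and the chosen ε are illustrative):

```python
import numpy as np

rng = np.random.default_rng()

def laplace_mechanism(true_answer, sensitivity, epsilon):
    """K(D) = q(D) + Lap(Δ/ε): add zero-mean Laplace noise with scale λ = Δ/ε."""
    scale = sensitivity / epsilon
    return true_answer + rng.laplace(loc=0.0, scale=scale)

# Count query from slide 13: true answer q(D) = 2, sensitivity Δ = 1.
print(laplace_mechanism(true_answer=2, sensitivity=1.0, epsilon=0.5))
```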

23 Optimality of existing work? Laplace mechanism: K(D) = q(D) + Lap(Δ/ε). Two questions: Is data-independent perturbation optimal? Assuming data-independent perturbation, is the Laplacian distribution optimal?

24 Optimality of existing work? Laplace mechanism: K(D) = q(D) + Lap(Δ/ε). Two questions: Is data-independent perturbation optimal? Assuming data-independent perturbation, is the Laplacian distribution optimal? Our results: data-independent perturbation is optimal, and the Laplacian distribution is not optimal; the optimal noise distribution is the staircase distribution.

25 Problem Formulation: DP constraint. A generic randomized mechanism K is a family of noise probability measures indexed by the query output t: K = {ν_t : t ∈ R}. DP constraint: for all t1, t2 ∈ R with |t1 − t2| ≤ Δ, ν_t1(S) ≤ e^ε · ν_t2(S + t1 − t2) for every measurable set S.

26 Problem Formulation: utility (cost) model. Cost function on the noise L: R → R. Given query output t, Cost(ν_t) = ∫ L(x) ν_t(dx). Examples: L(x) = |x| (noise magnitude), L(x) = x² (noise power). Objective (minimax): minimize sup_{t ∈ R} ∫_{x ∈ R} L(x) ν_t(dx).

27 Problem Formulation: putting it all together. Minimize sup_{t ∈ R} ∫_{x ∈ R} L(x) ν_t(dx) subject to ν_t1(S) ≤ e^ε · ν_t2(S + t1 − t2) for every measurable set S and all t1, t2 ∈ R with |t1 − t2| ≤ Δ.

28 Main Result: ν_t = ν. In the optimal mechanism, ν_t is independent of t (under a technical condition); that is, the optimal perturbation is data-independent.

29 Main Result: ν_t = ν. In the optimal mechanism, ν_t is independent of t (under a technical condition). (figure: ν_t(S) plotted as a function of t, with grid points at multiples of T/n over [−T, 2T])

30 Main Result: ν_t = ν. (figure: ν_t(S) as a function of t, with grid points at multiples of T/n over [−T, 2T]) In the optimal mechanism among the class ∪_{T>0, n≥1} K_{T,n}, ν_t is independent of t.

31 Optimal noise probability distribution. K(D) = q(D) + X. Minimize ∫ L(x) P(dx) subject to Pr[X ∈ S] ≤ e^ε · Pr[X ∈ S + d] for all |d| ≤ Δ and every measurable set S.

32 Optimal noise probability distribution. K(D) = q(D) + X, where the cost L(x) is symmetric, increasing in |x|, and satisfies sup_T L(T+1)/L(T) < +∞. Minimize ∫ L(x) P(dx) subject to Pr[X ∈ S] ≤ e^ε · Pr[X ∈ S + d] for all |d| ≤ Δ and every measurable set S.

33 Main Result. The optimal noise probability distribution has a staircase-shaped p.d.f.

34 Main Result. (figure: the staircase-shaped p.d.f.)
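The staircase p.d.f. itself is not legible in this transcription; the sketch below evaluates the parameterization used in the accompanying Geng-Viswanath paper (parameter γ ∈ (0, 1], b = e^(−ε)), so the exact constants should be read as an assumption rather than as a transcript of the slide:

```python
import numpy as np

def staircase_pdf(x, epsilon, Delta, gamma):
    """Staircase p.d.f. (assumed parameterization, b = e^{-epsilon}):
    height a(gamma) on [0, gamma*Delta), height b*a(gamma) on [gamma*Delta, Delta),
    and copies scaled by b^k on [k*Delta, (k+1)*Delta); symmetric about 0."""
    b = np.exp(-epsilon)
    a = (1 - b) / (2 * Delta * (gamma + (1 - gamma) * b))   # normalizing constant
    x = np.abs(np.asarray(x, dtype=float))
    k = np.floor(x / Delta)                                 # which interval
    r = x - k * Delta                                       # offset inside the interval
    step = np.where(r < gamma * Delta, a, b * a)            # two-level step per interval
    return (b ** k) * step

# Sanity check: the density should integrate to (approximately) 1.
xs = np.linspace(-30.0, 30.0, 600_001)
vals = staircase_pdf(xs, epsilon=1.0, Delta=1.0, gamma=0.5)
print(vals.sum() * (xs[1] - xs[0]))   # ~ 1.0
```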

35 Main Result: random variable generation. A staircase random variable is generated from a sign (binary r.v.), an interval index (geometric r.v.), a segment within the interval (binary r.v.), and a uniform r.v.; see the sketch below.
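A sketch of the generation recipe listed on this slide (sign, geometric interval index, binary segment, uniform offset). The mixing probability for the segment variable is taken from the accompanying paper and should be treated as an assumption here:

```python
import numpy as np

rng = np.random.default_rng()

def sample_staircase(epsilon, Delta, gamma, size=1):
    """Draw staircase noise: sign * Delta * (interval + segment offset + uniform part)."""
    b = np.exp(-epsilon)
    sign = rng.choice([-1.0, 1.0], size=size)                 # Sign: binary r.v.
    G = rng.geometric(1.0 - b, size=size) - 1                 # Interval: P(G = k) = (1-b) b^k
    U = rng.uniform(size=size)                                # Uniform r.v.
    p_first = gamma / (gamma + (1.0 - gamma) * b)             # assumed segment probability
    B = (rng.uniform(size=size) > p_first).astype(float)      # Segment: 0 = first step, 1 = second
    return sign * Delta * ((1 - B) * (G + gamma * U)
                           + B * (G + gamma + (1.0 - gamma) * U))

eps = 1.0
gamma_star = 1.0 / (1.0 + np.exp(eps / 2))      # optimal gamma for L(x) = |x| (slide 40)
x = sample_staircase(epsilon=eps, Delta=1.0, gamma=gamma_star, size=200_000)
print(np.mean(np.abs(x)))                       # ~ e^{eps/2} / (e^eps - 1) ≈ 0.96
```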

36 Proof Ideas. Key idea: divide each interval [kΔ, (k+1)Δ) into i bins and approximate the p.d.f. by a piecewise-constant function, with a_i = P(bin i) / length(bin i). (figure: bin heights a_1, a_2, a_3)

37 Proof Ideas. The p.d.f. should be monotonically decreasing in |x|. (figure: rearranging bin heights a_1, a_2, a_3 into decreasing order does not increase the cost)

38 Proof Ideas. The p.d.f. should be geometrically decreasing: dividing each interval [kΔ, (k+1)Δ) into i bins, bins that are Δ apart should satisfy a_k / a_{k+i} = e^ε. (figure with two bins per interval: a_1/a_3 = e^ε and a_2/a_4 = e^ε, with interval boundaries at Δ and 2Δ)

39 Proof Ideas. The shape of the p.d.f. in the first interval [0, Δ) should be a step function. (figure: among bins a_1, a_2, a_3, a_4 on [0, Δ), increasing a_2 and decreasing a_3 pushes the shape toward a two-level step)

40 Application: L(x) = |x|. The optimal parameter is γ = 1/(1 + e^(ε/2)), and the minimum expected noise amplitude is V(P_γ) = Δ·e^(ε/2)/(e^ε − 1). For comparison, V_Lap = Δ/ε. As ε → 0, the additive gap goes to 0; as ε → +∞, V(P_γ) = Θ(Δ·e^(−ε/2)).
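The two closed forms on this slide make the comparison with the Laplacian mechanism easy to tabulate; a small sketch:

```python
import numpy as np

def V_laplace(eps, Delta=1.0):
    """Expected noise amplitude of the Laplace mechanism: Δ/ε."""
    return Delta / eps

def V_staircase(eps, Delta=1.0):
    """Minimum expected noise amplitude of the staircase mechanism (this slide)."""
    return Delta * np.exp(eps / 2) / (np.exp(eps) - 1)

for eps in [0.1, 0.5, 1.0, 2.0, 5.0, 10.0]:
    print(f"eps = {eps:5.1f}   V_Lap / V_opt = {V_laplace(eps) / V_staircase(eps):7.2f}")
```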

41 Numeric comparison with the Laplacian mechanism. (figure: the ratio V_Lap / V_Optimal plotted against ε)

42 Numeric comparison with the Laplacian mechanism. (figure: the ratio V_Lap / V_Optimal plotted against ε, continued)

43 Application: L(x) = x² (writing b := e^(−ε)). (figure: closed-form expression for the minimum noise power)

44 Application: L(x) = x² (b := e^(−ε)). The minimum noise power is compared with the Laplacian noise power V_Lap = 2Δ²/ε². As ε → 0, the additive gap approaches cΔ² for a constant c; as ε → +∞, V(P_γ) = Θ(Δ²·e^(−2ε/3)).
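The closed form for the minimum noise power is not legible in this transcription, so the sketch below obtains it numerically: it integrates x² against the assumed staircase density interval by interval and minimizes over the parameter γ, then compares with V_Lap = 2Δ²/ε²:

```python
import numpy as np

def staircase_power(eps, gamma, Delta=1.0, kmax=400):
    """E[X^2] for the (assumed) staircase density with parameter gamma."""
    b = np.exp(-eps)
    a = (1 - b) / (2 * Delta * (gamma + (1 - gamma) * b))
    k = np.arange(kmax)
    # closed-form integrals of x^2 over the two steps of the k-th interval
    first = ((k + gamma) ** 3 - k ** 3) * Delta ** 3 / 3
    second = ((k + 1.0) ** 3 - (k + gamma) ** 3) * Delta ** 3 / 3
    return 2 * a * np.sum(b ** k * (first + b * second))

def min_staircase_power(eps, Delta=1.0):
    """Minimize the noise power over gamma by a simple grid search."""
    gammas = np.linspace(1e-3, 1.0, 2000)
    return min(staircase_power(eps, g, Delta) for g in gammas)

for eps in [0.5, 1.0, 2.0, 5.0, 10.0]:
    v_lap = 2.0 / eps ** 2                      # V_Lap = 2Δ²/ε² with Δ = 1
    print(f"eps = {eps:4.1f}   V_Lap / V_opt ≈ {v_lap / min_staircase_power(eps):6.2f}")
```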

45 Numeric comparison with the Laplacian mechanism. (figure: the ratio V_Lap / V_Optimal plotted against ε for the noise-power cost)

46 Numeric comparison with the Laplacian mechanism. (figure: the ratio V_Lap / V_Optimal plotted against ε for the noise-power cost, continued)

47 Properties of γ. Shown for L(x) = |x|^m; the results also hold for cost functions that are positive linear combinations of such moment functions.

48 Extension to discrete setting. Query function: q(D) is integer-valued.

49 Extension to discrete setting. Query function: q(D) is integer-valued. Adding data-independent noise is optimal.

50 Extension to discrete setting. Query function: q(D) is integer-valued. Adding data-independent noise is optimal. (figure: the optimal discrete noise distribution)

51 Extension to abstract setting. K: D → R, where the output range R can be arbitrary, with base measure μ. Cost function: C: D × R → [0, +∞]. Sensitivity Δ := max over r ∈ R and neighboring D1, D2 (|D1 − D2| ≤ 1) of |C(D1, r) − C(D2, r)|.

52 Extension to abstract setting. K: D → R (R arbitrary), with base measure μ; cost function C: D × R → [0, +∞]; sensitivity Δ := max over r ∈ R and neighboring D1, D2 of |C(D1, r) − C(D2, r)|. Staircase mechanism: random sampling over the output range with a staircase distribution.

53 Extension to abstract setting. K: D → R (R arbitrary), with base measure μ; cost function C: D × R → [0, +∞]; sensitivity Δ := max over r ∈ R and neighboring D1, D2 of |C(D1, r) − C(D2, r)|. Staircase mechanism: random sampling over the output range with a staircase distribution. If R is the real line and C(d, r) = |r − q(d)|, this reduces to the regular staircase mechanism.

54 Conclusion. We characterize a fundamental tradeoff between privacy and utility in DP. The staircase mechanism is the optimal mechanism for a single real-valued query and gives a huge improvement in the low-privacy (large ε) regime. The result extends to the discrete setting and an abstract setting.

55 Future Work: multidimensional setting. The query output can have multiple components: q(D) = (q_1(D), q_2(D), ..., q_n(D)).

56 Future Work: multidimensional setting. q(D) = (q_1(D), q_2(D), ..., q_n(D)). If all components are uncorrelated, each component can be perturbed independently to preserve DP.

57 Future Work: multidimensional setting. q(D) = (q_1(D), q_2(D), ..., q_n(D)). For some queries, the components are coupled together. Histogram query: one user can affect only one component.

58 Future Work: multidimensional setting. q(D) = (q_1(D), q_2(D), ..., q_n(D)). For some queries, the components are coupled together. Global sensitivity Δ := max over neighboring (D1, D2) of ||q(D1) − q(D2)||_1. Histogram query: Δ = 1.

59 Future Work: multidimensional setting. Conjecture: for histogram-like queries, the optimal noise probability distribution is multidimensional staircase-shaped.

60 Future Work: (ε, δ)-differential privacy. A mechanism K gives (ε, δ)-differential privacy if Pr[K(D1) ∈ S] ≤ e^ε · Pr[K(D2) ∈ S] + δ for neighboring D1, D2 and all S. The standard approach is to add Gaussian noise. What is the optimal noise probability distribution in this setting?
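For context, here is a minimal sketch of the standard Gaussian-noise approach mentioned on the slide, using the classical calibration σ = Δ·sqrt(2 ln(1.25/δ))/ε from the differential-privacy literature (valid for ε < 1); the calibration is an assumption on top of the slide, which only names the Gaussian approach:

```python
import numpy as np

rng = np.random.default_rng()

def gaussian_mechanism(true_answer, sensitivity, epsilon, delta):
    """(ε, δ)-DP release by adding Gaussian noise, classical calibration (ε < 1):
    sigma = Δ * sqrt(2 * ln(1.25 / δ)) / ε."""
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return true_answer + rng.normal(loc=0.0, scale=sigma)

print(gaussian_mechanism(true_answer=2.0, sensitivity=1.0, epsilon=0.5, delta=1e-5))
```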

61 Thank you!
