Approximating high-dimensional posteriors with nuisance parameters via integrated rotated Gaussian approximation (IRGA)
1 Approximating high-dimensional posteriors with nuisance parameters via integrated rotated Gaussian approximation (IRGA). Willem van den Boom, Department of Statistics and Applied Probability, National University of Singapore. Bayesian Computation for High-Dimensional Statistical Models, Institute of Mathematical Sciences, National University of Singapore, September 10, 2018.
2 Collaborators: David B. Dunson and Galen Reeves, Department of Statistical Science, Duke University.
3 Outline
- IRGA with Bayesian variable selection as an example
- Approximation accuracy
- IRGA in a more general setup
4 IRGA with Bayesian variable selection as an example
5 Standard linear model

y = Xβ + ε

with X an n × p matrix (n observations, p unknown parameters) and Gaussian errors ε ~ N(0, σ²I). The entries of β are conditionally independent given hyperparameters θ:

p(β | θ) = ∏_{j=1}^p p(β_j | θ)
6 Example: Bayesian variable selection. The entries of β are conditionally independent given hyperparameters θ, p(β | θ) = ∏_{j=1}^p p(β_j | θ), with a mixed discrete-continuous marginal prior

p(β_j | θ) = (1 − λ) δ₀(β_j) + λ N(β_j | 0, τ²),  θ = (λ, τ²)

where 1 − λ is the prior probability that β_j equals zero and N(· | 0, τ²) is the distribution of the nonzero entries.
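As a concrete illustration of this prior, a minimal simulation sketch follows; the hyperparameter values, dimensions, and variable names are illustrative choices of mine, not from the original slides:

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 100, 200
    lam, tau2, sigma2 = 0.1, 1.0, 1.0  # illustrative hyperparameters

    # Spike-and-slab prior: beta_j = 0 with probability 1 - lam, else N(0, tau2)
    nonzero = rng.random(p) < lam
    beta = np.where(nonzero, rng.normal(0.0, np.sqrt(tau2), size=p), 0.0)

    # Standard linear model y = X beta + eps with eps ~ N(0, sigma2 * I)
    X = rng.normal(size=(n, p))
    y = X @ beta + rng.normal(0.0, np.sqrt(sigma2), size=n)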
7 Inference. Goal: compute the posterior marginal distribution of the first entry,

p(β₁ | y) = ∫ p(β | y) dβ_{2:p},

e.g. for P(β₁ ≠ 0 | y), where

p(β | y) = p(β) p(y | β) / ∫ p(β) p(y | β) dβ.

Treat β_{2:p} as nuisance parameters. The number of possible subsets {j : β_j ≠ 0} grows exponentially with the number of variables p, and the posterior is a mixture of 2^p Gaussians: intractable for large p. Therefore, approximations are used for scalable inference.
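To see the 2^p structure concretely, posterior inclusion probabilities can be computed exactly for small p by enumerating every spike-and-slab configuration; a sketch under the model above (the function name and interface are mine):

    import itertools
    import numpy as np
    from scipy.stats import multivariate_normal

    def exact_inclusion_probs(y, X, sigma2, lam, tau2):
        """Exact P(beta_j != 0 | y) by enumerating all 2^p models; small p only."""
        n, p = X.shape
        logw, configs = [], []
        for gamma in itertools.product([False, True], repeat=p):
            g = np.array(gamma)
            # Marginal likelihood with beta integrated out:
            # y ~ N(0, sigma2 * I + tau2 * X_g X_g^T)
            cov = sigma2 * np.eye(n) + tau2 * X[:, g] @ X[:, g].T
            loglik = multivariate_normal.logpdf(y, mean=np.zeros(n), cov=cov)
            logprior = g.sum() * np.log(lam) + (p - g.sum()) * np.log(1 - lam)
            logw.append(loglik + logprior)
            configs.append(g)
        logw = np.array(logw)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        return sum(wi * g for wi, g in zip(w, configs))  # inclusion probabilities

The cost is exponential in p, which is exactly why IRGA and the methods on the next slide exist.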
8 Approximation methods. Most popular approximation methods fit into two groups:
- Sampling based: MCMC and Gibbs sampling, e.g. Stochastic Search Variable Selection (George & McCulloch, 1993) or Bayesian Adaptive Sampling (Clyde et al., 2011). Exact solution asymptotically, but convergence is hard to establish due to the exponential and discrete nature of the posterior.
- Deterministic: variational Bayes (Carbonetto & Stephens, 2012; Ormerod et al., 2017), expectation propagation (Hernández-Lobato et al., 2015), or EM variable selection (Ročková & George, 2014). Often unclear what quality of approximation is achieved.
IRGA provides interpretable approximations to posterior marginals and can overcome some of these issues.
9 Overview of IRGA. Goal: a marginal posterior, such as p(β₁ | y) = ∫ p(β | y) dβ_{2:p}, achieved by approximately integrating out β_{2:p}. Based on a data rotation. The approximation is transparent, which allows for theoretical analysis.
10 Overview of IRGA
1. Rotate the data to isolate the parameter of interest β₁. Introduce an auxiliary variable which summarizes the influence of the nuisance parameters β_{2:p} on β₁.
2. Use any means possible to compute/estimate the posterior mean and posterior variance of the auxiliary variable.
3. Apply a Gaussian approximation to the auxiliary variable and solve the one-dimensional integration problem to obtain the posterior approximation for β₁.
11 1: Rotate. Apply a rotation matrix Q to the data which zeros out all but the first entry in the first column of the design matrix: ỹ = Qy and X̃ = QX, so that

X̃ = ⎡ x̃_{1,1}  x̃_{1,2}  ⋯  x̃_{1,p} ⎤
    ⎢ 0        x̃_{2,2}  ⋯  x̃_{2,p} ⎥
    ⎢ ⋮         ⋮             ⋮     ⎥
    ⎣ 0        x̃_{n,2}  ⋯  x̃_{n,p} ⎦

Since Q is orthogonal, the rotated errors are still distributed as N(0, σ²I). Only the first observation depends on β₁:

ỹ₁ = x̃_{1,1} β₁ + Σ_{j=2}^p x̃_{1,j} β_j + ε̃₁.

The auxiliary variable ζ = Σ_{j=2}^p x̃_{1,j} β_j captures the influence of the nuisance parameters.
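One concrete numerical choice of Q is a Householder reflection built from the first column of X; any orthogonal matrix with the same zeroing property works, and this sketch is my construction rather than necessarily the talk's:

    import numpy as np

    def rotate(y, X):
        """Orthogonal Q (a Householder reflection) that zeros out all but the
        first entry of the first column of X; returns (Q y, Q X)."""
        x1 = X[:, 0]
        e1 = np.zeros_like(x1)
        e1[0] = np.linalg.norm(x1)
        v = x1 - e1
        if np.linalg.norm(v) < 1e-12:  # first column already aligned with e1
            return y.copy(), X.copy()
        v = v / np.linalg.norm(v)
        Q = np.eye(len(x1)) - 2.0 * np.outer(v, v)
        return Q @ y, Q @ X

Because Q is orthogonal, the rotated errors Qε remain N(0, σ²I), so the rotation itself introduces no approximation.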
12-15 1: Rotate. (Figures: step-by-step schematic of the rotation, showing the p unknown parameters β₁, …, βₚ, the n observations y₁, …, yₙ, and their transformation into the rotated data ỹ₁, …, ỹₙ.)
16 1: Rotate. (Figure: the rotated data with ζ entering only the first row.) The auxiliary variable

ζ = Σ_{j=2}^p x̃_{1,j} β_j

encapsulates the influence of the nuisance parameters.
17 2: Estimate / compute. Compute the posterior mean and variance of the auxiliary variable, E[ζ | ỹ_{2:n}] and Var[ζ | ỹ_{2:n}], using for example:
- Vector Approximate Message Passing (VAMP; Rangan et al., 2017)
- LASSO
- variational Bayes
- [your favorite method]
These quantities do not involve the target parameter β₁; see the sketch below.
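As one possible stand-in for "[your favorite method]", a sketch that uses scikit-learn's BayesianRidge (my choice for illustration) to obtain a Gaussian working posterior for β_{2:p} and propagate it to ζ = x̃_{1,2:p}ᵀ β_{2:p}:

    import numpy as np
    from sklearn.linear_model import BayesianRidge

    def zeta_moments(y_rest, X_rest, x_row):
        """E[zeta | y_{2:n}] and Var[zeta | y_{2:n}] for zeta = x_row @ beta,
        using the posterior mean and covariance of beta from Bayesian ridge."""
        fit = BayesianRidge(fit_intercept=False).fit(X_rest, y_rest)
        m = x_row @ fit.coef_           # posterior mean of zeta
        v = x_row @ fit.sigma_ @ x_row  # posterior variance of zeta
        return m, v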
18 3: Approximate. Apply a Gaussian approximation to the auxiliary variable to compute the posterior approximation:

p(β₁ | y) ∝ ∫ p(ỹ₁ | ζ, β₁) p(β₁) p(ζ | ỹ_{2:n}) dζ

Here p(ỹ₁ | ζ, β₁) is Gaussian by the assumption on the noise, p(β₁) is the prior distribution, and p(ζ | ỹ_{2:n}) is replaced by a Gaussian with the mean and variance from the previous step. The approximation of p(ζ | ỹ_{2:n}) can be accurate even if the prior and posterior are highly non-Gaussian: most low-dimensional projections of high-dimensional distributions are close to Gaussian (projection pursuit; Diaconis & Freedman, 1984). Additionally, a CLT or Bernstein-von Mises argument applies since ζ = Σ_{j=2}^p x̃_{1,j} β_j.
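With a spike-and-slab prior on β₁ and the Gaussian approximation ζ ≈ N(m, v), this one-dimensional integral is available in closed form, because ỹ₁ | β₁ is then approximately N(x̃_{1,1} β₁ + m, v + σ²). A sketch (function and argument names are mine):

    import numpy as np
    from scipy.stats import norm

    def irga_marginal(y1, x11, m, v, sigma2, lam, tau2):
        """Approximate p(beta_1 | y) under a spike-and-slab prior after
        replacing p(zeta | y_{2:n}) by N(m, v): returns P(beta_1 != 0 | y)
        and the slab's posterior mean and variance."""
        s = v + sigma2                                    # zeta plus noise variance
        ev_spike = norm.pdf(y1, loc=m, scale=np.sqrt(s))  # evidence given beta_1 = 0
        ev_slab = norm.pdf(y1, loc=m, scale=np.sqrt(x11**2 * tau2 + s))
        p_incl = lam * ev_slab / (lam * ev_slab + (1 - lam) * ev_spike)
        post_var = 1.0 / (x11**2 / s + 1.0 / tau2)        # Gaussian-Gaussian conjugacy
        post_mean = post_var * x11 * (y1 - m) / s
        return p_incl, post_mean, post_var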
19 Overview of IRGA
1. Rotate the data to isolate the parameter of interest β₁. Introduce an auxiliary variable ζ which summarizes the influence of the nuisance parameters β_{2:p} on β₁.
2. Use any means possible to compute/estimate the posterior mean and posterior variance of p(ζ | ỹ_{2:n}).
3. Apply a Gaussian approximation to p(ζ | ỹ_{2:n}) and solve the one-dimensional integration problem to obtain the approximation for p(β₁ | y).
20 Approximation accuracy
21 Gaussian approximation accuracy. (Figure: Q-Q plots of p(ζ | ỹ_{2:n}) for spike-and-slab and horseshoe priors with σ² = 1 and σ² = 0.5, from simulated data with X equal to 3,571 highly correlated SNPs from Friedman et al. (2010).)
22 Quadratic Wasserstein distance

W₂{p₁(x₁), p₂(x₂)} = inf { E(‖x₁ − x₂‖²) }^{1/2}

where the infimum is over all joint distributions on (x₁, x₂) such that x₁ ~ p₁(·) and x₂ ~ p₂(·).
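For intuition, in the univariate Gaussian case relevant here (ζ is scalar) the distance has the well-known closed form

W₂²{N(μ₁, σ₁²), N(μ₂, σ₂²)} = (μ₁ − μ₂)² + (σ₁ − σ₂)²,

attained by the monotone coupling x₂ = μ₂ + (σ₂/σ₁)(x₁ − μ₁).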
23 Gaussian approximation accuracy. Theorem: If p(β_{2:p} | ỹ_{2:n}) is nearly isotropic and its distance from its mean concentrates, then the Wasserstein distance W₂{p(ζ | ỹ_{2:n}), p̂(ζ | ỹ_{2:n})} to the Gaussian approximation p̂ is small for most vectors (x̃_{1,j})_{j=2}^p. Isotropy means that independent draws from p(β_{2:p} | ỹ_{2:n}) are not too correlated. (Figure: isotropic versus non-isotropic point clouds.)
24 Kullback-Leibler divergence

D{p₁(x₁) ‖ p₂(x₂)} = ∫ p₁(x) log [p₁(x) / p₂(x)] dx
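For reference, between univariate Gaussians the divergence also has a closed form,

D{N(μ₁, σ₁²) ‖ N(μ₂, σ₂²)} = log(σ₂/σ₁) + (σ₁² + (μ₁ − μ₂)²) / (2σ₂²) − 1/2,

which makes the bound on the next slide easy to evaluate in simple cases.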
25 Posterior approximation accuracy. Theorem:

E[ D{p(β₁ | y) ‖ p̂(β₁ | y)} | ỹ_{2:n} ] ≤ (1 / 2σ²) W₂²{p(ζ | ỹ_{2:n}), p̂(ζ | ỹ_{2:n})}

where p̂(β₁ | y) is the IRGA approximation, p̂(ζ | ỹ_{2:n}) is the Gaussian approximation, and y | β ~ N(Xβ, σ²Iₙ) with β ~ p(β).
27 Application. (Figure: error in posterior inclusion probability, measured as the difference from the Gibbs estimate, per predictor, for the integrated rotated Gaussian approximation, expectation propagation (Hernández-Lobato et al., 2015), and variational Bayes (Carbonetto & Stephens, 2012), on the diabetes data of Efron et al. (2004).)
28 IRGA in a more general setup
29 Standard linear model

y = Xβ + ε

with X an n × p matrix and p unknown parameters β.
30 Linear model with nuisance parameter

y = Xβ + F(Z) + ε

with X an n × p matrix (p < n unknown parameters β), Z an n × q matrix, and nuisance parameter F : ℝ^{n×q} → ℝⁿ.

31 Overview of IRGA. Apply a rotation matrix Q = (R, S) where R (n × p) is an orthonormal basis for the column space of X and S (n × (n − p)) for the orthogonal complement. Then the likelihood for Qᵀy factorizes as

Rᵀy | β, F ~ N(RᵀXβ + RᵀF(Z), σ²I_p)
Sᵀy | F ~ N(SᵀF(Z), σ²I_{n−p})

Only the first distribution involves β. The auxiliary variable ζ = RᵀF(Z) captures the influence of F on β. Approximate p(ζ | Sᵀy) by a Gaussian and compute

p(β | y) ∝ ∫ p(Rᵀy | β, ζ) p(β) p(ζ | Sᵀy) dζ

32 Posterior approximation accuracy. Theorem:

E[ D{p(β | y) ‖ p̂(β | y)} | Sᵀy ] ≤ (1 / 2σ²) W₂²{p(ζ | Sᵀy), p̂(ζ | Sᵀy)}

where p̂(β | y) is the IRGA approximation, p̂(ζ | Sᵀy) is the Gaussian approximation, and y | β, F ~ N(Xβ + F(Z), σ²Iₙ) with (β, F) ~ p(β) p(F).
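A minimal numpy sketch of constructing R and S via a complete QR decomposition of X, assuming X has full column rank (my construction for illustration):

    import numpy as np

    def column_space_split(X):
        """R: orthonormal basis of col(X), shape (n, p); S: orthonormal basis
        of its orthogonal complement, shape (n, n - p); Q = (R, S) is orthogonal."""
        n, p = X.shape
        Q_full, _ = np.linalg.qr(X, mode='complete')  # Q_full is n x n orthogonal
        return Q_full[:, :p], Q_full[:, p:]

Rᵀy and Sᵀy then give the two factors of the likelihood above.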
33 Advantages of IRGA
- The approximation employed is transparent and allows for analysis yielding theoretical guarantees.
- Can leverage your favorite method (e.g. lasso, approximate message passing, etc.) to produce accurate approximations of marginal posteriors.
34 Related papers
- van den Boom, W., Reeves, G. & Dunson, D. B. (2015). Scalable approximations of marginal posteriors in variable selection. arXiv:
- van den Boom, W., Dunson, D. B. & Reeves, G. (2015). Quantifying uncertainty in variable selection with arbitrary matrices. In IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing.
- van den Boom, W. (2018). Tailored Scalable Dimensionality Reduction. PhD thesis, Department of Statistical Science, Duke University.
35 References
- Carbonetto, P. & Stephens, M. (2012). Scalable variational inference for Bayesian variable selection in regression, and its accuracy in genetic association studies. Bayesian Analysis, 7.
- Clyde, M. A., Ghosh, J. & Littman, M. L. (2011). Bayesian adaptive sampling for variable selection and model averaging. Journal of Computational and Graphical Statistics, 20.
- Diaconis, P. & Freedman, D. (1984). Asymptotics of graphical projection pursuit. The Annals of Statistics, 12.
- Friedman, J., Hastie, T. & Tibshirani, R. (2010). Regularization paths for generalized linear models via coordinate descent. Journal of Statistical Software, 33(1).
- George, E. I. & McCulloch, R. E. (1993). Variable selection via Gibbs sampling. Journal of the American Statistical Association, 88.
- Hernández-Lobato, J. M., Hernández-Lobato, D. & Suárez, A. (2015). Expectation propagation in linear regression models with spike-and-slab priors. Machine Learning, 99.
- Ormerod, J. T., You, C. & Müller, S. (2017). A variational Bayes approach to variable selection. Electronic Journal of Statistics, 11.
- Ročková, V. & George, E. I. (2014). EMVS: The EM approach to Bayesian variable selection. Journal of the American Statistical Association, 109.