Statistical Inference for Food Webs
Statistical Inference for Food Webs
Part I: Bayesian Melding
Grace Chiu and Josh Gould
Department of Statistics & Actuarial Science
CMAR-Hobart Science Seminar, March 6
Outline
- Overview: existing vs. statistical approaches for
  - trophic context
  - whole-system context: ecological network analysis (ENA)
- Present ENA techniques
- ENA statistical inference: statistical perspective of mass balance; Bayesian melding
- Example: Chesapeake Bay Mesohaline Network
- Summary and conclusion
Aspects of a Food Web
... that was only for trophic relations ...
Aspects of a Food Web: (1) Trophic
Food web, trophically:
- structure of interdependence among species
- predator-prey links = feeding patterns
- links can be weighted, e.g. by predation frequency
Existing work for trophic analyses:
- examine interactions between providers (prey) and benefactors (predators)
- (semi-)quantitative techniques for systematic extraction of the meaning of these interactions, e.g. trophic hierarchy / compartments
Aspects of a Food Web
... and now, for whole-system relations ...
Aspects of a Food Web: (2) Whole System
Ecological network (whole-system food web):
- trophic compartments and substance / energy throughput
- this interdependence is subject to system balance
- the notion of balance is based on thermodynamics
Existing work: ecological network analysis (ENA)
- deterministic biophysical theory in a balance model
- quantifies the exchange of substance / energy among compartments
- extracts characteristics of these quantities ⇒ describes interactions among compartments
Disclaimer: images of food webs were generated by a Google search.
Trophic Analysis and ENA
Main goal: to understand / predict (e.g. over time) within-web interactions, based on the quantities associated with the edges (arrows) between pairs of species / compartments.
Issues:
- randomness of the quantities is ignored
- no formal statistical inference of interaction patterns or predictions
- for ENA, randomness ⇒ observed quantities don't balance
- some quantities are unobservable in the field
- computer algorithms generate the missing quantities to minimize imbalance (e.g. Ecopath / Ecosim)
⇒ further complicating any inference attempts!
A Statistician's Ideas...
Alternative trophic analysis: take a completely quantitative regression approach.
- accounts for the randomness of the data!
- accounts for substance exchange, and other variables!
- proper inference possible from regression modelling
- includes prediction inference for pairwise links and compartments, AND over time
- simple scatterplots to identify and interpret compartments
That will be Statistical Inference Part II.
Statistical Inference Part I
ENA statistical inference:
- can overcome empirical imbalance by incorporating randomness through Bayesian melding
- can fill in missing quantities by prediction inference within the Bayesian framework
- can be extended to a temporal model without explicit calibration (as opposed to, e.g., morphing multiple static analyses into a single dynamics model)
Conventional ENA
Simple example from Ulanowicz (Comput. Biol. & Chem., 2004). Fix a transfer medium, e.g. nitrogen or heat. Let
- i, j = compartment labels
- T_ij = rate of transfer of medium from i to j
- X_i = rate of exogenous input of medium to i
- E_i = rate of external transfer of medium from i
- R_i = rate of dissipation of medium from i
Balance equation: total inflow rate = total outflow rate, i.e.
  X_i + Σ_j T_ji = Σ_j T_ij + E_i + R_i
Ideally, do this over all n compartments and K media.
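As a quick sanity check of this bookkeeping, the per-compartment balance residual can be computed directly. The numbers below are purely illustrative, not from any real web:

```python
import numpy as np

# Hypothetical 4-compartment example: T[i, j] is the transfer rate
# from compartment i to compartment j (illustrative numbers only).
T = np.array([[0.0, 2.0, 1.0, 0.5],
              [1.5, 0.0, 0.8, 0.2],
              [0.3, 0.6, 0.0, 0.4],
              [0.1, 0.2, 0.7, 0.0]])
X = np.array([2.0, 0.5, 1.0, 0.8])   # exogenous inputs
E = np.array([0.4, 0.3, 0.2, 0.1])   # external transfers out
R = np.array([0.2, 0.3, 0.4, 0.2])   # dissipation

# Balance residual per compartment: inflow minus outflow.
inflow = X + T.sum(axis=0)           # X_i + sum_j T_ji
outflow = T.sum(axis=1) + E + R      # sum_j T_ij + E_i + R_i
residual = inflow - outflow
print(residual)                      # all zero only for a perfectly balanced web
```

With field data the residuals are essentially never zero, which is the point the next slides make.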
Conventional ENA: Ideally
System of n·K equations, where the superscript (k) indexes the medium and a "+" subscript denotes summation over that index:
  X_1^(1) + T_{+1}^(1) = T_{1+}^(1) + E_1^(1) + R_1^(1)
  ⋮
  X_n^(1) + T_{+n}^(1) = T_{n+}^(1) + E_n^(1) + R_n^(1)
  X_1^(2) + T_{+1}^(2) = T_{1+}^(2) + E_1^(2) + R_1^(2)
  ⋮
  X_n^(2) + T_{+n}^(2) = T_{n+}^(2) + E_n^(2) + R_n^(2)
  ⋮
  X_1^(K) + T_{+1}^(K) = T_{1+}^(K) + E_1^(K) + R_1^(K)
  ⋮
  X_n^(K) + T_{+n}^(K) = T_{n+}^(K) + E_n^(K) + R_n^(K)
Conventional ENA
  X_1^(1) + T_{+1}^(1) = T_{1+}^(1) + E_1^(1) + R_1^(1)
  ⋮
  X_n^(1) + T_{+n}^(1) = T_{n+}^(1) + E_n^(1) + R_n^(1)
Randomness ⇒ field data never satisfy the equality.
Worse yet, only some quantities are observable in the field...
Conventional ENA: example for n = 4, K = 1
Observe: X_i, E_i, R_i for all i; T_ij for all (i, j) except i = 3.
Unknown: T_31, T_32, T_34 (hence T_3+).
  X_1 + T_21 + T_31 + T_41 = T_12 + T_13 + T_14 + E_1 + R_1
  X_2 + T_12 + T_32 + T_42 = T_21 + T_23 + T_24 + E_2 + R_2
  X_3 + T_13 + T_23 + T_43 = T_31 + T_32 + T_34 + E_3 + R_3
  X_4 + T_14 + T_24 + T_34 = T_41 + T_42 + T_43 + E_4 + R_4
No balance ⇒ no theoretical solution for T_3+.
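The failure can be made concrete. With noisy observed quantities (the numbers below are illustrative, not field data), the four balance equations form an overdetermined linear system in the unknowns (T_31, T_32, T_34); least squares leaves a strictly positive residual, i.e. no choice of the T_3j's balances the system exactly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical noisy observations for n = 4 compartments (0-based indices,
# so the slides' compartment 3 is row index 2 of T, which we treat as unobserved).
X = np.array([3.0, 1.0, 2.0, 1.5])
E = np.array([0.5, 0.4, 0.3, 0.2])
R = np.array([0.6, 0.5, 0.7, 0.4])
T = rng.uniform(0.1, 1.0, size=(4, 4))
np.fill_diagonal(T, 0.0)

# Known parts of each balance equation, excluding the unobserved row T[2, :].
known_in = X + T.sum(axis=0) - T[2, :]
known_out = T.sum(axis=1) + E + R
known_out[2] -= T[2, :].sum()

# Unknowns u = (T_31, T_32, T_34) enter the four equations as A u = b:
A = np.array([[1.0, 0.0, 0.0],     # T_31 is an inflow for compartment 1
              [0.0, 1.0, 0.0],     # T_32 is an inflow for compartment 2
              [-1.0, -1.0, -1.0],  # all three are outflows for compartment 3
              [0.0, 0.0, 1.0]])    # T_34 is an inflow for compartment 4
b = known_out - known_in

# Four equations, three unknowns: with unbalanced data the least-squares
# residual is strictly positive, so exact balance is unattainable.
u_hat, res, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(res)  # nonzero sum of squared residuals
```

Here the leftover residual reflects the whole-system imbalance of the observed totals, which no amount of tinkering with the T_3j's can absorb.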
Conventional ENA
Remedy: deduce the T_3j's from observable auxiliary quantities, e.g. a theoretical relationship among T_ij, compartment production, and biomass:
  f(P_ij, B_ij, ...) = T_ij
Yet even if a deduced T_ij were free of uncertainty, the system of equations fails regardless... what then?
Conventional ENA
Computer software to the rescue!
- tinker with the quantities, subject to certain criteria, until the equality (almost) holds
- the criteria built into the software can be mysterious to the user
- restrict the tinkering to deduced quantities only, if possible
Lingo:
- (program) input: observed and deduced quantities
- (program) output: balanced quantities
Conventional ENA
In light of im(possible)balance, theory-based deduction, coerced balance...
Million $ question: how confident are we in the numbers??
Conventional ENA: Confidence
Existing attempts: sensitivity analyses
- perturb the program input
- monitor the behavior of the program output
but... how to make inference for the underlying network structure?
Perspectives of Mass Balance
Physics: in = out
Statistics: E(in) = E(out)
Simplest example. Let
- W_i = X_i + T_{+i}, with mean µ_W
- U_i = T_{i+} + E_i, with mean µ_U
- R_i, with mean µ_R
⇒ balance model: µ_W = µ_U + µ_R
- a single balance equation in unobservable quantities
- estimation and confidence statements via statistical inference
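A small simulation illustrates this statistical reading of balance. The exponential distributions and the means below are assumptions for the sketch (though they foreshadow the likelihood used later in the talk):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed illustrative means satisfying the balance model mu_W = mu_U + mu_R.
mu_U, mu_R = 60.0, 25.0
mu_W = mu_U + mu_R

# Exponential draws standing in for field records of W, U, R.
n = 100_000
W = rng.exponential(mu_W, size=n)
U = rng.exponential(mu_U, size=n)
R = rng.exponential(mu_R, size=n)

# No individual record balances (physics-style balance fails pointwise)...
print((np.abs(W - U - R) > 1e-9).all())      # True with probability 1
# ...but balance holds in expectation: E(W) = E(U) + E(R).
print(abs(W.mean() - U.mean() - R.mean()))   # near 0 relative to the data's scale
```

This is exactly why observed quantities "never balance" yet a balance model can still be correct: the equality constrains the means, not the records.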
Bayesian Melding for ENA Inference
Elements of deterministic modelling:
- deterministic model M(·)
- model input θ = E(observables)
- model output φ = E(unobservables)
- M : θ → φ, or φ := M(θ)
Rationale for this choice of input/output:
- to make statistical inference, we need assumptions about the statistical behavior of the quantities
- experience with observed quantities can be the basis of such assumptions
- then the statistical behavior of E(unobservables) is defined by M through that of E(observables)
Bayesian Melding for ENA Inference
Among W_i, U_i, R_i, suppose R_i is unobservable. Rewrite the balance model as
  µ_R = µ_W − µ_U, or φ = M(θ),
where
- φ = µ_R is the model output
- θ = (µ_W, µ_U) is the model input
- M(θ) = (1, −1)θ is the model M : θ → φ
Bayesian Melding for ENA Inference
Sampling from the input prior g(θ) and pushing each draw through M yields the induced output prior h*(φ):
  θ^(1) → M(θ^(1)) = φ^(1)
  ⋮
  θ^(m) → M(θ^(m)) = φ^(m)
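This push-forward construction is easy to mimic in code. The bivariate log-normal input prior below is an assumed illustration, not the talk's actual specification:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed input prior g(theta) for theta = (mu_W, mu_U):
# a bivariate log-normal, purely illustrative.
m = 10_000
log_theta = rng.multivariate_normal([np.log(80.0), np.log(55.0)],
                                    [[0.10, 0.05], [0.05, 0.10]],
                                    size=m)
theta = np.exp(log_theta)        # draws theta^(1), ..., theta^(m) from g

# Push each draw through the balance model M(theta) = mu_W - mu_U:
phi = theta[:, 0] - theta[:, 1]  # a sample from the induced prior h*(phi)

print(phi.mean())                # Monte Carlo estimate of the induced-prior mean
```

Any summary of h*(φ) (density estimate, quantiles, moments) is then read off this Monte Carlo sample.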
Bayesian Melding for ENA Inference
The melded prior for φ is
  q(φ) ∝ h*(φ)^γ h(φ)^(1−γ)   for some γ ∈ (0, 1).
What's γ?
- arbitrary in principle
- can be specified to reflect expert opinion on the relative reliability of h versus h* (or M)
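On a grid, the melded prior is just pointwise logarithmic pooling followed by renormalization. Both densities below are assumed stand-ins, chosen only to make the pooling visible:

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Illustrative component densities (assumed, not from the talk):
# a specified prior h(phi) and an induced prior h*(phi) on a grid.
phi_grid = np.linspace(0.0, 60.0, 601)
dphi = phi_grid[1] - phi_grid[0]
h = normal_pdf(phi_grid, 25.0, 8.0)       # specified prior h(phi)
h_star = normal_pdf(phi_grid, 30.0, 5.0)  # induced prior h*(phi)

# Logarithmic pooling: q(phi) proportional to h*(phi)^gamma * h(phi)^(1-gamma).
gamma = 0.5
q = h_star ** gamma * h ** (1.0 - gamma)
q /= q.sum() * dphi                       # renormalize on the grid

mode = phi_grid[np.argmax(q)]
print(mode)  # the melded mode lies between the two component modes
```

Because the pooling is geometric, q stays a proper density after renormalization and interpolates between h and h* as γ moves from 0 to 1.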
Bayesian Melding for ENA Inference
Bayesian inference:
- specify a prior and likelihood for φ, θ
- obtain the posterior for φ, θ ⇒ posterior inference
- predictions for the missing T_3i's via the posterior predictive
Bayesian melding (due to Poole & Raftery, 2000):
- two priors for φ: the specified prior h(φ), and the induced prior h*(φ) from cranking the prior for θ through M
- combine both priors ⇒ melded prior for φ, q(φ)
- crank the melded prior q(φ) through M^(−1) ⇒ melded prior for θ, p(θ) (there are tricks to overcome the non-invertibility of M)
- posterior inference based on p and M
Bayesian Melding for ENA Inference
Posterior for θ:
  π_θ(θ) ∝ p(θ) L_θ(θ) L_φ(M(θ))
Posterior for φ:
  M : π_θ(θ) → π_φ(φ)
Hallelujah!
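One common way to compute this posterior is sampling-importance-resampling in the spirit of Poole & Raftery (2000): draw θ from the input prior, weight each draw by the melding correction [h(φ)/h*(φ)]^(1−γ) times both likelihoods, and resample. Everything below (data, hyperparameters, the kernel density estimate of h*) is an assumed illustration of the simple µ_R = µ_W − µ_U model, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "field data": exponential records whose true means satisfy
# mu_W = mu_U + mu_R (85 = 60 + 25). Illustrative only.
n = 36
W = rng.exponential(85.0, size=n)
U = rng.exponential(60.0, size=n)
R = rng.exponential(25.0, size=n)

def exp_loglik(x, mu):
    # log-likelihood of i.i.d. Exponential(mean mu) data x, vectorized over mu
    return -x.size * np.log(mu) - x.sum() / mu

# Input prior g(theta), theta = (mu_W, mu_U): independent log-normals (assumed).
m = 4000
theta = np.exp(rng.normal([np.log(80.0), np.log(55.0)], 0.5, size=(m, 2)))
phi = theta[:, 0] - theta[:, 1]          # induced draws of phi = M(theta)
keep = phi > 0                           # mu_R must be positive
theta, phi = theta[keep], phi[keep]

# Specified prior h(phi): log-normal (assumed). Induced prior h*(phi):
# Gaussian kernel density estimate over the induced draws.
def h(p):
    z = (np.log(p) - np.log(25.0)) / 0.5
    return np.exp(-0.5 * z ** 2) / (p * 0.5 * np.sqrt(2.0 * np.pi))

bw = 1.06 * phi.std() * phi.size ** (-0.2)   # Silverman's rule of thumb
def h_star(p):
    k = np.exp(-0.5 * ((p[:, None] - phi[None, :]) / bw) ** 2)
    return k.mean(axis=1) / (bw * np.sqrt(2.0 * np.pi))

# Importance weights: melding correction times both likelihoods.
gamma = 0.5
logw = ((1.0 - gamma) * (np.log(h(phi)) - np.log(h_star(phi)))
        + exp_loglik(W, theta[:, 0])
        + exp_loglik(U, theta[:, 1])
        + exp_loglik(R, phi))
w = np.exp(logw - logw.max())
idx = rng.choice(phi.size, size=2000, p=w / w.sum())
post_theta, post_phi = theta[idx], phi[idx]

print(post_phi.mean())  # posterior mean of mu_R
```

The resampled draws (post_theta, post_phi) automatically respect φ = M(θ), so the posterior for φ really is the push-forward of π_θ through M.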
Application: Chesapeake Bay Summer Network
Based on Baird & Ulanowicz (1989), Ecological Monographs 59, and ulan/ntwk/datall.zip
Application: Chesapeake Bay Summer Network
- transfer medium: carbon (g/m²/summer)
- i = 1, ..., 36 compartments
Elements of Bayesian inference:
- likelihood: W_i, U_i, R_i | θ, φ are independent exponentials
- specified prior: (θ, φ) trivariate log-normal
⇒ W_i, U_i, R_i are marginally dependent (as would be necessary due to theoretical balance)
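The remark about marginal dependence can be checked by simulation: conditionally independent exponentials become marginally dependent once their means share a correlated prior. The trivariate log-normal hyperparameters below are assumed for illustration (the talk does not list them):

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed illustrative trivariate log-normal prior on (mu_W, mu_U, mu_R)
# with positive correlations between the log-means.
mean = np.log([85.0, 60.0, 25.0])
cov = np.array([[0.30, 0.20, 0.10],
                [0.20, 0.30, 0.10],
                [0.10, 0.10, 0.30]])
mus = np.exp(rng.multivariate_normal(mean, cov, size=50_000))

# Conditionally independent exponential observations given the means...
W = rng.exponential(mus[:, 0])
U = rng.exponential(mus[:, 1])
R = rng.exponential(mus[:, 2])

# ...which are nevertheless marginally dependent, because the means are:
print(np.corrcoef(W, U)[0, 1] > 0.05)  # -> True
```

Integrating out the correlated means transfers their dependence to the observables, which is what the balance constraint requires.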
Application: Chesapeake Bay Summer Network
Note: instead of painstaking extraction of the (unbalanced) data from the flow diagram, we adopted the online data (already balanced); ideally we would use the former.
Application: Chesapeake Bay Summer Network
[Figure: data for W, U, R, with the specified prior and melded posterior for θ_1, θ_2, and φ.]
Application: Chesapeake Bay Summer Network
Estimates and 95% confidence intervals:

                                 θ_1 = µ_W        θ_2 = µ_U         φ = µ_R
  Melding: posterior mean
           95% HPD interval      (72.69, )        (47.99, 80.84)    (17.57, 33.23)
  Classical: 95% CLT interval    (33.24, )        (20.70, )         (4.24, 43.80)

Classical intervals are MUCH WIDER!
Application: Chesapeake Bay Summer Network
Dependence among θ_1, θ_2, φ:
[Figure: pairwise scatterplots of θ_1, θ_2, and φ.]
High dependence is presumed among all three due to theoretical balance. Inference indicates the dependence is only high between µ_W and µ_U ⇒ new insight!
Note: inference on the dependence structure is NOT possible with classical inference (which treats the µ's as constants).
Summary
(1) Physics: each quantity (variable) must satisfy within-compartment balance
      ⇒ collapse compartments to allow balance ⇒ wrong biology
    Statistics: no random variable needs to satisfy balance; no compartment needs to satisfy balance (though it can be, as long as replicated observations are available per compartment); random variables have expectations that satisfy within-system balance
      ⇒ the more compartments (i.e. data) the better!
Summary (continued)
(2) Physics: unobservable variables are deduced from auxiliary theory
    Statistics: statistical (inferential) prediction possible in certain scenarios, e.g. a variable observed for only some compartments; e.g. deducing unobserved quantities through formal regression (both in progress)
(3) Physics: no formal inference / confidence statements for any quantity
    Statistics: possible for (a) any quantity in the system-balance equation, and (b) unobservable variables in some cases (see (2))
(4) Physics: multiple media on the same system is hard to impossible
    Statistics: perceivably straightforward (in progress)
Conclusion
So why statistical inference for food webs?
- statistical models are more honest: they properly acknowledge uncertainty
- statistical inference-based techniques are tractable
- proper prediction inference is possible (straightforward within the Bayesian framework)
ENA with Bayesian melding additionally:
- overcomes empirical imbalance under theoretical balance
- soundly integrates statistical practice with physical theory, an approach often preferred by scientists over purely empirically driven ones
Thank you!
This presentation is available from: gchiu/talks/csiro-hobart09.pdf
Articles:
- Chiu & Gould (submitted)
- Chiu & Gould (2008), U of Waterloo Working Paper # navigation/techreports/08workingpapers/08-07.pdf
More informationIntroduction into Bayesian statistics
Introduction into Bayesian statistics Maxim Kochurov EF MSU November 15, 2016 Maxim Kochurov Introduction into Bayesian statistics EF MSU 1 / 7 Content 1 Framework Notations 2 Difference Bayesians vs Frequentists
More informationBayesian Hierarchical Modelling: Incorporating spatial information in water resources assessment and accounting
Bayesian Hierarchical Modelling: Incorporating spatial information in water resources assessment and accounting Grace Chiu & Eric Lehmann (CSIRO Mathematics, Informatics and Statistics) A water information
More informationIncorporating the Effects of Traffic Signal Progression Into the Proposed Incremental Queue Accumulation (IQA) Method
#06-0107 Incorporating the Effects of Traffic Signal Progression Into the Proposed Incremental Queue Accumulation (IQA) Method Dennis W. Strong, President Strong Concepts 1249 Shermer Road, Suite 100 Northbrook,
More informationntopic Organic Traffic Study
ntopic Organic Traffic Study 1 Abstract The objective of this study is to determine whether content optimization solely driven by ntopic recommendations impacts organic search traffic from Google. The
More informationBayesian network modeling. 1
Bayesian network modeling http://springuniversity.bc3research.org/ 1 Probabilistic vs. deterministic modeling approaches Probabilistic Explanatory power (e.g., r 2 ) Explanation why Based on inductive
More informationHow to build an automatic statistician
How to build an automatic statistician James Robert Lloyd 1, David Duvenaud 1, Roger Grosse 2, Joshua Tenenbaum 2, Zoubin Ghahramani 1 1: Department of Engineering, University of Cambridge, UK 2: Massachusetts
More informationShould all Machine Learning be Bayesian? Should all Bayesian models be non-parametric?
Should all Machine Learning be Bayesian? Should all Bayesian models be non-parametric? Zoubin Ghahramani Department of Engineering University of Cambridge, UK zoubin@eng.cam.ac.uk http://learning.eng.cam.ac.uk/zoubin/
More informationAn Brief Overview of Particle Filtering
1 An Brief Overview of Particle Filtering Adam M. Johansen a.m.johansen@warwick.ac.uk www2.warwick.ac.uk/fac/sci/statistics/staff/academic/johansen/talks/ May 11th, 2010 Warwick University Centre for Systems
More informationσ(a) = a N (x; 0, 1 2 ) dx. σ(a) = Φ(a) =
Until now we have always worked with likelihoods and prior distributions that were conjugate to each other, allowing the computation of the posterior distribution to be done in closed form. Unfortunately,
More informationInvestigation into the use of confidence indicators with calibration
WORKSHOP ON FRONTIERS IN BENCHMARKING TECHNIQUES AND THEIR APPLICATION TO OFFICIAL STATISTICS 7 8 APRIL 2005 Investigation into the use of confidence indicators with calibration Gerard Keogh and Dave Jennings
More informationCPSC 340: Machine Learning and Data Mining. MLE and MAP Fall 2017
CPSC 340: Machine Learning and Data Mining MLE and MAP Fall 2017 Assignment 3: Admin 1 late day to hand in tonight, 2 late days for Wednesday. Assignment 4: Due Friday of next week. Last Time: Multi-Class
More informationPenalized Loss functions for Bayesian Model Choice
Penalized Loss functions for Bayesian Model Choice Martyn International Agency for Research on Cancer Lyon, France 13 November 2009 The pure approach For a Bayesian purist, all uncertainty is represented
More informationCOS513 LECTURE 8 STATISTICAL CONCEPTS
COS513 LECTURE 8 STATISTICAL CONCEPTS NIKOLAI SLAVOV AND ANKUR PARIKH 1. MAKING MEANINGFUL STATEMENTS FROM JOINT PROBABILITY DISTRIBUTIONS. A graphical model (GM) represents a family of probability distributions
More informationBayesian vs frequentist techniques for the analysis of binary outcome data
1 Bayesian vs frequentist techniques for the analysis of binary outcome data By M. Stapleton Abstract We compare Bayesian and frequentist techniques for analysing binary outcome data. Such data are commonly
More informationStatistical inference
Statistical inference Contents 1. Main definitions 2. Estimation 3. Testing L. Trapani MSc Induction - Statistical inference 1 1 Introduction: definition and preliminary theory In this chapter, we shall
More informationA First Course on Kinetics and Reaction Engineering Supplemental Unit S4. Numerically Fitting Models to Data
Supplemental Unit S4. Numerically Fitting Models to Data Defining the Problem Many software routines for fitting a model to experimental data can only be used when the model is of a pre-defined mathematical
More informationICML Scalable Bayesian Inference on Point processes. with Gaussian Processes. Yves-Laurent Kom Samo & Stephen Roberts
ICML 2015 Scalable Nonparametric Bayesian Inference on Point Processes with Gaussian Processes Machine Learning Research Group and Oxford-Man Institute University of Oxford July 8, 2015 Point Processes
More informationStatistical Methods in Particle Physics
Statistical Methods in Particle Physics Lecture 11 January 7, 2013 Silvia Masciocchi, GSI Darmstadt s.masciocchi@gsi.de Winter Semester 2012 / 13 Outline How to communicate the statistical uncertainty
More informationinterval forecasting
Interval Forecasting Based on Chapter 7 of the Time Series Forecasting by Chatfield Econometric Forecasting, January 2008 Outline 1 2 3 4 5 Terminology Interval Forecasts Density Forecast Fan Chart Most
More informationStructural Uncertainty in Health Economic Decision Models
Structural Uncertainty in Health Economic Decision Models Mark Strong 1, Hazel Pilgrim 1, Jeremy Oakley 2, Jim Chilcott 1 December 2009 1. School of Health and Related Research, University of Sheffield,
More informationBayesian Econometrics
Bayesian Econometrics Christopher A. Sims Princeton University sims@princeton.edu September 20, 2016 Outline I. The difference between Bayesian and non-bayesian inference. II. Confidence sets and confidence
More informationDecision theory. 1 We may also consider randomized decision rules, where δ maps observed data D to a probability distribution over
Point estimation Suppose we are interested in the value of a parameter θ, for example the unknown bias of a coin. We have already seen how one may use the Bayesian method to reason about θ; namely, we
More informationBios 6649: Clinical Trials - Statistical Design and Monitoring
Bios 6649: Clinical Trials - Statistical Design and Monitoring Spring Semester 2015 John M. Kittelson Department of Biostatistics & nformatics Colorado School of Public Health University of Colorado Denver
More informationStatistical Tools and Techniques for Solar Astronomers
Statistical Tools and Techniques for Solar Astronomers Alexander W Blocker Nathan Stein SolarStat 2012 Outline Outline 1 Introduction & Objectives 2 Statistical issues with astronomical data 3 Example:
More informationProbability and Statistics
Probability and Statistics Kristel Van Steen, PhD 2 Montefiore Institute - Systems and Modeling GIGA - Bioinformatics ULg kristel.vansteen@ulg.ac.be CHAPTER 4: IT IS ALL ABOUT DATA 4a - 1 CHAPTER 4: IT
More informationOn the errors introduced by the naive Bayes independence assumption
On the errors introduced by the naive Bayes independence assumption Author Matthijs de Wachter 3671100 Utrecht University Master Thesis Artificial Intelligence Supervisor Dr. Silja Renooij Department of
More informationSTAT 425: Introduction to Bayesian Analysis
STAT 425: Introduction to Bayesian Analysis Marina Vannucci Rice University, USA Fall 2017 Marina Vannucci (Rice University, USA) Bayesian Analysis (Part 1) Fall 2017 1 / 10 Lecture 7: Prior Types Subjective
More informationDeep Poisson Factorization Machines: a factor analysis model for mapping behaviors in journalist ecosystem
000 001 002 003 004 005 006 007 008 009 010 011 012 013 014 015 016 017 018 019 020 021 022 023 024 025 026 027 028 029 030 031 032 033 034 035 036 037 038 039 040 041 042 043 044 045 046 047 048 049 050
More informationCPSC 340: Machine Learning and Data Mining
CPSC 340: Machine Learning and Data Mining MLE and MAP Original version of these slides by Mark Schmidt, with modifications by Mike Gelbart. 1 Admin Assignment 4: Due tonight. Assignment 5: Will be released
More informationUncertainty analysis of nonpoint source pollution modeling:
2013 SWAT Conference Uncertainty analysis of nonpoint source pollution modeling: An important implication for Soil and Water Assessment Tool Professor Zhenyao Shen 2013-07-17 Toulouse Contents 1 2 3 4
More informationPredictive Engineering and Computational Sciences. Research Challenges in VUQ. Robert Moser. The University of Texas at Austin.
PECOS Predictive Engineering and Computational Sciences Research Challenges in VUQ Robert Moser The University of Texas at Austin March 2012 Robert Moser 1 / 9 Justifying Extrapolitive Predictions Need
More informationSTA 4273H: Statistical Machine Learning
STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistical Sciences! rsalakhu@cs.toronto.edu! h0p://www.cs.utoronto.ca/~rsalakhu/ Lecture 7 Approximate
More informationIntroduction to Bayesian Statistics
Bayesian Parameter Estimation Introduction to Bayesian Statistics Harvey Thornburg Center for Computer Research in Music and Acoustics (CCRMA) Department of Music, Stanford University Stanford, California
More informationA decision theoretic approach to Imputation in finite population sampling
A decision theoretic approach to Imputation in finite population sampling Glen Meeden School of Statistics University of Minnesota Minneapolis, MN 55455 August 1997 Revised May and November 1999 To appear
More informationA Note on Lenk s Correction of the Harmonic Mean Estimator
Central European Journal of Economic Modelling and Econometrics Note on Lenk s Correction of the Harmonic Mean Estimator nna Pajor, Jacek Osiewalski Submitted: 5.2.203, ccepted: 30.0.204 bstract The paper
More informationIntroduction to Design and Analysis of Computer Experiments
Introduction to Design and Analysis of Thomas Santner Department of Statistics The Ohio State University October 2010 Outline Empirical Experimentation Empirical Experimentation Physical Experiments (agriculture,
More informationConfidence Distribution
Confidence Distribution Xie and Singh (2013): Confidence distribution, the frequentist distribution estimator of a parameter: A Review Céline Cunen, 15/09/2014 Outline of Article Introduction The concept
More informationLecture 13 Fundamentals of Bayesian Inference
Lecture 13 Fundamentals of Bayesian Inference Dennis Sun Stats 253 August 11, 2014 Outline of Lecture 1 Bayesian Models 2 Modeling Correlations Using Bayes 3 The Universal Algorithm 4 BUGS 5 Wrapping Up
More informationBayesian Melding. Assessing Uncertainty in UrbanSim. University of Washington
Bayesian Melding Assessing Uncertainty in UrbanSim Hana Ševčíková University of Washington hana@stat.washington.edu Joint work with Paul Waddell and Adrian Raftery University of Washington UrbanSim Workshop,
More informationarxiv: v1 [physics.data-an] 2 Mar 2011
Incorporating Nuisance Parameters in Likelihoods for Multisource Spectra J. S. Conway University of California, Davis, USA arxiv:1103.0354v1 [physics.data-an] Mar 011 1 Overview Abstract We describe here
More informationStatistics: Learning models from data
DS-GA 1002 Lecture notes 5 October 19, 2015 Statistics: Learning models from data Learning models from data that are assumed to be generated probabilistically from a certain unknown distribution is a crucial
More informationFE670 Algorithmic Trading Strategies. Stevens Institute of Technology
FE670 Algorithmic Trading Strategies Lecture 3. Factor Models and Their Estimation Steve Yang Stevens Institute of Technology 09/12/2012 Outline 1 The Notion of Factors 2 Factor Analysis via Maximum Likelihood
More informationBayesian inference. Fredrik Ronquist and Peter Beerli. October 3, 2007
Bayesian inference Fredrik Ronquist and Peter Beerli October 3, 2007 1 Introduction The last few decades has seen a growing interest in Bayesian inference, an alternative approach to statistical inference.
More informationFORECASTING STANDARDS CHECKLIST
FORECASTING STANDARDS CHECKLIST An electronic version of this checklist is available on the Forecasting Principles Web site. PROBLEM 1. Setting Objectives 1.1. Describe decisions that might be affected
More informationQuantile POD for Hit-Miss Data
Quantile POD for Hit-Miss Data Yew-Meng Koh a and William Q. Meeker a a Center for Nondestructive Evaluation, Department of Statistics, Iowa State niversity, Ames, Iowa 50010 Abstract. Probability of detection
More informationChapter Three. Hypothesis Testing
3.1 Introduction The final phase of analyzing data is to make a decision concerning a set of choices or options. Should I invest in stocks or bonds? Should a new product be marketed? Are my products being
More informationBayesian Reliability Analysis: Statistical Challenges from Science-Based Stockpile Stewardship
: Statistical Challenges from Science-Based Stockpile Stewardship Alyson G. Wilson, Ph.D. agw@lanl.gov Statistical Sciences Group Los Alamos National Laboratory May 22, 28 Acknowledgments Christine Anderson-Cook
More informationBAYESIAN ANALYSIS OF DOSE-RESPONSE CALIBRATION CURVES
Libraries Annual Conference on Applied Statistics in Agriculture 2005-17th Annual Conference Proceedings BAYESIAN ANALYSIS OF DOSE-RESPONSE CALIBRATION CURVES William J. Price Bahman Shafii Follow this
More informationStandard Errors & Confidence Intervals. N(0, I( β) 1 ), I( β) = [ 2 l(β, φ; y) β i β β= β j
Standard Errors & Confidence Intervals β β asy N(0, I( β) 1 ), where I( β) = [ 2 l(β, φ; y) ] β i β β= β j We can obtain asymptotic 100(1 α)% confidence intervals for β j using: β j ± Z 1 α/2 se( β j )
More informationBAYESIAN INFERENCE ON MIXTURE OF GEOMETRIC WITH DEGENERATE DISTRIBUTION: ZERO INFLATED GEOMETRIC DISTRIBUTION
IJRRAS 3 () October www.arpapress.com/volumes/vol3issue/ijrras_3 5.pdf BAYESIAN INFERENCE ON MIXTURE OF GEOMETRIC WITH DEGENERATE DISTRIBUTION: ZERO INFLATED GEOMETRIC DISTRIBUTION Mayuri Pandya, Hardik
More informationCPSC 540: Machine Learning
CPSC 540: Machine Learning MCMC and Non-Parametric Bayes Mark Schmidt University of British Columbia Winter 2016 Admin I went through project proposals: Some of you got a message on Piazza. No news is
More informationAges of stellar populations from color-magnitude diagrams. Paul Baines. September 30, 2008
Ages of stellar populations from color-magnitude diagrams Paul Baines Department of Statistics Harvard University September 30, 2008 Context & Example Welcome! Today we will look at using hierarchical
More informationBayesian Inference. Chapter 4: Regression and Hierarchical Models
Bayesian Inference Chapter 4: Regression and Hierarchical Models Conchi Ausín and Mike Wiper Department of Statistics Universidad Carlos III de Madrid Advanced Statistics and Data Mining Summer School
More informationABC methods for phase-type distributions with applications in insurance risk problems
ABC methods for phase-type with applications problems Concepcion Ausin, Department of Statistics, Universidad Carlos III de Madrid Joint work with: Pedro Galeano, Universidad Carlos III de Madrid Simon
More informationAreal data models. Spatial smoothers. Brook s Lemma and Gibbs distribution. CAR models Gaussian case Non-Gaussian case
Areal data models Spatial smoothers Brook s Lemma and Gibbs distribution CAR models Gaussian case Non-Gaussian case SAR models Gaussian case Non-Gaussian case CAR vs. SAR STAR models Inference for areal
More informationBayesian linear regression
Bayesian linear regression Linear regression is the basis of most statistical modeling. The model is Y i = X T i β + ε i, where Y i is the continuous response X i = (X i1,..., X ip ) T is the corresponding
More informationBayesian Semi-supervised Learning with Deep Generative Models
Bayesian Semi-supervised Learning with Deep Generative Models Jonathan Gordon Department of Engineering Cambridge University jg801@cam.ac.uk José Miguel Hernández-Lobato Department of Engineering Cambridge
More informationData Mining Chapter 4: Data Analysis and Uncertainty Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University
Data Mining Chapter 4: Data Analysis and Uncertainty Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Why uncertainty? Why should data mining care about uncertainty? We
More informationDSGE Methods. Estimation of DSGE models: GMM and Indirect Inference. Willi Mutschler, M.Sc.
DSGE Methods Estimation of DSGE models: GMM and Indirect Inference Willi Mutschler, M.Sc. Institute of Econometrics and Economic Statistics University of Münster willi.mutschler@wiwi.uni-muenster.de Summer
More informationLinear Models 1. Isfahan University of Technology Fall Semester, 2014
Linear Models 1 Isfahan University of Technology Fall Semester, 2014 References: [1] G. A. F., Seber and A. J. Lee (2003). Linear Regression Analysis (2nd ed.). Hoboken, NJ: Wiley. [2] A. C. Rencher and
More informationIntroduction to Design of Experiments
Introduction to Design of Experiments Jean-Marc Vincent and Arnaud Legrand Laboratory ID-IMAG MESCAL Project Universities of Grenoble {Jean-Marc.Vincent,Arnaud.Legrand}@imag.fr November 20, 2011 J.-M.
More informationHierarchical Models & Bayesian Model Selection
Hierarchical Models & Bayesian Model Selection Geoffrey Roeder Departments of Computer Science and Statistics University of British Columbia Jan. 20, 2016 Contact information Please report any typos or
More informationA Discussion of the Bayesian Approach
A Discussion of the Bayesian Approach Reference: Chapter 10 of Theoretical Statistics, Cox and Hinkley, 1974 and Sujit Ghosh s lecture notes David Madigan Statistics The subject of statistics concerns
More informationBased on slides by Richard Zemel
CSC 412/2506 Winter 2018 Probabilistic Learning and Reasoning Lecture 3: Directed Graphical Models and Latent Variables Based on slides by Richard Zemel Learning outcomes What aspects of a model can we
More informationhsnim: Hyper Scalable Network Inference Machine for Scale-Free Protein-Protein Interaction Networks Inference
CS 229 Project Report (TR# MSB2010) Submitted 12/10/2010 hsnim: Hyper Scalable Network Inference Machine for Scale-Free Protein-Protein Interaction Networks Inference Muhammad Shoaib Sehgal Computer Science
More informationPhysics 403. Segev BenZvi. Parameter Estimation, Correlations, and Error Bars. Department of Physics and Astronomy University of Rochester
Physics 403 Parameter Estimation, Correlations, and Error Bars Segev BenZvi Department of Physics and Astronomy University of Rochester Table of Contents 1 Review of Last Class Best Estimates and Reliability
More informationIEOR E4570: Machine Learning for OR&FE Spring 2015 c 2015 by Martin Haugh. The EM Algorithm
IEOR E4570: Machine Learning for OR&FE Spring 205 c 205 by Martin Haugh The EM Algorithm The EM algorithm is used for obtaining maximum likelihood estimates of parameters when some of the data is missing.
More informationApproximate Bayesian computation: methods and applications for complex systems
Approximate Bayesian computation: methods and applications for complex systems Mark A. Beaumont, School of Biological Sciences, The University of Bristol, Bristol, UK 11 November 2015 Structure of Talk
More informationCOMS 4721: Machine Learning for Data Science Lecture 20, 4/11/2017
COMS 4721: Machine Learning for Data Science Lecture 20, 4/11/2017 Prof. John Paisley Department of Electrical Engineering & Data Science Institute Columbia University SEQUENTIAL DATA So far, when thinking
More informationDSGE-Models. Limited Information Estimation General Method of Moments and Indirect Inference
DSGE-Models General Method of Moments and Indirect Inference Dr. Andrea Beccarini Willi Mutschler, M.Sc. Institute of Econometrics and Economic Statistics University of Münster willi.mutschler@uni-muenster.de
More informationComparing Non-informative Priors for Estimation and Prediction in Spatial Models
Environmentrics 00, 1 12 DOI: 10.1002/env.XXXX Comparing Non-informative Priors for Estimation and Prediction in Spatial Models Regina Wu a and Cari G. Kaufman a Summary: Fitting a Bayesian model to spatial
More informationGAUSSIAN PROCESS REGRESSION
GAUSSIAN PROCESS REGRESSION CSE 515T Spring 2015 1. BACKGROUND The kernel trick again... The Kernel Trick Consider again the linear regression model: y(x) = φ(x) w + ε, with prior p(w) = N (w; 0, Σ). The
More informationUncertainty due to Finite Resolution Measurements
Uncertainty due to Finite Resolution Measurements S.D. Phillips, B. Tolman, T.W. Estler National Institute of Standards and Technology Gaithersburg, MD 899 Steven.Phillips@NIST.gov Abstract We investigate
More information