Analog Circuit Verification by Statistical Model Checking
1 16th Asia and South Pacific Design Automation Conference (ASP-DAC 2011). Analog Circuit Verification by Statistical Model Checking. Ying-Chih Wang (presenter), Anvesh Komuravelli, Paolo Zuliani, Edmund M. Clarke, Carnegie Mellon University. Date: 2011/1/26. This work is sponsored by the GigaScale Systems Research Center, Semiconductor Research Corporation, National Science Foundation, General Motors, Air Force and the Office of Naval Research.
2 Overview: Introduction; Background (Assumptions, Bayesian Statistics, Algorithm); Application to Analog Circuits (Bounded Linear Temporal Logic, Experimental Results); Discussion; Conclusion and Future Work.
3 Introduction: Motivation. Analog circuit behavior varies dramatically with changes in parameters (MOSFET parameters, temperature, etc.), affecting both performance and correctness. We care about circuit resiliency under uncertainties such as process variation and noise. How do we deal with these uncertainties?
4 Introduction: Problem. Does the design satisfy the property with probability p ≥ threshold θ? Monte Carlo can estimate the probability p, but how good is the estimate? Properties are written in Bounded Linear Temporal Logic (BLTL). Statistical Model Checking (SMC), recently applied to Stateflow/Simulink models [1], provides probabilistic guarantees via Bayesian hypothesis testing and Bayesian estimation. [1] P. Zuliani, A. Platzer and E. M. Clarke, Bayesian Statistical Model Checking with Application to Stateflow/Simulink Verification, in HSCC 2010.
5 Background: Assumptions. Monte Carlo simulation: draw independent and identically distributed (i.i.d.) sample traces and characterize each simulation trace as Pass/Fail with respect to property Φ. The stochastic system M satisfies Φ with probability p, like a biased coin: a Bernoulli random variable Y with P(Y=1) = p and P(Y=0) = 1−p.
6 Background: Bayesian Statistics. Experiments are performed sequentially: starting from a prior density of p (a Beta distribution), the experimental results {Pass, Fail, Pass, Pass, ...} update it to a posterior density of p. The posterior supports both Bayesian estimation and Bayesian hypothesis testing via the Bayes factor.
7 Overview: Analog Circuit Verification. The analog circuit description and the process variation description feed the SPICE circuit simulator, which produces execution traces σ. The BLTL model checker evaluates a BLTL formula φ on each trace, and the Pass/Fail verdicts drive the Bayes test: if the Bayes factor > T, conclude M ⊨ P≥θ(φ); if the Bayes factor < 1/T, conclude M ⊨ P<θ(φ).
8 Background: Bayesian Hypothesis Testing. Null hypothesis H0: p ≥ θ; alternative hypothesis H1: p < θ; the prior probabilities P(H0) and P(H1) sum to 1. X denotes the experimental results, and the Bayes factor is the ratio P(X | H0) / P(X | H1). Jeffreys' test [1960s] uses a fixed sample size: a Bayes factor > 100 strongly supports H0. Here a sequential version is used, with a fixed Bayes factor threshold.
9 Statistical Model Checking Algorithm.
Require: property P≥θ(Φ), threshold T ≥ 1, prior density g
n := 0 {number of traces drawn so far}
x := 0 {number of traces satisfying Φ so far}
repeat
  σ := draw a sample trace of the system (i.i.d.)
  n := n + 1
  if σ ⊨ Φ then x := x + 1 endif
  B := BayesFactor(n, x, θ, g)
until (B > T or B < 1/T)
if (B > T) then return "H0 accepted" else return "H0 rejected" endif
Theorem (Error bounds). When the Bayesian algorithm using threshold T stops, the following holds: Prob(accept H0 | H1) ≤ 1/T and Prob(reject H0 | H0) ≤ 1/T.
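As an illustration (not the authors' implementation), the loop above can be sketched in Python. The "system" here is a hypothetical biased coin standing in for a SPICE simulation, the prior is uniform (Beta(1,1)), and BayesFactor uses the conjugate closed form stated on the "Computing the Bayes Factor" slide, with the Beta distribution function approximated by midpoint-rule integration:

```python
import math
import random

def beta_cdf(theta, a, b, steps=5000):
    """Beta distribution function F_(a,b)(theta), midpoint-rule integration."""
    log_norm = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    h = theta / steps
    total = 0.0
    for i in range(steps):
        u = (i + 0.5) * h
        total += math.exp((a - 1) * math.log(u) + (b - 1) * math.log(1 - u) - log_norm)
    return total * h

def bayes_factor(n, x, theta, alpha=1.0, beta=1.0):
    """Bayes factor of H0: p >= theta vs H1: p < theta, for x successes in
    n Bernoulli samples and a Beta(alpha, beta) prior."""
    pi1 = beta_cdf(theta, alpha, beta)       # prior probability of H1
    pi0 = 1.0 - pi1                          # prior probability of H0
    post = beta_cdf(theta, x + alpha, n - x + beta)
    return (pi1 / pi0) * (1.0 / post - 1.0)

def smc(draw_trace, theta, T):
    """Draw i.i.d. traces until the Bayes factor crosses T or 1/T."""
    n = x = 0
    while True:
        x += draw_trace()        # 1 if the trace satisfies the property, else 0
        n += 1
        B = bayes_factor(n, x, theta)
        if B > T:
            return "H0 accepted", n
        if B < 1.0 / T:
            return "H0 rejected", n

random.seed(1)
# Hypothetical system satisfying the property with probability 0.99, theta = 0.9
verdict, samples = smc(lambda: 1 if random.random() < 0.99 else 0, theta=0.9, T=1000)
print(verdict, samples)
```

Since p is well above θ here, the sequential test typically accepts H0 after a few dozen samples, which is exactly the saving over fixed-sample Monte Carlo.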
10 Bounded Linear Temporal Logic. States: state variables. Atomic propositions (AP): e1 ~ e2, where e1, e2 are arithmetic expressions over state variables and ~ ∈ {<, ≤, >, ≥, =}. Syntax: φ ::= AP | φ1 ∨ φ2 | ¬φ | F^t φ | G^t φ. Let σ = (s0,t0), (s1,t1), ... be a simulation trace of the model, which stays in state s_i for duration t_i. σ^k denotes the suffix of σ starting at state k: σ^k = (s_k,t_k), (s_{k+1},t_{k+1}), ...
11 Bounded Linear Temporal Logic. The semantics of BLTL for a suffix trace σ^k (the trace starting at state k):
σ^k ⊨ AP iff atomic proposition AP is true in state s_k;
σ^k ⊨ φ1 ∨ φ2 iff σ^k ⊨ φ1 or σ^k ⊨ φ2;
σ^k ⊨ ¬φ iff σ^k ⊨ φ does not hold;
σ^k ⊨ F^t φ iff there exists i ≥ 0 such that Σ_{l=0}^{i−1} t_{k+l} ≤ t and σ^{k+i} ⊨ φ.
Example: F^[10ns](Vout > 1): within 10 ns, Vout should eventually be larger than 1 volt.
12 Bounded Linear Temporal Logic.
σ^k ⊨ G^t φ iff for all i ≥ 0 such that Σ_{l=0}^{i−1} t_{k+l} ≤ t, σ^{k+i} ⊨ φ.
Example: G^[10ns](Vout > 1): in the next 10 ns, Vout should always be greater than 1 volt.
Nesting operators: F^[100ns] G^[10ns](Vout > 1): within 100 ns, Vout will become greater than 1 volt and stay greater than 1 volt for at least 10 ns.
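A minimal sketch (not the authors' checker) of the bounded F and G semantics above, evaluated on a made-up timed trace of (state, duration) pairs:

```python
def suffix_ok(trace, k, pred, bound, mode):
    """Check sigma^k |= F^bound pred (mode 'F') or sigma^k |= G^bound pred
    (mode 'G') on a trace of (state, duration) pairs.  State k+i is within
    the time bound iff t_k + ... + t_{k+i-1} <= bound."""
    elapsed = 0.0
    for state, duration in trace[k:]:
        if elapsed > bound:
            break
        if mode == 'F' and pred(state):
            return True
        if mode == 'G' and not pred(state):
            return False
        elapsed += duration
    return mode == 'G'   # F fails if never satisfied; G holds if never falsified

# Made-up transient trace: (Vout in volts, duration in ns)
trace = [(0.3, 2.0), (0.8, 3.0), (1.2, 4.0), (1.1, 6.0)]

f_holds = suffix_ok(trace, 0, lambda v: v > 1, 10.0, 'F')   # F^[10ns](Vout > 1)
g_holds = suffix_ok(trace, 0, lambda v: v > 1, 10.0, 'G')   # G^[10ns](Vout > 1)

# Nested F^[9ns] G^[5ns](Vout > 1): some suffix starting within 9 ns
# satisfies G^[5ns](Vout > 1)
nested = any(sum(d for _, d in trace[:k]) <= 9.0
             and suffix_ok(trace, k, lambda v: v > 1, 5.0, 'G')
             for k in range(len(trace)))

print(f_holds, g_holds, nested)
```

On this trace, Vout crosses 1 V at 5 ns, so the F property holds, the G property fails at the first state, and the nested property holds from the third state on.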
13 An Example: OP amp
14 An Example: OP amp Specifications.
1 Input Offset Voltage: < 1 mV
2 Output Swing Range: 0.2 V to 1 V
3 Slew Rate: > 25 V/μs
4 Open-Loop Voltage Gain: > 8000 V/V
5 Loop-Gain Unit-Gain Frequency: > 10 MHz
6 Phase Margin: > 60°
15 Example: OP amp Specifications. Specifications are quantities computed from simulation traces; in most cases they can be translated to BLTL directly from their definitions. E.g., Swing Range (Max(Vout) > 1 V and Min(Vout) < 0.2 V) becomes F^[100μs](Vout < 0.2) ∧ F^[100μs](Vout > 1): within the entire trace, Vout will eventually be smaller than 0.2 V and eventually greater than 1 V. (100 μs is the end time of the transient simulation.)
16 Example: OP amp Specifications. Frequency-domain properties? A transient response is a trace σ = (s0,t0), (s1,t1), (s2,t2), ...; a frequency response is a trace σ = (s0,f0), (s1,f1), (s2,f2), .... Substituting frequency-stamps for time-stamps lets us specify frequency-domain properties in BLTL and check them with the same BLTL checker. Only the semantics change: G^[1GHz] φ means φ holds for all frequencies from the current one up to 1 GHz higher, and F^[1GHz] φ means φ holds for some frequency from the current one up to 1 GHz higher.
17 An Example: OP amp Specifications.
Transient specifications and their BLTL formulas:
1 Input Offset Voltage < 1 mV: F^[100μs](Vout = 0.6) ∧ G^[100μs]((Vout = 0.6) → (|Vin+ − Vin−| < 0.001))
2 Output Swing Range 0.2 V to 1 V: F^[100μs](Vout < 0.2) ∧ F^[100μs](Vout > 1)
3 Slew Rate > 25 V/μs: G^[100μs]( ((Vout = 1 ∧ Vin > 0.65) → F^[0.008μs](Vout = 0.8)) ∧ ((Vout = 0.2 ∧ Vin < 0.55) → F^[0.008μs](Vout = 0.4)) )
Frequency-domain specifications and their BLTL formulas:
4 Open-Loop Voltage Gain > 8000 V/V: G^[1KHz](Vmag_out > 8000)
5 Loop-Gain Unit-Gain Frequency > 10 MHz: G^[10MHz](Vmag_out > 1)
6 Phase Margin > 60°: F^[10GHz](Vmag_out = 1) ∧ G^[10GHz]((Vmag_out = 1) → (Vphase_out > 60°))
18 Experimental Setup. Platform: Linux virtual machine running on a 2.26 GHz Intel Core i3-350M with 4 GB RAM. We model-check the BLTL formulae on the previous slides, e.g. (Swing Range) H0: M ⊨ P≥θ[F^[100μs](Vout < 0.2) ∧ F^[100μs](Vout > 1)].
19 Experimental Results. Target yield: 0.95. Monte Carlo analysis (1000 samples) classifies each specification as satisfying or not satisfying the yield threshold. [Table of mean, standard deviation, and yield for each measured value: input offset voltage (V), swing range min/max (V), negative/positive slew rate (V/μs), open-loop voltage gain (V/V), loop-gain UGF (MHz), phase margin (°); numeric entries not preserved in the transcription.]
20 Experimental Results: Statistical Model Checking setup. Null hypothesis H0 comes from the specs, e.g. H0: M ⊨ P≥θ[F^[100μs](Vout < 0.2) ∧ F^[100μs](Vout > 1)]. Probability threshold θ: set to the target yield. Bayes factor threshold T (test strength): T = 1000, so the probability of error is less than 0.001. Prior distribution of p: uniform, reflecting no prior knowledge.
21 Experimental Results. Monte Carlo (1000 samples): do not satisfy / satisfy yield threshold 0.95. SMC testing result (prior: uniform, T = 1000, θ = 0.95): reject / accept the null hypothesis. [Table comparing Monte Carlo mean, stddev, and yield with SMC samples/runtime per specification; sample counts not preserved in the transcription. Recorded SMC runtimes: input offset voltage 39 s; swing range, slew rate, loop-gain UGF and phase margin 98 s each; open-loop voltage gain 303 s.]
22 Experimental Results. Prior: uniform, T = 1000, θ in [0.7, 0.999]. SMC testing result: reject / accept the null hypothesis.
Null hypotheses:
Spec 1: M ⊨ P≥θ[F^[100μs](Vout = 0.6) ∧ G^[100μs]((Vout = 0.6) → (|Vin+ − Vin−| < 0.001))]
Spec 2: M ⊨ P≥θ[F^[100μs](Vout < 0.2) ∧ F^[100μs](Vout > 1)]
Spec 3: M ⊨ P≥θ[G^[100μs]( ((Vout = 1 ∧ Vin > 0.65) → F^[0.008μs](Vout < 0.8)) ∧ ((Vout < 0.2 ∧ Vin < 0.55) → F^[0.008μs](Vout < 0.4)) )]
Samples/runtime across the θ range (the θ value for each column is not preserved in the transcription):
Spec 1: /105s, 9933/12161s, 201/275s, 10/13s, 7/9s
Spec 2: 16/18s, 24/27s, 44/51s, 239/280s, 693/813s
Spec 3: 16/23s, 24/31s, 44/57s, 239/316s, 693/916s
23 Experimental Results. Prior: uniform, T = 1000, θ in [0.7, 0.999]. SMC testing result: reject / accept the null hypothesis.
Null hypotheses:
Spec 4: M ⊨ P≥θ[G^[1KHz](Vmag_out > 8000)]
Spec 5: M ⊨ P≥θ[G^[10MHz](Vmag_out > 1)]
Spec 6: M ⊨ P≥θ[F^[10GHz](Vmag_out = 1) ∧ G^[10GHz]((Vmag_out = 1) → (Vphase_out > 60°))]
Samples/runtime across the θ range (the θ value for each column is not preserved in the transcription):
Spec 4: /26s, 43/49s, 98/114s, 1103/1309s, 50/57s
Spec 5: 16/18s, 24/28s, 44/51s, 239/279s, 693/807s
Spec 6: 16/20s, 24/30s, 44/55s, 239/303s, 693/882s
24 Discussion. Statistical Model Checking is faster when the threshold probability θ is far from the unknown probability p; the closer p is to θ, the more samples are required. SMC is easily integrated into the validation flow, since it relies only on Monte Carlo sampling and SPICE; the add-ons are online BLTL monitoring and Bayes factor calculation. The computational overhead is low: runtime is dominated by SPICE simulation.
25 Conclusion. Introduced SMC to analog circuit verification; it avoids drawing unnecessary Monte Carlo samples once the evidence is sufficient. Demonstrated the feasibility of using BLTL specifications for a simple analog circuit; BLTL can specify complex interactions between signals. Future work: more experiments, with larger examples and more complicated specifications, and introducing the technique of importance sampling.
26 Thank you.
27 Bayesian Statistical Model Checking. Suppose M satisfies Φ with (unknown) probability p; p is given by a random variable (defined on [0,1]) with density g, which represents the prior belief that M satisfies Φ. Generate independent and identically distributed (i.i.d.) sample traces σ1, ..., σn, and let X = {x1, ..., xn}, where x_i = 1 iff the i-th sample trace σ_i satisfies Φ and x_i = 0 otherwise. Then each x_i is a Bernoulli trial with conditional density (likelihood function) f(x_i | u) = u^{x_i} (1 − u)^{1 − x_i}.
28 Statistical Model Checking. Calculate the Bayes factor using posterior and prior probabilities. Posterior density (Bayes' theorem), for conditionally i.i.d. Bernoulli samples: f(u | x1, ..., xn) = f(x1|u) ··· f(xn|u) g(u) / ∫_0^1 f(x1|v) ··· f(xn|v) g(v) dv, where each f(x_i|u) is the likelihood function.
29 Statistical Model Checking. H0: p ≥ θ, H1: p < θ. Prior probabilities (g: prior density of p): π0 = P(H0) = ∫_θ^1 g(u) du and π1 = P(H1) = ∫_0^θ g(u) du. The Bayes factor of the sample X = {x1, ..., xn} and hypotheses H0, H1 is
B = [P(H0 | X) / P(H1 | X)] · [π1 / π0] = (π1 / π0) · [∫_θ^1 f(x1|u) ··· f(xn|u) g(u) du] / [∫_0^θ f(x1|u) ··· f(xn|u) g(u) du].
30 Beta Prior. The prior g is a Beta density with parameters α > 0, β > 0: g(u) = u^{α−1} (1 − u)^{β−1} / B(α,β). F_{(α,β)}(u) denotes the Beta distribution function (i.e., Prob(X ≤ u)).
31 Why Beta priors? They are defined over [0,1], and Beta distributions are conjugate to Binomial distributions: if the prior g is Beta and the likelihood function is Binomial, then the posterior is Beta. Suppose likelihood Binomial(n, x) and prior Beta(α, β). With x = Σ_i x_i, the posterior satisfies f(u | x1, ..., xn) ∝ f(x1|u) ··· f(xn|u) g(u) ∝ u^x (1 − u)^{n−x} · u^{α−1} (1 − u)^{β−1} = u^{x+α−1} (1 − u)^{n−x+β−1}, so the posterior is Beta with parameters x+α and n−x+β.
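A quick numeric illustration of the conjugate update (with made-up counts): since the posterior after x successes in n trials is Beta(x + α, n − x + β), updating is just addition.

```python
def beta_update(alpha, beta, n, x):
    """Posterior Beta parameters after observing x successes in n trials."""
    return x + alpha, n - x + beta

# Uniform prior Beta(1, 1); 900 successes out of 1000 made-up trials
a, b = beta_update(1.0, 1.0, 1000, 900)
post_mean = a / (a + b)   # posterior mean (x + alpha) / (n + alpha + beta)
print(a, b, round(post_mean, 3))
```

The posterior mean (x + α)/(n + α + β) is exactly the quantity the interval-estimation algorithm later centers its interval on.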
32 Beta Density Shapes. [Plots of Beta densities for several parameter settings, including α = 0.5, α = 2, and α = 4; the β values are not preserved in the transcription.]
33 Computing the Bayes Factor. Proposition. The Bayes factor of H0: M ⊨ P≥θ(Φ) vs H1: M ⊨ P<θ(Φ) for n Bernoulli samples (with x successes) and prior Beta(α, β) is
B = (π1 / π0) · (1 / F_{(x+α, n−x+β)}(θ) − 1),
where F_{(α,β)}(·) is the Beta distribution function.
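The proposition can be sanity-checked numerically. The sketch below (illustrative, not the authors' code, with made-up sample counts) compares the closed form against the defining ratio of prior-weighted likelihood integrals from the previous slides, approximating both with a midpoint rule:

```python
import math

def beta_cdf(theta, a, b, steps=20000):
    """Beta distribution function F_(a,b)(theta), midpoint-rule integration."""
    log_norm = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    h = theta / steps
    total = 0.0
    for i in range(steps):
        u = (i + 0.5) * h
        total += math.exp((a - 1) * math.log(u) + (b - 1) * math.log(1 - u) - log_norm)
    return total * h

def bf_closed(n, x, theta, alpha, beta):
    """Closed form: (pi1/pi0) * (1 / F_(x+alpha, n-x+beta)(theta) - 1)."""
    pi1 = beta_cdf(theta, alpha, beta)
    pi0 = 1.0 - pi1
    return (pi1 / pi0) * (1.0 / beta_cdf(theta, x + alpha, n - x + beta) - 1.0)

def bf_integral(n, x, theta, alpha, beta, steps=20000):
    """Defining ratio: (pi1/pi0) * int_theta^1 L(u)g(u)du / int_0^theta L(u)g(u)du.
    The unnormalized integrand u^(x+alpha-1) (1-u)^(n-x+beta-1) suffices,
    since the normalizing constants cancel in the ratio."""
    def mass(lo, hi):
        h = (hi - lo) / steps
        total = 0.0
        for i in range(steps):
            u = lo + (i + 0.5) * h
            total += math.exp((x + alpha - 1) * math.log(u)
                              + (n - x + beta - 1) * math.log(1 - u))
        return total * h
    pi1 = beta_cdf(theta, alpha, beta)
    pi0 = 1.0 - pi1
    return (pi1 / pi0) * mass(theta, 1.0) / mass(0.0, theta)

b1 = bf_closed(50, 48, 0.9, 1.0, 1.0)
b2 = bf_integral(50, 48, 0.9, 1.0, 1.0)
print(b1, b2)
```

The two computations agree up to integration error, confirming that the closed form follows from splitting the posterior mass at θ.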
34 Bayesian Interval Estimation - IV. [Plot of prior and posterior densities; interval width 2δ.] The prior is Beta(α = 4, β = 5); the posterior density after 1000 samples and 900 successes is Beta(α = 904, β = 105); the posterior mean is 904/1009 ≈ 0.896.
35 Related Work: Monte Carlo Simulation. Let X be a random variable with unknown distribution and density f(x), and let p = P(X > t). Monte Carlo is a statistical method for estimating the probability of an event (such as X > t) under an unknown distribution. The goal is to find p = E{I(X > t)}, where the indicator I(x > t) = 1 if x > t and 0 otherwise. The Monte Carlo method generates N independent samples of X (X_1, ..., X_N) to form the estimate p̂_MC = (1/N) Σ_{i=1}^N I(X_i > t), which converges to p.
36 Related Work: Importance Sampling. Importance sampling draws samples from a biased distribution under which the rare event happens more often. Let f(x) be the density of X and f*(x) the density of the biased variable X*. Then
p = E{I(X > t)} = ∫ I(x > t) f(x) dx = ∫ I(x > t) [f(x) / f*(x)] f*(x) dx = E{I(X* > t) W(X*)},
where the likelihood-ratio weight is W(x) = f(x) / f*(x). Sampling from X*, the IS estimator is p̂_IS = (1/N) Σ_{i=1}^N W(X*_i) I(X*_i > t).
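A minimal sketch of both estimators, assuming for illustration an Exp(1) random variable, whose tail probability P(X > t) = e^(−t) is known exactly, and a biased sampling density f* = Exp(1/t) under which the tail event is common:

```python
import math
import random

def mc_estimate(t, N, rng):
    """Plain Monte Carlo: p_hat = mean of I(X > t) with X ~ Exp(1)."""
    return sum(1.0 for _ in range(N) if rng.expovariate(1.0) > t) / N

def is_estimate(t, N, rng, lam_star):
    """Importance sampling: draw X* ~ Exp(lam_star), weight by W = f / f*."""
    total = 0.0
    for _ in range(N):
        x = rng.expovariate(lam_star)
        if x > t:
            # W(x) = f(x) / f*(x) = exp(-x) / (lam_star * exp(-lam_star * x))
            total += math.exp(-x) / (lam_star * math.exp(-lam_star * x))
    return total / N

rng = random.Random(0)
t = 10.0
exact = math.exp(-t)     # P(X > 10) = e^-10, about 4.5e-5: a rare event
mc_val = mc_estimate(t, 10000, rng)          # usually sees no hits at all
is_val = is_estimate(t, 10000, rng, 1.0 / t)
print(exact, mc_val, is_val)
```

With 10,000 samples, plain Monte Carlo rarely observes the event at all, while the importance-sampling estimate lands close to e^(−10), which is why the conclusion slide names importance sampling as future work for rare-event yields.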
37 Bayesian Interval Estimation - V.
Require: BLTL property Φ, interval width δ, coverage c, prior Beta parameters α, β
n := 0 {number of traces drawn so far}
x := 0 {number of traces satisfying Φ so far}
repeat
  σ := draw a sample trace of the system (i.i.d.)
  n := n + 1
  if σ ⊨ Φ then x := x + 1 endif
  mean := (x + α) / (n + α + β)
  (t0, t1) := (mean − δ, mean + δ)
  I := PosteriorProbability(t0, t1, n, x, α, β)
until (I > c)
return (t0, t1), mean
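As with the hypothesis-testing loop, this algorithm can be sketched in Python (illustrative, not the authors' code). The posterior probability of (t0, t1) is a difference of two Beta distribution function values, approximated here by midpoint-rule integration, and the system is again a stand-in biased coin:

```python
import math
import random

def beta_cdf(u, a, b, steps=2000):
    """Beta distribution function F_(a,b)(u), midpoint-rule integration."""
    if u <= 0.0:
        return 0.0
    log_norm = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    h = u / steps
    total = 0.0
    for i in range(steps):
        v = (i + 0.5) * h
        total += math.exp((a - 1) * math.log(v) + (b - 1) * math.log(1 - v) - log_norm)
    return total * h

def estimate(draw_trace, delta, c, alpha=1.0, beta=1.0):
    """Sample until the posterior mass of (mean - delta, mean + delta)
    exceeds the coverage c; return the interval, the mean, and samples used."""
    n = x = 0
    while True:
        x += draw_trace()
        n += 1
        mean = (x + alpha) / (n + alpha + beta)
        t0, t1 = mean - delta, mean + delta
        a, b = x + alpha, n - x + beta
        post = beta_cdf(min(t1, 1.0), a, b) - beta_cdf(max(t0, 0.0), a, b)
        if post > c:
            return (t0, t1), mean, n

random.seed(2)
# Stand-in system satisfying the property with probability 0.9
(t0, t1), mean, n = estimate(lambda: 1 if random.random() < 0.9 else 0,
                             delta=0.05, c=0.95)
print(round(t0, 3), round(t1, 3), round(mean, 3), n)
```

The interval always has width 2δ; sampling continues until the posterior places at least coverage c on it, so a smaller δ or a larger c costs more samples.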
38 Bayesian Interval Estimation - VI. Recall that the algorithm outputs the interval (t0, t1). Define the null hypothesis H0: t0 < p < t1. Theorem (Error bound). When the Bayesian estimation algorithm (using coverage 1/2 < c < 1) stops, we have Prob(accept H0 | H1) ≤ (1/c − 1) π0 / (1 − π0) and Prob(reject H0 | H0) ≤ (1/c − 1) π0 / (1 − π0), where π0 is the prior probability of H0.
More informationArtificial Intelligence
Artificial Intelligence Probabilities Marc Toussaint University of Stuttgart Winter 2018/19 Motivation: AI systems need to reason about what they know, or not know. Uncertainty may have so many sources:
More informationBayesian Statistical Methods. Jeff Gill. Department of Political Science, University of Florida
Bayesian Statistical Methods Jeff Gill Department of Political Science, University of Florida 234 Anderson Hall, PO Box 117325, Gainesville, FL 32611-7325 Voice: 352-392-0262x272, Fax: 352-392-8127, Email:
More informationBayesian inference. Fredrik Ronquist and Peter Beerli. October 3, 2007
Bayesian inference Fredrik Ronquist and Peter Beerli October 3, 2007 1 Introduction The last few decades has seen a growing interest in Bayesian inference, an alternative approach to statistical inference.
More informationBayesian Methods with Monte Carlo Markov Chains II
Bayesian Methods with Monte Carlo Markov Chains II Henry Horng-Shing Lu Institute of Statistics National Chiao Tung University hslu@stat.nctu.edu.tw http://tigpbp.iis.sinica.edu.tw/courses.htm 1 Part 3
More informationComputational Cognitive Science
Computational Cognitive Science Lecture 9: A Bayesian model of concept learning Chris Lucas School of Informatics University of Edinburgh October 16, 218 Reading Rules and Similarity in Concept Learning
More informationProbability and Information Theory. Sargur N. Srihari
Probability and Information Theory Sargur N. srihari@cedar.buffalo.edu 1 Topics in Probability and Information Theory Overview 1. Why Probability? 2. Random Variables 3. Probability Distributions 4. Marginal
More informationCheng Soon Ong & Christian Walder. Canberra February June 2018
Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 (Many figures from C. M. Bishop, "Pattern Recognition and ") 1of 143 Part IV
More informationAarti Singh. Lecture 2, January 13, Reading: Bishop: Chap 1,2. Slides courtesy: Eric Xing, Andrew Moore, Tom Mitchell
Machine Learning 0-70/5 70/5-78, 78, Spring 00 Probability 0 Aarti Singh Lecture, January 3, 00 f(x) µ x Reading: Bishop: Chap, Slides courtesy: Eric Xing, Andrew Moore, Tom Mitchell Announcements Homework
More informationParametric Models. Dr. Shuang LIANG. School of Software Engineering TongJi University Fall, 2012
Parametric Models Dr. Shuang LIANG School of Software Engineering TongJi University Fall, 2012 Today s Topics Maximum Likelihood Estimation Bayesian Density Estimation Today s Topics Maximum Likelihood
More informationBayesian Inference for Dirichlet-Multinomials
Bayesian Inference for Dirichlet-Multinomials Mark Johnson Macquarie University Sydney, Australia MLSS Summer School 1 / 50 Random variables and distributed according to notation A probability distribution
More informationBayesian Statistics Part III: Building Bayes Theorem Part IV: Prior Specification
Bayesian Statistics Part III: Building Bayes Theorem Part IV: Prior Specification Michael Anderson, PhD Hélène Carabin, DVM, PhD Department of Biostatistics and Epidemiology The University of Oklahoma
More informationDavid Giles Bayesian Econometrics
David Giles Bayesian Econometrics 1. General Background 2. Constructing Prior Distributions 3. Properties of Bayes Estimators and Tests 4. Bayesian Analysis of the Multiple Regression Model 5. Bayesian
More information(4) One-parameter models - Beta/binomial. ST440/550: Applied Bayesian Statistics
Estimating a proportion using the beta/binomial model A fundamental task in statistics is to estimate a proportion using a series of trials: What is the success probability of a new cancer treatment? What
More informationRejection sampling - Acceptance probability. Review: How to sample from a multivariate normal in R. Review: Rejection sampling. Weighted resampling
Rejection sampling - Acceptance probability Review: How to sample from a multivariate normal in R Goal: Simulate from N d (µ,σ)? Note: For c to be small, g() must be similar to f(). The art of rejection
More information2016 SISG Module 17: Bayesian Statistics for Genetics Lecture 3: Binomial Sampling
2016 SISG Module 17: Bayesian Statistics for Genetics Lecture 3: Binomial Sampling Jon Wakefield Departments of Statistics and Biostatistics University of Washington Outline Introduction and Motivating
More informationApproximate Bayesian Computation
Approximate Bayesian Computation Michael Gutmann https://sites.google.com/site/michaelgutmann University of Helsinki and Aalto University 1st December 2015 Content Two parts: 1. The basics of approximate
More informationPart III. A Decision-Theoretic Approach and Bayesian testing
Part III A Decision-Theoretic Approach and Bayesian testing 1 Chapter 10 Bayesian Inference as a Decision Problem The decision-theoretic framework starts with the following situation. We would like to
More informationLearning Bayesian Networks
Learning Bayesian Networks Probabilistic Models, Spring 2011 Petri Myllymäki, University of Helsinki V-1 Aspects in learning Learning the parameters of a Bayesian network Marginalizing over all all parameters
More informationPreliminary statistics
1 Preliminary statistics The solution of a geophysical inverse problem can be obtained by a combination of information from observed data, the theoretical relation between data and earth parameters (models),
More informationSequential Monitoring of Clinical Trials Session 4 - Bayesian Evaluation of Group Sequential Designs
Sequential Monitoring of Clinical Trials Session 4 - Bayesian Evaluation of Group Sequential Designs Presented August 8-10, 2012 Daniel L. Gillen Department of Statistics University of California, Irvine
More informationPart 1: Expectation Propagation
Chalmers Machine Learning Summer School Approximate message passing and biomedicine Part 1: Expectation Propagation Tom Heskes Machine Learning Group, Institute for Computing and Information Sciences Radboud
More informationAccouncements. You should turn in a PDF and a python file(s) Figure for problem 9 should be in the PDF
Accouncements You should turn in a PDF and a python file(s) Figure for problem 9 should be in the PDF Please do not zip these files and submit (unless there are >5 files) 1 Bayesian Methods Machine Learning
More informationIntroduction to Machine Learning CMU-10701
Introduction to Machine Learning CMU-10701 2. MLE, MAP, Bayes classification Barnabás Póczos & Aarti Singh 2014 Spring Administration http://www.cs.cmu.edu/~aarti/class/10701_spring14/index.html Blackboard
More informationStatistical Inference
Statistical Inference Robert L. Wolpert Institute of Statistics and Decision Sciences Duke University, Durham, NC, USA Spring, 2006 1. DeGroot 1973 In (DeGroot 1973), Morrie DeGroot considers testing the
More informationBayesian Inference: Concept and Practice
Inference: Concept and Practice fundamentals Johan A. Elkink School of Politics & International Relations University College Dublin 5 June 2017 1 2 3 Bayes theorem In order to estimate the parameters of
More information