Uncertainty Quantification in Performance Evaluation of Manufacturing Processes
1 Uncertainty Quantification in Performance Evaluation of Manufacturing Processes
Manufacturing Systems, October 27, 2014
Saideep Nannapaneni, Sankaran Mahadevan
Vanderbilt University, Nashville, TN
Acknowledgement
Funding support: NIST (Cooperative Agreement NANB14H036; Monitor: Sudarsan Rachuri)
Technical help: Seung-Jun Shin, Senthilkumaran Kumaraguru, Anantha Narayanan, Sudarsan Rachuri
2 Why Uncertainty Quantification?
Uncertainty sources: process variability, material uncertainty, model uncertainty, sensor uncertainty.
Data sources: operational data, reliability data, legacy system data, test data, simulation data, mathematical models, expert opinion.
These feed into the evaluation uncertainty through information fusion.
- Performance evaluation is affected by the available data and by uncertainty.
- Uncertainty sources combine linearly or non-linearly and occur at different stages.
- Information sources are heterogeneous.
A systematic approach is necessary for information fusion and uncertainty quantification.
3 Aleatory vs. Epistemic Uncertainty
Four types of quantities:
1. Deterministic, known value (fixed constant).
2. Stochastic, with random variability; the distribution statistics are known (aleatory, irreducible uncertainty).
3. Deterministic, but the true value is unknown (epistemic, reducible uncertainty).
4. Stochastic, but the distribution characteristics are unknown (unknown distribution type and/or unknown distribution parameters); has both aleatory and epistemic uncertainty.
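The four types can be illustrated with a short sampling sketch. All numerical values here are made up for illustration; the point is that type 4 needs a hierarchical (two-layer) sample, with the epistemic layer drawn before the aleatory layer:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000

# Type 1: deterministic, known (fixed constant)
x1 = 2.5

# Type 2: stochastic with known distribution (aleatory only)
x2 = rng.normal(10.0, 1.0, n)

# Type 3: deterministic but unknown; epistemic uncertainty represented by a prior
x3 = rng.uniform(4.0, 6.0, n)

# Type 4: stochastic with unknown distribution parameters (aleatory + epistemic):
# first sample the unknown mean from its prior, then sample the quantity itself
mu = rng.uniform(9.0, 11.0, n)  # epistemic layer
x4 = rng.normal(mu, 1.0)        # aleatory layer

print(x2.std(), x4.std())  # the epistemic layer inflates the spread of x4
```

Calibration (slide 6) shrinks the epistemic layer of type-4 quantities; the aleatory layer is irreducible.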
4 Bayesian Network
Nodes a, b, ..., g are component variables; g is the system-level output. U = {a, b, ..., g} is the set of all nodes.
Joint PDF of all variables:
π(u) = π(a) π(b|a) π(c|a) π(d|c) π(e|b,d) π(f) π(g|e,f)
PDF of the output node g (marginalizing out the other nodes):
π(g) = ∫ π(u) da db ... df
With new observed data m on node b:
π(u, m) = π(u) π(m|b)
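A minimal sketch of ancestral sampling from this network; the linear-Gaussian conditionals (and every coefficient) are illustrative assumptions, not the talk's model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Ancestral sampling of the network a, b, ..., g in topological order.
# Linear-Gaussian conditionals are assumed purely for illustration.
a = rng.normal(0.0, 1.0, n)
b = rng.normal(0.5 * a, 1.0)             # pi(b | a)
c = rng.normal(-0.3 * a, 1.0)            # pi(c | a)
d = rng.normal(0.8 * c, 1.0)             # pi(d | c)
e = rng.normal(0.4 * b + 0.6 * d, 1.0)   # pi(e | b, d)
f = rng.normal(0.0, 1.0, n)              # pi(f)
g = rng.normal(0.7 * e + 0.2 * f, 1.0)   # pi(g | e, f): system-level output

# Monte Carlo estimate of the marginal pi(g)
print(g.mean(), g.std())
```

The samples of g realize the marginalization integral above by Monte Carlo, which is exactly the forward uncertainty-propagation step of the methodology.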
5 Methodology for UQ (1/2)
Construction of the Bayesian network:
- Physics-based models: use available models based on process physics (domain knowledge), e.g., differential equations.
- Data-driven approach: build surrogate models using data (e.g., regression, polynomial models, Gaussian processes), or use structure-learning algorithms (constraint-based and score-based).
- Hybrid approach: combine physics-based and data-driven approaches; fix part of the BN structure using domain knowledge and learn the rest of the network from data.
Overall workflow: construction of the Bayesian network → Bayesian model calibration/updating → uncertainty propagation.
(Bartram & Mahadevan, SCHM)
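The data-driven surrogate idea can be sketched as follows. The "expensive" model, the cubic polynomial degree, and all values are illustrative assumptions standing in for an actual process simulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical physics model standing in for an expensive process simulation
def expensive_model(x):
    return np.sin(x) + 0.1 * x**2

# Training data, as would be collected from a handful of simulation runs
x_train = np.linspace(0.0, 3.0, 20)
y_train = expensive_model(x_train) + rng.normal(0.0, 0.01, x_train.size)

# Cubic polynomial surrogate (the data-driven approach)
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=3))

# Check the surrogate against the original model on a test grid
x_test = np.linspace(0.0, 3.0, 50)
max_err = np.max(np.abs(surrogate(x_test) - expensive_model(x_test)))
print(max_err)
```

Once fitted, the cheap surrogate replaces the expensive model inside the calibration and propagation loops.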
6 Methodology for UQ (2/2)
Bayesian model calibration/updating: estimate the unknown (epistemic) parameters θ from data D in a probabilistic manner using Bayes' theorem:
Pr(θ|D) = Pr(D|θ) Pr(θ) / Pr(D)    (posterior = likelihood × prior / evidence)
Sampling from the posterior: Markov chain Monte Carlo (MCMC).
Uncertainty propagation: obtain the updated posterior distribution of the quantity of interest (QoI) by Monte Carlo sampling; samples drawn from the posterior are propagated through the network.
Workflow: construction of the Bayesian network → Bayesian model calibration/updating → uncertainty propagation → posterior distribution of the QoI (performance metric).
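A minimal random-walk Metropolis-Hastings sketch of the calibration step, assuming (for illustration only) a single unknown mean, a Gaussian likelihood with known noise, and a uniform prior:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: 50 noisy observations of a quantity whose mean is the
# unknown (epistemic) parameter theta; sigma is taken as known here.
true_theta, sigma = 5.0, 0.5
data = rng.normal(true_theta, sigma, 50)

def log_posterior(theta):
    # Uniform(0, 10) prior + Gaussian log-likelihood (Bayes' theorem, up to a constant)
    if not (0.0 < theta < 10.0):
        return -np.inf
    return -0.5 * np.sum((data - theta) ** 2) / sigma**2

# Random-walk Metropolis-Hastings sampling from the posterior
theta = 2.0
lp = log_posterior(theta)
chain = []
for _ in range(20_000):
    proposal = theta + rng.normal(0.0, 0.2)
    lp_prop = log_posterior(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = proposal, lp_prop
    chain.append(theta)

posterior = np.array(chain[5_000:])  # discard burn-in
print(posterior.mean(), posterior.std())
```

The retained chain is exactly what the propagation step consumes: posterior samples pushed through the network to get the QoI distribution.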
7 Scalability: Dimension Reduction
As the size of the production network increases, the number of epistemic parameters increases, and with it the computational effort.
Goal: obtain a reduced set of variables such that there is no significant change in the statistics of the QoI.
Approach: global sensitivity analysis, which assesses the variance contribution of each variable to the variance of the QoI, S_i = Var(E[Y|X_i]) / Var(Y).
- Obtain prior sensitivity indices by performing sensitivity analysis using the prior distributions.
- Assume a threshold value for the sensitivity index; if a variable's sensitivity index is below the threshold, fix that variable at its nominal value (treat it as deterministic).
8 Sensitivity Analysis under Epistemic Uncertainty
Global sensitivity analysis is applicable to Y = G(X), where G is deterministic: one value of X gives one value of Y.
By defining auxiliary variables, both aleatory and epistemic sources (data uncertainty, model uncertainty) can be included.
Introduce an auxiliary variable U ~ Uniform(0, 1) to represent variability: for X distributed as f(x|P) with distribution parameters P, the inverse-CDF transform X = F⁻¹(U; P) makes the map (U, P) → X deterministic.
(Sankararaman & Mahadevan, RESS)
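The auxiliary-variable construction and a variance-based (first-order Sobol) estimate can be sketched together. The two-input model G, the Uniform input distributions, and the pick-freeze estimator are illustrative assumptions, not the talk's model:

```python
import numpy as np

rng = np.random.default_rng(3)

def inv_cdf(u, a, b):
    # Inverse-CDF map X = F^-1(U; P) for a Uniform(a, b) input
    return a + (b - a) * u

def G(x):
    # Hypothetical deterministic model: the first input dominates the output
    return x[:, 0] + 0.05 * x[:, 1]

n = 200_000
params = [(0.0, 1.0), (0.0, 1.0)]  # illustrative distribution parameters P

# Pick-freeze Monte Carlo estimate of first-order Sobol indices
u_a = rng.uniform(size=(n, 2))
u_b = rng.uniform(size=(n, 2))
x_a = np.column_stack([inv_cdf(u_a[:, i], *params[i]) for i in range(2)])
y_a = G(x_a)

S = []
for i in range(2):
    u_mix = u_b.copy()
    u_mix[:, i] = u_a[:, i]  # keep X_i fixed, resample all other inputs
    x_mix = np.column_stack([inv_cdf(u_mix[:, j], *params[j]) for j in range(2)])
    y_mix = G(x_mix)
    S.append(np.mean(y_a * y_mix) - y_a.mean() * y_mix.mean())
S = np.array(S) / y_a.var()
print(S)

# Screening step from the previous slide: retain only influential variables
threshold = 0.01
retained = [i for i in range(2) if S[i] >= threshold]
```

Epistemic sources enter by simply adding their parameters P (or their own auxiliary variables) to the input vector before computing the indices.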
9 Handling Big Data
- HDFS (Hadoop Distributed File System): stores large amounts of data.
- MapReduce: a parallel processing framework for large-scale data processing.
- Apache Mahout: a machine learning library for Hadoop.
Model building using all of the data is computationally unaffordable, so data reduction and feature selection techniques are required: clustering, and selection of data using design of experiments (DoE). The reduced dataset can then be used for Bayesian network construction.
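The clustering-based data reduction step might look like the following minimal k-means sketch, with plain NumPy standing in for Apache Mahout; the synthetic three-cluster dataset is an assumption made for the example:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic dataset standing in for large-scale operational/sensor data
data = np.vstack([rng.normal(m, 0.3, size=(5_000, 2)) for m in (0.0, 3.0, 6.0)])

def kmeans(x, k, iters=20):
    """Minimal Lloyd's k-means; cluster centroids act as the reduced dataset."""
    centers = x[rng.choice(len(x), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid
        labels = np.argmin(((x[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        # Recompute centroids, keeping a centroid in place if its cluster empties
        centers = np.array([x[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers, labels

centers, labels = kmeans(data, k=3)
print(len(data), "points reduced to", len(centers), "cluster centers")
```

The cluster centers (optionally weighted by cluster size) then feed the Bayesian network construction in place of the full dataset.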
10 Summary of Methodology
- Without dimension/data reduction: construction of the Bayesian network → Bayesian model calibration/updating → uncertainty propagation.
- With data/dimension reduction: data reduction and dimension reduction → construction of the Bayesian network → Bayesian model calibration/updating → uncertainty propagation.
11 Demonstration Problem
A product module (e.g., a power train) is assembled by threaded fastening from:
- a metallic module: metal parts joined by arc welding (GTAW or GMAW);
- a plastic part: produced from plastic grains by an injection molding process;
- a metal part: produced from metal powder by die casting or turning (Machine 1 / Machine 2 / Machine 3).
12 Injection Molding
Goal: UQ in the energy consumption of an injection molding process.
Three stages:
1. Melting of the polymer
2. Injection of the polymer into the mold
3. Cooling of the polymer to form the product
13 Melting Process
Total volume injected (volume of shot): V_t = V_m (1 + s/100) + Δ
Energy consumed for melting (heating the shot to the injection temperature and melting it):
E_melt = ρ V_t [C_p (T_inj − T_poly) + H_f]
The power used for melting follows from this energy together with the flow rate (average flow rate).
Legend: ρ = specific gravity of the polymer; C_p = heat capacity of the polymer; T_inj = injection temperature; T_poly = polymer temperature; H_f = polymer heat of fusion; V_m = volume of the mold; s = shrinkage; Δ = buffer; V_t = total volume injected.
14 Injection Process
The injection time is obtained from the total volume injected and the flow rate. The energy consumed in the injection process is the injection power (injection pressure × flow rate) multiplied by the injection time, divided by the injection efficiency.
Legend: n = number of cavities.
15 Cooling Process
Cooling time:
t_cool = (h_max² / (π² α)) ln[ 4 (T_inj − T_mold) / (T_ej − T_mold) ]
Energy consumed in the cooling process: the heat removed from the part during cooling, divided by the COP of the cooling equipment (and the cooling efficiency).
Legend: h_max = maximum wall thickness of the mold; α = thermal diffusivity; T_inj = injection temperature; T_mold = mold temperature; T_ej = ejection temperature; COP = coefficient of performance of the cooling equipment.
16 Total Energy Consumption
Energy consumed in making a part: the melting, injection, cooling, and reset energies (each divided by the corresponding efficiency), plus the standby power drawn by the basic energy-consuming units over the total cycle time.
Total cycle time: injection time + cooling time + reset time.
Total electricity cost: energy per part × output per day × number of days × unit cost.
Legend: h_max = maximum wall thickness of the mold; E_reset = energy consumed for resetting; P_basic = power required for the basic energy-consuming units when the machine is in standby mode; t_dry = dry cycle time; L_clamp = maximum clamp stroke in the mold; c = unit cost in the USA; t_total = total cycle time; η_inj, η_reset, η_cool, η_heat, η_machine = efficiencies of injection, reset, cooling, heating, and machine power; d = depth of the part; N_day = output per day; N = number of days.
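A rough numerical sketch of two of the energy terms, using parameter values from the later slides. The melting-energy and cooling-time formulas are standard injection-molding approximations assumed here, and the wall thickness and shot volume (not preserved in the transcription) are made-up values:

```python
import math

# Parameter values from the process-parameter slide
rho   = 960.0     # density (specific gravity) of polymer, kg/m^3
cp    = 2250.0    # heat capacity, J/kg-K
hf    = 240e3     # heat of fusion, J/kg
alpha = 2.27e-7   # thermal diffusivity, m^2/s
T_inj, T_poly, T_mold, T_ej = 210.0, 30.0, 35.0, 50.0  # deg C
eta   = 0.7       # efficiency coefficient

# Assumed values: these are not preserved on the slides
h_max  = 3e-3     # maximum wall thickness, m -- assumption
V_shot = 1e-5     # total volume injected, m^3 -- assumption

# Energy to melt one shot: sensible heat plus heat of fusion, over efficiency
E_melt = rho * V_shot * (cp * (T_inj - T_poly) + hf) / eta

# Classical one-dimensional cooling-time estimate
t_cool = (h_max**2 / (math.pi**2 * alpha)) * math.log(
    4.0 * (T_inj - T_mold) / (T_ej - T_mold))

print(E_melt, t_cool)
```

In the Bayesian network, these deterministic relations become the deterministic nodes linking the calibration parameters and aleatory inputs to the energy QoI.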
17 Injection Molding: Bayesian Network
The network distinguishes five node types: constants, calibration parameters (epistemic), deterministic quantities, observations, and aleatory variables.
18 Injection Molding: Materials
Several types of materials are used in injection molding: metals, glass, elastomers, thermoplastic polymers, thermosetting polymers.
Example thermoplastics: nylon, polyethylene, polypropylene, polyvinyl chloride, polystyrene, acrylic, Teflon.
Polyethylene properties (assumed):
- Specific gravity = 960 kg/m³
- Heat capacity = 2250 J/kg-K
- Thermal diffusivity = 2.27 × 10⁻⁷ m²/s
- Shrinkage = ?
- Heat of fusion = 240 kJ/kg
19 Injection Molding: Process Parameters (Synthetic Dataset)
- Injection temperature: 210 °C
- Mold temperature: 35 °C
- Ejection temperature: 50 °C
- Injection pressure: 90 MPa
- Flow rate: 1.67 × 10⁻⁵ m³/s
- Coefficient of performance: 0.7
- All efficiency coefficients: 0.7
- Number of cavities: 1
- Fraction Δ: ?
- Thickness: ? m
- Volume of part: ? m³
- Unit cost: 10 cents/kWh
- Parts per day: 10,000
- Number of days: 28
20 Injection Molding: Synthetic Dataset
Measurement errors and variability, as Normal(mean, standard deviation):
- Error in temperature measurements (°C): N(0, 3)
- Error in cycle time measurements (s): N(0, 2)
- Error in injection pressure measurements (MPa): N(0, 4)
- Initial temperature of polymer (°C): N(30, 2)
- Density of polymer (kg/m³): N(960, 10)
- Heat capacity (J/kg-K): N(2250, 20)
- Flow rate error (m³/s): N(0, 1.5 × 10⁻⁶)
Prior distributions of the calibration parameters:
- Injection temperature (°C): Uniform(205, 220)
- Ejection temperature (°C): Uniform(45, 60)
- Mold temperature (°C): Uniform(30, 45)
- Injection pressure (MPa): Uniform(88, 95)
- Flow rate (m³/s): Uniform(1.6 × 10⁻⁵, 1.75 × 10⁻⁵)
100 data points are generated and used in the calibration process.
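Generating such synthetic calibration data might look like this. The "true" settings of 210 °C and 90 MPa follow the process-parameter table; treating them as unknowns to be recovered is the calibration setup:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100  # the slide uses 100 calibration points

# "True" process settings, taken from the process-parameter table;
# in calibration they are treated as unknown epistemic parameters.
T_inj_true = 210.0   # injection temperature, deg C
p_inj_true = 90.0    # injection pressure, MPa

# Observations = truth + zero-mean Gaussian measurement error,
# with the standard deviations listed on the slide
T_obs = T_inj_true + rng.normal(0.0, 3.0, n)
p_obs = p_inj_true + rng.normal(0.0, 4.0, n)

# Samples from the uniform prior on the injection temperature
prior_T = rng.uniform(205.0, 220.0, 10_000)
print(T_obs.mean(), p_obs.mean())
```

These observations and priors are exactly the inputs the MCMC calibration step consumes.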
21 Results: Prior Sensitivity Analysis
Based on the prior sensitivity indices, with a threshold value of 0.01 (a variable with sensitivity index < 0.01 is treated as insignificant):
- Two epistemic parameters and three aleatory variables showed significant prior effects.
- One epistemic and one aleatory variable were insignificant.
Variables retained for calibration: injection temperature, ejection temperature.
22 Results: Prior and Posterior Plots (Epistemic Variables)
Plots shown: injection temperature, ejection temperature, and energy consumed per part.
23 Results: Prior and Posterior Sensitivity Analysis
After calibration, the two significant epistemic parameters became insignificant, while one aleatory variable changed from insignificant to significant; the other aleatory variables remained significant.
Uncertainty in the two calibrated epistemic parameters (injection and ejection temperature) is greatly reduced by the calibration process (their effects change from significant to insignificant).
24 Illustrative Example: Welding Process Energy Consumption (1/2)
Goal: UQ in the energy consumption of a welding process.
Theoretical minimum energy (filler and base metal the same): the energy needed to heat the weld volume from the initial to the final temperature and melt it,
E_min = ρ V_weld [C_p (T_f − T_0) + L]
Input energy: the electrical (laser) input, obtained from the voltage, current, weld length, and welding speed.
Efficiency of the welding process: the ratio of the theoretical minimum energy to the input energy.
Legend: L_w = length of weld; ρ = density of the material; C_p = heat capacity of the material; T_f = final temperature; T_0 = initial temperature; L = latent heat; V = voltage; I = current; v = welding speed.
25 Energy Consumption in Welding Process (2/2)
The total power and the total electricity cost are computed from the total weld time, the number of days in service, and the unit cost of electricity.
Legend: t = total weld time; N = number of days in service; c = unit cost for electricity in the USA.
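A rough numerical sketch of the welding energy quantities, using slide values where available. The final temperature, weld volume, and weld speed are assumed values (the transcription does not preserve them), and the efficiency is taken as E_min / E_in per the definition above:

```python
# Slide values (steel-like material)
rho = 8238.0   # density, kg/m^3
cp  = 500.0    # heat capacity, J/kg-K
L_h = 270e3    # latent heat, J/kg (270 kJ/kg)
T0  = 303.0    # initial temperature, K

# Assumed values: not preserved in the transcription
Tf     = 1775.0            # final (melting) temperature, K -- assumption
V_weld = 2.0e-7            # weld volume, m^3 -- assumption
speed  = 500.0 / 60_000.0  # welding speed, assumed 500 mm/min, converted to m/s

# Theoretical minimum energy: heat the weld volume to melting and melt it
E_min = rho * V_weld * (cp * (Tf - T0) + L_h)

# Electrical input energy for a 500 mm weld (slide value) at 20 V, 250 A
V, I, L_weld = 20.0, 250.0, 0.5
t_weld = L_weld / speed    # total weld time, s
E_in = V * I * t_weld

efficiency = E_min / E_in
print(E_min, E_in, efficiency)
```

As in the injection molding case, these relations form the deterministic nodes of the welding Bayesian network.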
26 Welding Process: Bayesian Network
The network distinguishes five node types: constants, calibration parameters (epistemic), deterministic quantities, observations, and aleatory variables.
27 Synthetic Dataset (1/2)
Parameter values as (mean, standard deviation); entries lost in transcription are marked "?":
- Initial temperature (K): (303, ?)
- Final temperature (K): (?, 10)
- Heat capacity (J/kg-K): (500, 5)
- Density (kg/m³): (8238, 10)
- Latent heat (kJ/kg): (270, 3)
- Weld zone parameters (mm): (8.5, ?), (?, 0.5), (2, 0.1), (15, 0.5), (11, 1)
- Length of weld (mm): (500, 10)
- Voltage (V): 20
- Current (A): 250
- Weld speed (mm/min): ?
- Unit electricity cost (cents/kWh): ?
28 Synthetic Dataset (2/2)
Measurement errors:
- Error in length measurement (mm): N(0, 0.01)
- Error in current measurement (A): N(0, 2)
Observation data are generated from the parameter values above with the corresponding measurement errors added.
Prior distributions (prior on the mean; prior on the standard deviation):
- Uniform(8.3, 8.6); Uniform(0.2, 0.7)
- Uniform(2.5, 2.8); Uniform(0.3, 0.6)
- Uniform(10, 13); Uniform(0.8, 1.3)
- Uniform(235, 260); Uniform(1.7, 2.5)
The last pair corresponds to the current, with the second entry a prior on σ, the standard deviation of the measurement error of the current.
29 Results: Prior Sensitivity Analysis
With a threshold value of 0.01 on the sensitivity index:
- Of the six epistemic parameters, one had a significant prior effect; the others were insignificant, with indices of order 10⁻³ or below (e.g., 6.44 × 10⁻³).
- Of the eleven aleatory parameters, six were significant and five insignificant (some with indices of order 10⁻⁶).
30 Results: Prior and Posterior Plots (Epistemic Variables)
Plots shown: weld parameter e and theoretical energy consumption per weld.
The prior and posterior show no major change: most of the uncertain variables are aleatory, so their uncertainty cannot be reduced.
31 Results: Prior and Posterior Sensitivity Analysis
After calibration, the one significant epistemic parameter became insignificant, and one aleatory parameter also changed from significant to insignificant; all other parameters kept their prior classification (significant aleatory variables remained significant, and the small indices, of order 10⁻³ to 10⁻⁶, remained insignificant).
Uncertainty in the calibrated parameter is greatly reduced after the calibration process (its effect changes from significant to insignificant).
32 Comprehensive Framework for Uncertainty Integration and Management
The Bayesian network:
- includes all available models and data;
- includes calibration, verification, and validation results at multiple levels;
- accommodates heterogeneous data of varying precision and cost;
- accommodates models of varying complexity, accuracy, and cost;
- accommodates data on different but related systems.
It facilitates:
- Forward problem: UQ in the overall system-level prediction, integrating all available sources of information and the results of modeling, testing, and monitoring.
- Inverse problem: resource allocation at various stages of the system life cycle (model development, data collection, system design, manufacturing, operations, health monitoring, risk management).
33 Summary of Analyses
- Information retrieval: retrieve the required information from the database
- Dimension reduction: select a subset of features/parameters
- Data reduction: reduce the amount of data through clustering
- Model building: build a Bayesian network
- Uncertainty propagation: Monte Carlo simulation
- Sensitivity analysis (aleatory vs. epistemic): variance decomposition
- Performance prediction: Monte Carlo simulation
- Temporal variation: dynamic Bayesian network
- Diagnosis, prognosis: classification, particle filtering
- Decision-making, resource allocation
34 Future Work
- Derive the Bayesian network from process and model libraries
- Include text and image data, along with numerical data, in UQ
- Apply to a production network with multiple processes
- Analysis over time with streaming data (dynamic Bayesian network)
- Explore various problems in manufacturing:
  - Process monitoring (dynamic tracking) for diagnosis
  - Stochastic process optimization: adjust parameters to meet the requirements
  - Risk management: reduce performance variability to within desired limits
  - Resource allocation: maximize information, maximize reduction in uncertainty
An introduction to Bayesian statistics and model calibration and a host of related topics Derek Bingham Statistics and Actuarial Science Simon Fraser University Cast of thousands have participated in the
More informationF denotes cumulative density. denotes probability density function; (.)
BAYESIAN ANALYSIS: FOREWORDS Notation. System means the real thing and a model is an assumed mathematical form for the system.. he probability model class M contains the set of the all admissible models
More informationECE521 week 3: 23/26 January 2017
ECE521 week 3: 23/26 January 2017 Outline Probabilistic interpretation of linear regression - Maximum likelihood estimation (MLE) - Maximum a posteriori (MAP) estimation Bias-variance trade-off Linear
More informationSum-Product Networks. STAT946 Deep Learning Guest Lecture by Pascal Poupart University of Waterloo October 17, 2017
Sum-Product Networks STAT946 Deep Learning Guest Lecture by Pascal Poupart University of Waterloo October 17, 2017 Introduction Outline What is a Sum-Product Network? Inference Applications In more depth
More informationRelated Concepts: Lecture 9 SEM, Statistical Modeling, AI, and Data Mining. I. Terminology of SEM
Lecture 9 SEM, Statistical Modeling, AI, and Data Mining I. Terminology of SEM Related Concepts: Causal Modeling Path Analysis Structural Equation Modeling Latent variables (Factors measurable, but thru
More informationNonparametric Bayesian Methods (Gaussian Processes)
[70240413 Statistical Machine Learning, Spring, 2015] Nonparametric Bayesian Methods (Gaussian Processes) Jun Zhu dcszj@mail.tsinghua.edu.cn http://bigml.cs.tsinghua.edu.cn/~jun State Key Lab of Intelligent
More informationNeutron inverse kinetics via Gaussian Processes
Neutron inverse kinetics via Gaussian Processes P. Picca Politecnico di Torino, Torino, Italy R. Furfaro University of Arizona, Tucson, Arizona Outline Introduction Review of inverse kinetics techniques
More informationUncertainty quantification and management in additive manufacturing: current status, needs, and opportunities
DOI 10.1007/s00170-017-0703-5 ORIGINAL ARTICLE Uncertainty quantification and management in additive manufacturing: current status, needs, and opportunities Zhen Hu 1 & Sankaran Mahadevan 2 Received: 9
More informationLecture 2: From Linear Regression to Kalman Filter and Beyond
Lecture 2: From Linear Regression to Kalman Filter and Beyond Department of Biomedical Engineering and Computational Science Aalto University January 26, 2012 Contents 1 Batch and Recursive Estimation
More informationBayesian parameter estimation in predictive engineering
Bayesian parameter estimation in predictive engineering Damon McDougall Institute for Computational Engineering and Sciences, UT Austin 14th August 2014 1/27 Motivation Understand physical phenomena Observations
More informationIGD-TP Exchange Forum n 5 WG1 Safety Case: Handling of uncertainties October th 2014, Kalmar, Sweden
IGD-TP Exchange Forum n 5 WG1 Safety Case: Handling of uncertainties October 28-30 th 2014, Kalmar, Sweden Comparison of probabilistic and alternative evidence theoretical methods for the handling of parameter
More informationEvaluating the value of structural heath monitoring with longitudinal performance indicators and hazard functions using Bayesian dynamic predictions
Evaluating the value of structural heath monitoring with longitudinal performance indicators and hazard functions using Bayesian dynamic predictions C. Xing, R. Caspeele, L. Taerwe Ghent University, Department
More informationBayesian Machine Learning
Bayesian Machine Learning Andrew Gordon Wilson ORIE 6741 Lecture 2: Bayesian Basics https://people.orie.cornell.edu/andrew/orie6741 Cornell University August 25, 2016 1 / 17 Canonical Machine Learning
More informationPolynomial chaos expansions for structural reliability analysis
DEPARTMENT OF CIVIL, ENVIRONMENTAL AND GEOMATIC ENGINEERING CHAIR OF RISK, SAFETY & UNCERTAINTY QUANTIFICATION Polynomial chaos expansions for structural reliability analysis B. Sudret & S. Marelli Incl.
More informationApproximate Inference Part 1 of 2
Approximate Inference Part 1 of 2 Tom Minka Microsoft Research, Cambridge, UK Machine Learning Summer School 2009 http://mlg.eng.cam.ac.uk/mlss09/ Bayesian paradigm Consistent use of probability theory
More informationA graph contains a set of nodes (vertices) connected by links (edges or arcs)
BOLTZMANN MACHINES Generative Models Graphical Models A graph contains a set of nodes (vertices) connected by links (edges or arcs) In a probabilistic graphical model, each node represents a random variable,
More informationUncertainty Propagation and Global Sensitivity Analysis in Hybrid Simulation using Polynomial Chaos Expansion
Uncertainty Propagation and Global Sensitivity Analysis in Hybrid Simulation using Polynomial Chaos Expansion EU-US-Asia workshop on hybrid testing Ispra, 5-6 October 2015 G. Abbiati, S. Marelli, O.S.
More informationMarkov Chain Monte Carlo Methods for Stochastic Optimization
Markov Chain Monte Carlo Methods for Stochastic Optimization John R. Birge The University of Chicago Booth School of Business Joint work with Nicholas Polson, Chicago Booth. JRBirge U of Toronto, MIE,
More informationMachine Learning! in just a few minutes. Jan Peters Gerhard Neumann
Machine Learning! in just a few minutes Jan Peters Gerhard Neumann 1 Purpose of this Lecture Foundations of machine learning tools for robotics We focus on regression methods and general principles Often
More informationScaling up Bayesian Inference
Scaling up Bayesian Inference David Dunson Departments of Statistical Science, Mathematics & ECE, Duke University May 1, 2017 Outline Motivation & background EP-MCMC amcmc Discussion Motivation & background
More informationVariational inference
Simon Leglaive Télécom ParisTech, CNRS LTCI, Université Paris Saclay November 18, 2016, Télécom ParisTech, Paris, France. Outline Introduction Probabilistic model Problem Log-likelihood decomposition EM
More informationA BAYESIAN APPROACH FOR PREDICTING BUILDING COOLING AND HEATING CONSUMPTION
A BAYESIAN APPROACH FOR PREDICTING BUILDING COOLING AND HEATING CONSUMPTION Bin Yan, and Ali M. Malkawi School of Design, University of Pennsylvania, Philadelphia PA 19104, United States ABSTRACT This
More informationApproximate Inference Part 1 of 2
Approximate Inference Part 1 of 2 Tom Minka Microsoft Research, Cambridge, UK Machine Learning Summer School 2009 http://mlg.eng.cam.ac.uk/mlss09/ 1 Bayesian paradigm Consistent use of probability theory
More informationan introduction to bayesian inference
with an application to network analysis http://jakehofman.com january 13, 2010 motivation would like models that: provide predictive and explanatory power are complex enough to describe observed phenomena
More informationScaling Neighbourhood Methods
Quick Recap Scaling Neighbourhood Methods Collaborative Filtering m = #items n = #users Complexity : m * m * n Comparative Scale of Signals ~50 M users ~25 M items Explicit Ratings ~ O(1M) (1 per billion)
More informationApplication Note. The Optimization of Injection Molding Processes Using Design of Experiments
The Optimization of Injection Molding Processes Using Design of Experiments Problem Manufacturers have three primary goals: 1) produce goods that meet customer specifications; 2) improve process efficiency
More informationAt A Glance. UQ16 Mobile App.
At A Glance UQ16 Mobile App Scan the QR code with any QR reader and download the TripBuilder EventMobile app to your iphone, ipad, itouch or Android mobile device. To access the app or the HTML 5 version,
More informationStochastic Spectral Approaches to Bayesian Inference
Stochastic Spectral Approaches to Bayesian Inference Prof. Nathan L. Gibson Department of Mathematics Applied Mathematics and Computation Seminar March 4, 2011 Prof. Gibson (OSU) Spectral Approaches to
More informationParticle Filtering Approaches for Dynamic Stochastic Optimization
Particle Filtering Approaches for Dynamic Stochastic Optimization John R. Birge The University of Chicago Booth School of Business Joint work with Nicholas Polson, Chicago Booth. JRBirge I-Sim Workshop,
More informationRobust Monte Carlo Methods for Sequential Planning and Decision Making
Robust Monte Carlo Methods for Sequential Planning and Decision Making Sue Zheng, Jason Pacheco, & John Fisher Sensing, Learning, & Inference Group Computer Science & Artificial Intelligence Laboratory
More informationBayesian Deep Learning
Bayesian Deep Learning Mohammad Emtiyaz Khan AIP (RIKEN), Tokyo http://emtiyaz.github.io emtiyaz.khan@riken.jp June 06, 2018 Mohammad Emtiyaz Khan 2018 1 What will you learn? Why is Bayesian inference
More informationApproximate inference in Energy-Based Models
CSC 2535: 2013 Lecture 3b Approximate inference in Energy-Based Models Geoffrey Hinton Two types of density model Stochastic generative model using directed acyclic graph (e.g. Bayes Net) Energy-based
More information