Optimal State Estimation for Boolean Dynamical Systems using a Boolean Kalman Smoother

Mahdi Imani and Ulisses Braga-Neto
Department of Electrical and Computer Engineering, Texas A&M University, College Station, TX, USA
m.imani88@tamu.edu, ulisses@ece.tamu.edu

Abstract: This paper is concerned with state estimation at a fixed time point within a given time series of observations of a Boolean dynamical system. Toward this end, we introduce the Boolean Kalman Smoother, which provides an efficient algorithm to compute the optimal MMSE state estimator for this problem. Performance is investigated using a Boolean network model of the p53-MDM2 negative feedback loop gene regulatory network observed through time series of Next-Generation Sequencing (NGS) data.

Index Terms: Boolean Dynamical System, Boolean Kalman Smoother, Gene Regulatory Network

I. INTRODUCTION

Gene regulatory networks govern the functioning of key cellular processes, such as the cell cycle, stress response, and DNA repair. Boolean networks [1], [2] have emerged as an effective model of the dynamical behavior of regulatory networks, in which genes are in activated/inactivated states and their relationships are governed by networks of logical gates updated at discrete time intervals.

The Boolean Kalman Filter (BKF) [3]-[8] is an online, recursive algorithm that computes the optimal state estimator for a signal model consisting of a Boolean state process measured through a noisy observation process. The state process corresponds to a Boolean network with noisy state transitions, while the observation process is quite general. In this paper, we present an extension of the BKF to the smoothing problem, in which a fixed interval of data is acquired and available for offline processing. This algorithm, called the Boolean Kalman Smoother (BKS), bears similarities to the forward-backward algorithm used in the inference of hidden Markov models [9]. The performance of the BKS in the inference of Boolean networks is investigated using a model of the p53-MDM2 negative feedback loop network observed through next-generation sequencing data. The results indicate that, on average, the BKS has lower MSE and lower error rates than the BKF, as expected.

II. STOCHASTIC SIGNAL MODEL

Deterministic Boolean network models are unable to cope with (1) uncertainty in state transitions due to system noise and the effect of unmodeled variables, and (2) the fact that the Boolean states of a system are never observed directly, but only indirectly through expression-based technologies such as RNA-seq. This calls for a stochastic approach. We describe below the stochastic signal model for Boolean dynamical systems, first proposed in [3], which is employed here.

A. State Model

Assume that the system is described by a state process {X_k; k = 0, 1, ...}, where X_k ∈ {0,1}^d is a Boolean vector of size d. In the case of a gene regulatory network, the components of X_k represent the activation/inactivation state, at discrete time k, of the genes comprising the network. The state is assumed to be updated at each discrete time through the following nonlinear signal model:

    X_k = f(X_{k-1}, u_{k-1}) ⊕ n_k    (state model)    (1)

for k = 1, 2, ...; where u_{k-1} ∈ {0,1}^p is an input vector of dimension p at time k-1, {n_k; k = 1, 2, ...} is a white noise process with n_k = (N_{k1}, ..., N_{kd}) ∈ {0,1}^d, f : {0,1}^{d+p} → {0,1}^d is a network function, and ⊕ indicates component-wise modulo-2 addition.
In this paper, we employ a specific model for the network function that is motivated by the gene pathway diagrams commonly encountered in biomedical research. We assume that the state vector is updated according to the following equation (where the input u_{k-1} is omitted):

    (X_k)_i = f_i(X_{k-1}) = 1,  if Σ_{j=1}^{d} a_{ij} (X_{k-1})_j + b_i > 0,
                             0,  if Σ_{j=1}^{d} a_{ij} (X_{k-1})_j + b_i < 0,    (2)

for i = 1, 2, ..., d, where the parameter a_{ij} can take three values: +1 if there is positive regulation (activation) from gene j to gene i; -1 if there is negative regulation (inhibition) from gene j to gene i; and 0 if gene j is not an input to gene i. The second set of parameters, the biases b_i, can take two values: +1/2 if gene i is positively biased, in the sense that an equal number of activation and inhibition inputs activates the gene, and -1/2 if the reverse is true. Each component of the state at time k is then obtained by adding noise to the output of equation (2), as in (1). The proposed model is depicted in Figure 1.

Fig. 1: Proposed network model (without inputs).

B. Observation Model

In most real-world applications, the system state is only partially observable, and distortion is introduced into the observations by sensor noise; this is certainly the case with RNA-seq transcriptomic data. Let Y_k be the observation corresponding to the state X_k at time k. The observation Y_k is formed from the state X_k through the equation:

    Y_k = h(X_k, v_k)    (observation model)    (3)

for k = 1, 2, ...; where v_k is observation noise. Assume that Y_k = (Y_{k1}, ..., Y_{kd}) is a vector containing the RNA-seq data at time k, for k = 1, 2, .... A single-lane NGS platform is considered here, in which Y_{ki} is the read count corresponding to transcript i in the single lane, for i = 1, 2, ..., d. In this study, we choose to use a Poisson model for the number of reads for each transcript:

    P(Y_{ki} = m | λ_{ki}) = e^{-λ_{ki}} λ_{ki}^m / m!,    m = 0, 1, ...    (4)

where λ_{ki} is the mean read count of transcript i at time k. Recall that, according to the Boolean state model, there are two possible states for the abundance of transcript i at time k: high (X_{ki} = 1, active gene) and low (X_{ki} = 0, inactive gene). Accordingly, the parameter λ_{ki} is modeled as follows:

    log(λ_{ki}) = log(s) + μ_b + θ_i,          if X_{ki} = 0,
    log(λ_{ki}) = log(s) + μ_b + δ_i + θ_i,    if X_{ki} = 1,    (5)

where the parameter s is the sequencing depth, μ_b > 0 is the baseline level of expression in the inactivated transcriptional state, δ_i > 0 expresses the effect on the observed RNA-seq read count as gene i goes from the inactivated to the activated state, and θ_i is a noise parameter that models unknown and unwanted technical effects that may occur during the experiment. We assume δ_i to be Gaussian with mean μ_δ > 0 and variance σ_δ^2, common to all transcripts, where σ_δ is assumed to be small enough to keep δ_i positive. Furthermore, we assume that θ_i is zero-mean Gaussian noise with small variance σ_θ^2, common to all transcripts. Typical values for all these parameters are given in Section IV.
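For concreteness, the following is a minimal simulation sketch of the signal model (1)-(5), assuming i.i.d. Bernoulli(p) transition noise (the assumption made later in Section IV). The 4-gene wiring matrix and the numeric parameter values (s, μ_b, p, σ_θ) are illustrative placeholders, not values taken from the paper.

```python
# Minimal simulation sketch of the state model (1)-(2) and the RNA-seq
# observation model (3)-(5). Wiring and parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def f(x, a, b):
    # Network function (2): threshold the signed regulation sums.
    return (a @ x + b > 0).astype(int)

def step_state(x_prev, a, b, p):
    # State model (1): X_k = f(X_{k-1}) XOR n_k, with n_k ~ Bernoulli(p)^d.
    noise = (rng.random(x_prev.size) < p).astype(int)
    return f(x_prev, a, b) ^ noise

def observe(x, s, mu_b, delta, theta):
    # Observation model (4)-(5): Poisson read counts with log-linear means.
    lam = s * np.exp(mu_b + theta + delta * x)
    return rng.poisson(lam)

d = 4                                      # hypothetical 4-gene network
a = np.array([[ 0, -1,  1,  0],            # a_ij: regulation from gene j to i
              [ 1,  0,  0, -1],
              [ 0,  1,  0,  0],
              [-1,  0,  1,  0]])
b = np.full(d, 0.5)                        # every gene positively biased here
delta = rng.normal(3.0, np.sqrt(0.5), d)   # delta_i ~ N(mu_delta, sigma_delta^2)
theta = rng.normal(0.0, 0.1, d)            # technical effects theta_i
x = rng.integers(0, 2, d)
X, Y = [], []
for k in range(100):                       # T = 100 time points
    x = step_state(x, a, b, p=0.01)
    X.append(x)
    Y.append(observe(x, s=100.0, mu_b=0.5, delta=delta, theta=theta))
```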
III. BOOLEAN KALMAN SMOOTHER

The optimal smoothing problem consists of, given a time point 1 < S < T and data in the interval {Y_1, ..., Y_T}, finding an estimator X̂_S = g(Y_1, ..., Y_T) of the Boolean state X_S at time S that minimizes the mean-square error (MSE):

    MSE(Y_1, ..., Y_T) = E[ ‖X̂_S - X_S‖^2 | Y_1, ..., Y_T ]    (6)

at each value of {Y_1, ..., Y_T}. A recursive algorithm to solve this problem, called the (fixed-interval) Boolean Kalman Smoother (BKS), is described next. Let (x^1, ..., x^{2^d}) be an arbitrary enumeration of the possible state vectors. Define the following distribution vectors of length 2^d:

    Π_{k|k}(i)   = P(X_k = x^i | Y_k, ..., Y_1),
    Π_{k|k-1}(i) = P(X_k = x^i | Y_{k-1}, ..., Y_1),
    Δ_{k|k}(i)   = P(Y_{k+1}, ..., Y_T | X_k = x^i),
    Δ_{k|k-1}(i) = P(Y_k, ..., Y_T | X_k = x^i),    (7)

for i = 1, ..., 2^d and k = 1, ..., T, where Δ_{T|T} = 1_{2^d × 1}, by definition. Also define Π_{0|0} to be the initial (prior) distribution of the states at time zero.

Let the prediction matrix M_k, of size 2^d × 2^d, be the transition matrix of the Markov chain defined by the state model:

    (M_k)_{ij} = P(X_k = x^i | X_{k-1} = x^j) = P(n_k = x^i ⊕ f(x^j, u_{k-1})),    (8)

for i, j = 1, ..., 2^d. Additionally, given a value of the observation vector Y_k at time k, the update matrix T_k, also of size 2^d × 2^d, is a diagonal matrix defined by:

    (T_k)_{jj} = p(Y_k | X_k = x^j),    (9)

for j = 1, ..., 2^d. In the specific case of the RNA-seq data model considered in the previous section, we have:

    (T_k)_{jj} = e^{-Σ_{i=1}^{d} λ_{ji}} Π_{i=1}^{d} λ_{ji}^{Y_{ki}} / Y_{ki}!
               = ( s^{Σ_{i=1}^{d} Y_{ki}} / Π_{i=1}^{d} Y_{ki}! ) exp( Σ_{i=1}^{d} [ -s exp(μ_b + θ_i + δ_i (x^j)_i) + (μ_b + θ_i + δ_i (x^j)_i) Y_{ki} ] ),    (10)

where λ_{ji} = s exp(μ_b + θ_i + δ_i (x^j)_i), for j = 1, ..., 2^d. Finally, define the matrix A, of size d × 2^d, via A = [x^1 ··· x^{2^d}].
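Continuing the sketch above (and reusing f, a, b, d, delta, theta from it), the matrices in (8)-(10) can be built directly from their definitions; the Bernoulli(p) transition-noise assumption of Section IV makes (8) explicit, and θ_i, δ_i are treated as known here for simplicity. The update diagonal is computed only up to a positive scale factor, which is permissible since, as noted below, T_k can be scaled at will.

```python
# Sketch of the prediction matrix (8) and update-matrix diagonal (9)-(10),
# continuing the simulation sketch above (reuses f, a, b, d, delta, theta).
from itertools import product
import numpy as np

states = np.array(list(product([0, 1], repeat=d)))   # enumeration x^1..x^{2^d}
A = states.T                                         # the d x 2^d matrix A

def prediction_matrix(a, b, p):
    # (M)_{ij} = P(X_k = x^i | X_{k-1} = x^j) = P(n_k = x^i XOR f(x^j));
    # for i.i.d. Bernoulli(p) noise this is p^{#flips} (1 - p)^{d - #flips}.
    M = np.empty((len(states), len(states)))
    for j, xj in enumerate(states):
        flips = np.abs(states - f(xj, a, b)).sum(axis=1)  # Hamming distances
        M[:, j] = p ** flips * (1 - p) ** (d - flips)
    return M

def update_diag(y, s, mu_b, delta, theta):
    # Diagonal of T_k(y_k) in (10), up to a positive constant: the factor
    # s^{sum Y_ki} / prod Y_ki! is dropped, and the result is rescaled by
    # its maximum in log-space to avoid overflow.
    log_lam = np.log(s) + mu_b + theta + delta * states  # 2^d x d
    log_t = (-np.exp(log_lam) + log_lam * y).sum(axis=1)
    return np.exp(log_t - log_t.max())
```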

The following result, given here without proof, provides a recursive algorithm to compute the optimal MMSE state estimator.

Theorem 1 (Boolean Kalman Smoother). The optimal minimum-MSE estimator X̂_S of the state X_S given the observations Y_1, ..., Y_T, where 1 < S < T, is given by:

    X̂_S = \overline{E[X_S | Y_1, ..., Y_T]},    (11)

where \overline{v}(i) = I_{v(i) > 1/2} for i = 1, ..., d, i.e., component-wise thresholding at 1/2. This estimator and its optimal MSE can be computed by the following procedure.

Forward Estimator:
1) Initialization Step: Π_{1|0} = M_1 Π_{0|0}.
For k = 1, 2, ..., S-1, do:
2) Update Step: β_k = T_k(y_k) Π_{k|k-1}.
3) Normalization Step: Π_{k|k} = β_k / ‖β_k‖_1.
4) Prediction Step: Π_{k+1|k} = M_{k+1} Π_{k|k}.

Backward Estimator:
1) Initialization Step: Δ_{T|T-1} = T_T(y_T) 1_{2^d × 1}.
For k = T-1, T-2, ..., S, do:
2) Prediction Step: Δ_{k|k} = M_{k+1}^T Δ_{k+1|k}.
3) Update Step: Δ_{k|k-1} = T_k(y_k) Δ_{k|k}.

Smoothed Distribution Vector:

    Π_{S|T} = (Π_{S|S-1} ∘ Δ_{S|S-1}) / ‖Π_{S|S-1} ∘ Δ_{S|S-1}‖_1,

where ∘ denotes component-wise vector multiplication.

MMSE Estimator: The MMSE estimator is given by:

    X̂_S = \overline{A Π_{S|T}},    (12)

with optimal conditional MSE

    MSE(Y_1, ..., Y_T) = ‖min{A Π_{S|T}, (A Π_{S|T})^c}‖_1,    (13)

where the minimum is applied component-wise and v^c(i) = 1 - v(i), for i = 1, ..., d.

Estimation of the state at time k = T requires only the forward estimation steps, in which case the Boolean Kalman Smoother reduces to the Boolean Kalman Filter (BKF), introduced in [3]. The normalization step in the forward estimator is not strictly necessary and can be skipped by letting Π_{k|k} = β_k (though in this case the meaning of the vectors Π_{k|k} and Π_{k+1|k} changes, of course). In addition, the matrix T_k can be scaled at will; in particular, the constant s^{Σ_{i=1}^{d} Y_{ki}} / Π_{i=1}^{d} Y_{ki}! in (10) can be dropped, which results in significant computational savings.
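The following is a direct transcription of the procedure of Theorem 1, continuing the sketches above. It assumes a time-invariant prediction matrix M (as in the numerical experiment below) and a uniform prior Π_{0|0}, both assumptions made here for illustration; the rescaling of the update diagonal and of the backward vector is harmless because Π_{S|T} is normalized at the end.

```python
# Sketch of the Boolean Kalman Smoother of Theorem 1 (time-invariant M and
# uniform prior assumed). Y is the list [y_1, ..., y_T]; obs_diag(y) returns
# the (rescaled) diagonal of T_k(y), e.g. update_diag above.
def boolean_kalman_smoother(Y, S, M, obs_diag):
    T, n = len(Y), M.shape[0]
    # Forward estimator: Pi_{1|0} = M Pi_{0|0}.
    pi = M @ np.full(n, 1.0 / n)
    for k in range(S - 1):                  # time steps k = 1, ..., S-1
        beta = obs_diag(Y[k]) * pi          # update: T_k(y_k) Pi_{k|k-1}
        pi = M @ (beta / beta.sum())        # normalize, then predict
    # pi is now Pi_{S|S-1}.
    # Backward estimator: Delta_{T|T-1} = T_T(y_T) 1.
    bwd = obs_diag(Y[T - 1])
    for k in range(T - 2, S - 2, -1):       # time steps k = T-1, ..., S
        bwd = obs_diag(Y[k]) * (M.T @ bwd)  # predict with M^T, then update
        bwd = bwd / bwd.sum()               # rescaling allowed for stability
    # bwd is now Delta_{S|S-1} (up to scale).
    post = pi * bwd
    post /= post.sum()                      # smoothed vector Pi_{S|T}
    cond_mean = A @ post                    # E[X_S | Y_1, ..., Y_T]
    x_hat = (cond_mean > 0.5).astype(int)   # MMSE estimator (12)
    mse = np.minimum(cond_mean, 1 - cond_mean).sum()  # conditional MSE (13)
    return x_hat, mse

# Example usage with the sketches above:
M = prediction_matrix(a, b, p=0.01)
lik = lambda y: update_diag(y, s=100.0, mu_b=0.5, delta=delta, theta=theta)
x_hat, mse = boolean_kalman_smoother(Y, S=50, M=M, obs_diag=lik)
```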

IV. NUMERICAL EXPERIMENT

In this section, we conduct a numerical experiment using a Boolean network based on the well-known pathway for the p53-MDM2 negative feedback system, shown in Figure 2. We consider the input to be either no stress (dna_dsb = 0) or DNA damage (dna_dsb = 1), separately.

Fig. 2: Activation/inactivation diagram for the p53-MDM2 negative feedback loop (genes: ATM, p53, Wip1, Mdm2; input: dna_dsb).

The process noise is assumed to have independent components distributed as Bernoulli(p), where the noise parameter p gives the amount of perturbation to the Boolean state process; it is categorized into two levels, p = 0.01 (small noise) and p = 0.1 (large noise). On the other hand, σ_θ^2 (see Section II-B) is the technical-effect variance, which specifies the uncertainty of the observations; it is likewise categorized into two levels, σ_θ^2 = 0.01 (clean observations) and σ_θ^2 = 0.1 (noisy observations). Two sequencing depths s are considered for the observations in the simulations, corresponding to 1K-50K and 50K-100K reads, respectively. The parameters δ_i are generated from a Gaussian distribution with mean μ_δ = 3 and variance σ_δ^2 = 0.5. The baseline expression μ_b is set to a fixed value common to all transcripts.

Figure 3 displays the average MSE achieved by the BKS at a fixed time point, as well as by the BKF, for T = 100 observations, over 1000 independent runs.

Fig. 3: Average MSE of the BKF and the BKS over 1000 runs (average MSE versus time).

It is seen that the BKS has smaller MSE on average than the BKF, as expected, since the BKS uses future observations, whereas the BKF uses only the observations up to the present time. Furthermore, the average MSE of both methods is higher in the presence of large noise.

Next, the performance of the BKS and the BKF is examined for different noise values and numbers of reads. The average performance of each method is quantified here by the rate of correctly estimated states (over the signal length T = 100 and 1000 runs), which is presented in Table I. The results show that the average performance of the BKS is higher than that of the BKF. Furthermore, the performance of both methods decreases as the number of reads decreases or the noise increases. In addition, when the input dna_dsb is 0 (no DNA damage), the performance of both methods is better than when the input is 1 (DNA damage). The reason can be found by examining the state transitions of the p53-MDM2 network under both inputs: when the input is 0, the network has one singleton attractor (0000), while for input 1 there are two cyclic attractors, one containing two states and the other containing six states. In the presence of cyclic or multiple attractors, the changing state trajectories make the estimation process more difficult.

TABLE I: Performance of the BKF and the BKS.

    Noise parameters           Reads       dna_dsb = 0        dna_dsb = 1
                                           BKF      BKS       BKF      BKS
    p = 0.01, σ_θ^2 = 0.01     1K-50K
                               50K-100K
    p = 0.1,  σ_θ^2 = 0.01     1K-50K
                               50K-100K
    p = 0.01, σ_θ^2 = 0.1      1K-50K
                               50K-100K
    p = 0.1,  σ_θ^2 = 0.1      1K-50K
                               50K-100K

V. CONCLUSION

This paper introduced a method for the inference of gene regulatory networks based on a novel algorithm, called the Boolean Kalman Smoother, which efficiently computes the optimal state estimator for discrete-time Boolean dynamical systems given the entire history of observations. The smoother combines two estimators: a forward estimator, which incorporates the past observations, and a backward estimator, which incorporates the future observations, in a process that bears similarities to the forward-backward algorithm commonly applied to the inference of hidden Markov models [9]. The method was illustrated by application to the p53-MDM2 negative feedback network observed through next-generation sequencing data. The results indicate that, on average, the BKS has lower MSE and lower error rates than the BKF.

ACKNOWLEDGMENT

The authors acknowledge the support of the National Science Foundation, through NSF award CCF.

REFERENCES

[1] S. Kauffman, "Metabolic stability and epigenesis in randomly constructed genetic nets," Journal of Theoretical Biology, vol. 22, no. 3, pp. 437-467, 1969.
[2] S. Kauffman, The Origins of Order: Self-Organization and Selection in Evolution. Oxford University Press, 1993.
[3] U. Braga-Neto, "Optimal state estimation for Boolean dynamical systems," Proceedings of the 45th Annual Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, 2011.
[4] U. Braga-Neto, "Joint state and parameter estimation for Boolean dynamical systems," Proceedings of the IEEE Statistical Signal Processing Workshop (SSP'12), Ann Arbor, MI, 2012.
[5] U. Braga-Neto, "Particle filtering approach to state estimation in Boolean dynamical systems," Proceedings of the IEEE Global Conference on Signal and Image Processing (GlobalSIP'13), Austin, TX, 2013.
[6] A. Bahadorinejad and U. Braga-Neto, "Optimal fault detection in stochastic Boolean regulatory networks," Proceedings of the IEEE International Workshop on Genomic Signal Processing and Statistics (GENSIPS 2014), Atlanta, GA, 2014.
[7] A. Bahadorinejad and U. Braga-Neto, "Optimal fault detection and diagnosis in transcriptional circuits using next-generation sequencing," IEEE/ACM Transactions on Computational Biology and Bioinformatics, preprint.
[8] M. Imani and U. Braga-Neto, "Optimal gene regulatory network inference using the Boolean Kalman filter and multiple model adaptive estimation," Proceedings of the 49th Annual Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, 2015.
[9] L. Rabiner, "A tutorial on hidden Markov models and selected applications in speech recognition," Proceedings of the IEEE, vol. 77, no. 2, pp. 257-286, 1989.
