1 The algorithm for variable-order linear-chain CRFs

In this section, we introduce three algorithms that constitute the main part of this paper. One is the Sum-Difference algorithm, which is used to calculate the expectations of the feature functions in order to estimate the parameters. This algorithm uses sum scores and difference scores when calculating the forward and backward scores; this way, we can apply dynamic programming. The other two are decoding algorithms used to infer the best labels for a sequence of observations. One processes the sequence from head to tail and the other from tail to head.

1.1 Notations

We introduce the notations used in this paper. $T$ denotes the length of a sequence. $x$ and $y$ represent respectively a sequence of observations of length $T$ and a sequence of labels of length $T+2$: $y$ has positions $0$ and $T+1$, which represent respectively the beginning and the ending of a sequence and carry the labels $l_{BOS}$ and $l_{EOS}$. Suppose we have a set of labels $Y = \{l_1, \dots, l_L\}$. We define the set of labels $Y_t$ at a particular position $t$ as follows:

    $Y_t = \{l_{BOS}\}$ if $t \le 0$;  $Y_t = \{l_{EOS}\}$ if $t \ge T+1$;  $Y_t = Y$ otherwise.   (1)

We write $z \in Y_{n:m}$ when a label sequence $z$ satisfies $|z| = m - n + 1$ and $\forall t : 1 \le t \le m - n + 1,\ z_t \in Y_{n+t-1}$. For a label sequence $z$, we define $z_{i:j} = z_i \cdots z_j$. When $j < i$, let $z_{i:j} = \epsilon$, where $\epsilon$ represents the empty string. When two label sequences $z^1$ and $z^2$ are concatenated into $z^3$, we write $z^3 = z^1 + z^2$. When a label sequence $z^1$ is a suffix of $z^2$, we write $z^1 \preceq_s z^2$, and otherwise $z^1 \not\preceq_s z^2$. For any label sequence $z$, $\epsilon \preceq_s z$. When $z^1$ is a proper suffix of $z^2$, we write $z^1 \prec_s z^2$. The following two statements can be trivially derived:

    $z^1 \prec_s z^2,\ z^1 \ne \epsilon \implies z^1_{1:|z^1|-1} \prec_s z^2_{1:|z^2|-1}$   (2)

    $z^1 \prec_s z^3,\ z^2 \prec_s z^3 \implies z^1 \prec_s z^2 \text{ or } z^2 \preceq_s z^1$   (3)

We define $s(z^1, S)$ as follows:

    $s(z^1, S) = z^2$ if and only if $z^2 \in S$ and $z^2 \prec_s z^1$ and $\forall z \in S\,(z \prec_s z^1 \implies z \preceq_s z^2)$   (4)

and call it the longest proper suffix of $z^1$ with respect to $S$. Note that $z^1$ may or may not be an element of $S$.
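The suffix relations and the longest proper suffix of Section 1.1 can be illustrated with a minimal Python sketch (hypothetical helper names; label sequences are represented as tuples, and `None` stands in for the imaginary sequence $\bot$):

```python
def is_suffix(z1, z2):
    """z1 is a suffix of z2 (the empty tuple is a suffix of everything)."""
    return len(z1) <= len(z2) and (len(z1) == 0 or z2[-len(z1):] == z1)

def is_proper_suffix(z1, z2):
    """z1 is a proper suffix of z2: a suffix that is strictly shorter."""
    return len(z1) < len(z2) and is_suffix(z1, z2)

def longest_proper_suffix(z1, S):
    """s(z1, S): the longest element of S that is a proper suffix of z1.
    Returns None (standing in for the imaginary sequence) if none exists."""
    candidates = [z for z in S if is_proper_suffix(z, z1)]
    return max(candidates, key=len) if candidates else None
```

For example, with $S = \{(b,c), (c), \epsilon\}$, the longest proper suffix of $(a,b,c)$ with respect to $S$ is $(b,c)$.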

Similarly, we define $S(z^1, S)$ as follows:

    $S(z^1, S) = z^2$ if and only if $z^2 \in S$ and $z^2 \preceq_s z^1$ and $\forall z \in S\,(z \preceq_s z^1 \implies z \preceq_s z^2)$   (5)

and call it the longest suffix of $z^1$ with respect to $S$. $S(z, S) = z$ if and only if $z$ is an element of $S$. Let $\bot$ be an imaginary label sequence that never equals any other label sequence, and let $\bot_n$ be such a label sequence of length $n$. For any set $S$, let $s(\epsilon, S) = \bot$.

Let the feature functions be $f_1, \dots, f_M$. Each feature function is a binary function that takes three arguments (observations, label sequence, position) and can be expressed as a product of two binary functions $b_i(x, t)$, which we call the observation function, and $L_i(y, t)$, which we call the label function:

    $f_i(x, y, t) = b_i(x, t)\, L_i(y, t)$   (6)

Each $L_i$ is associated with a label sequence $z^i$ and is defined as follows:

    $L_i(y, t) = 1$ if $y_{t-|z^i|+1:t} = z^i$, and $0$ otherwise   (7)

We also define $L'_i(y_s, t)$ and $L''_i(z)$ as alternate representations of the label function, where $y_s$ is a suffix of a label sequence $y$ and $z$ is an arbitrary label sequence. We use zero-based indexes for $y_s$:

    $L'_i(y_s, t) = 1$ if $t + |y_s| - |y| \ge |z^i|$ and $y_{s,\,t+|y_s|-|y|-|z^i|+1\,:\,t+|y_s|-|y|} = z^i$, and $0$ otherwise   (8)

    $L''_i(z) = 1$ if $|z| \ge |z^i|$ and $z_{|z|-|z^i|+1:|z|} = z^i$, and $0$ otherwise   (9)

We define $f'_i$ and $f''_i$ as the products of $b_i$ with $L'_i$ and $L''_i$ respectively:

    $f'_i(x, y_s, t) = b_i(x, t)\, L'_i(y_s, t)$   (10)

    $f''_i(x, z, t) = b_i(x, t)\, L''_i(z)$   (11)

When we define $f_i$, $f'_i$ and $f''_i$ as above, the following can be said:

    $f_i(x, y, t) = f''_i(x, y_{t-|z^i|+1:t}, t)$   (13)

    $f'_i(x, y_s, t) = f''_i(x, y_{s,\,t+|y_s|-|y|-|z^i|+1\,:\,t+|y_s|-|y|}, t)$   (14)
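Equations (6)-(7) can be illustrated with a short Python sketch (a hypothetical stand-in, not the paper's implementation: `b_i` is an arbitrary binary observation test, and label sequences are compared as tuples):

```python
def make_feature(b_i, z_i):
    """Build a binary feature f_i(x, y, t) = b_i(x, t) * L_i(y, t) per eq. (6)."""
    def f_i(x, y, t):
        # L_i(y, t): the |z^i| labels ending at position t must equal z^i (eq. 7)
        if t + 1 < len(z_i):
            return 0
        label_ok = tuple(y[t - len(z_i) + 1 : t + 1]) == tuple(z_i)
        return int(b_i(x, t) and label_ok)
    return f_i
```

For instance, a feature associated with the label bigram $(N, V)$ fires at a position $t$ exactly when the labels at $t-1$ and $t$ are $N$ and $V$ and the observation test passes.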

1.2 Training of High-order CRFs

Here is the expected sum of $f_i$ based on the current parameters:

    $E[f_i] = \sum_{x} \tilde{P}(x) \sum_{y} P(y \mid x) \sum_{t=|z^i|}^{T+1} f_i(x, y, t)$   (15)

We will need the expectations of the $f_i$ later when we update the parameters. Here we introduce a way to calculate them.

1.2.1 Patterns

We define $P_t$, the pattern set at the position $t$, as follows:

    $P_t = Y_t \cup \{\epsilon\} \cup \bigcup_{t' \ge t} \{\, z^k_{1:|z^k|-(t'-t)} \mid b_k(x, t') = 1,\ |z^k| > t'-t \,\}$   (16)

We call each element of $P_t$ a pattern at the position $t$. A pattern is either one of the label sequences that are associated with the feature functions, or a prefix of another pattern at a position $t' > t$ with the last $t'-t$ labels removed. When a label sequence $y$ satisfies $y_{t-|z_p|+1:t} = z_p$ for a pattern $z_p$ at a position $t$, we say that $y$ contains $z_p$ at the position $t$. When we define the pattern sets as in (16), the following can be said:

    $z \in P_t,\ z \ne \epsilon,\ t > 0 \implies z_{1:|z|-1} \in P_{t-1}$   (17)

We can also say the following:

    $f''_i(x, z, t) = f''_i(x, S(z, P_t), t)$   (18)

Assume this is not true, so that there is a feature function with $f''_i(x, S(z, P_t), t) = 0$ and $f''_i(x, z, t) = 1$ for some $S(z, P_t) \prec_s z^i \preceq_s z$. This means that $z^i \in P_t$, which contradicts the definition of the longest suffix given in (5).

1.2.2 Scores of Patterns

Considering positions $0$ and $T+1$ as the beginning and ending of a label sequence, the conditional probability distribution of a linear-chain CRF is defined as $P(y \mid x) = Z_x(y)/Z(x)$, where

    $Z_x(y) = \exp\Bigl(\sum_{i=1}^{M} \sum_{t=|z^i|}^{T+1} f_i(x, y, t)\, \lambda_i\Bigr)$ and $Z(x) = \sum_{y} Z_x(y)$.

We call $Z_x(y)$ the score of $y$. Let $\sigma(z_p, t)$ denote the sum of the scores of the label sequences $y$ that contain $z_p$ at a position $t$. It can be expressed in the following way:

    $\sigma(z_p, t) = \sum_{y:\, y \in Y_{0:T+1},\, y_{t-|z_p|+1:t} = z_p} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=1}^{T+1} f_i(x, y, t')\, \lambda_i\Bigr)$   (19)

Unlike first-order linear-chain CRFs, we cannot directly decompose this using forward and backward scores. Since the fact that a label sequence $y$ contains

a pattern $z_p$ at a position $t$ does not necessarily mean that it is the longest pattern that $y$ contains at that position, the backward scores cannot be determined. So we introduce supplementary variables $\theta(z_p, t)$ that denote the sum of the scores of the label sequences $y$ that contain a pattern $z_p \in P_t$ at a position $t$ and do not contain any longer patterns at the same position, i.e. do not contain any patterns that have $z_p$ as their longest proper suffix:

    $\theta(z_p, t) = \sum_{\substack{y:\, y \in Y_{0:T+1},\, y_{t-|z_p|+1:t} = z_p, \\ \forall z \in P_t:\, z_p \prec_s z \implies y_{t-|z|+1:t} \ne z}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=1}^{T+1} f_i(x, y, t')\, \lambda_i\Bigr)$   (20)

When we define $\theta(z_p, t)$ like this, we can express $\sigma(z_p, t)$ as a sum of $\theta$'s:

    $\sigma(z_p, t) = \sum_{z:\, z \in P_t,\, z_p \preceq_s z} \theta(z, t)$   (21)

By the definition of a pattern given in (16), if a label sequence $y$ contains $z_p$ at a position $t$ and does not contain any longer patterns there, we know that no feature function fired at or after $t$ is affected by the labels before $t - |z_p| + 1$. So we can decompose $\theta(z_p, t)$ into two parts, the forward part ($1 \le t' \le t$) and the backward part ($t+1 \le t' \le T+1$); the forward part can also be represented in the form of a subtraction. Let $\alpha(z_p, t)$ and $\beta(z_p, t)$ denote the forward and backward parts respectively:

    $\theta(z_p, t) = \Bigl[\sum_{\substack{z:\, z \in Y_{0:t},\, z_{t-|z_p|+1:t} = z_p, \\ \forall z' \in P_t:\, s(z', P_t) = z_p \implies z_{t-|z'|+1:t} \ne z'}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=1}^{t} f''_i(x, z_{1:t'}, t')\, \lambda_i\Bigr)\Bigr] \cdot \Bigl[\sum_{z:\, z \in Y_{t+1:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+1}^{T+1} f'_i(x, z_p + z, t')\, \lambda_i\Bigr)\Bigr] = \alpha(z_p, t)\, \beta(z_p, t)$   (22)

1.2.3 Pattern Weights

For a pattern $z_p \in P_t$ and a position $t$, we define $w(z_p, t)$ as follows:

    $w(z_p, t) = \sum_{i:\, z^i = z_p,\, b_i(x, t) = 1} \lambda_i$   (23)
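The relation (21) between $\sigma$ and $\theta$ is a plain sum over suffix-extensions, which a small Python sketch makes concrete (hypothetical data layout: `theta` is a dict keyed by pattern tuples at a fixed position $t$):

```python
def sigma(z_p, P_t, theta):
    """sigma(z_p, t) = sum of theta(z, t) over all z in P_t having z_p as a suffix."""
    return sum(theta[z] for z in P_t
               if len(z) >= len(z_p)
               and (len(z_p) == 0 or z[-len(z_p):] == z_p))
```

With `theta = {('a',): 2.0, ('b','a'): 3.0, ('b',): 1.0}`, the score of the pattern $(a)$ collects $\theta((a))$ and $\theta((b,a))$, while $\sigma(\epsilon, t)$ collects everything.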

This is the sum of the weights of the feature functions associated with the pattern $z_p$ at the position $t$. We also define $W(z_p, t)$ for a pattern $z_p \in P_t$ and a position $t$ as follows:

    $W(z_p, t) = \sum_{i:\, z^i \preceq_s z_p,\, b_i(x, t) = 1} \lambda_i = \sum_{i=1}^{M} f''_i(x, z_p, t)\, \lambda_i$   (24)

This is the sum of the weights of the feature functions associated with the pattern $z_p$ or any of its suffixes at the position $t$. From (24) and (23), we can express $W$ using $w$ as follows:

    $W(z_p, t) = \sum_{i:\, z^i \in P_t,\, z^i \preceq_s z_p} w(z^i, t)$   (25)

When we define $W$ and $w$ as above, we can say that for any $z_p \ne \epsilon$,

    $W(z_p, t) = W(s(z_p, P_t), t) + w(z_p, t)$

1.2.4 The Forward Variables

As shown in (22), $\alpha(z_p, t)$ is defined for a pattern $z_p$ and a position $t \ge 1$ as follows:

    $\alpha(z_p, t) = \sum_{\substack{z:\, z \in Y_{0:t},\, z_{t-|z_p|+1:t} = z_p, \\ \forall z' \in P_t:\, s(z', P_t) = z_p \implies z_{t-|z'|+1:t} \ne z'}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=1}^{t} f''_i(x, z_{1:t'}, t')\, \lambda_i\Bigr)$   (26)

Let $\alpha(\epsilon, t) = 0$ for any $t$. In order to calculate these using dynamic programming, we also define $\gamma(z_p, t)$ for a pattern $z_p \in P_t$ and a position $t$ as follows:

    $\gamma(z_p, t) = \sum_{z:\, z \in Y_{0:t},\, z_{t-|z_p|+1:t} = z_p} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=1}^{t} f''_i(x, z_{1:t'}, t')\, \lambda_i\Bigr)$   (27)

When we define $\alpha$ and $\gamma$ like this, we can say the following using (24):

    $\alpha(z_p, t) = \Bigl(\gamma(z_{p,1:|z_p|-1}, t-1) - \sum_{z:\, z \in P_t,\, s(z, P_t) = z_p} \gamma(z_{1:|z|-1}, t-1)\Bigr) \exp\bigl(W(z_p, t)\bigr)$   (28)

    $\gamma(z_p, t) = \sum_{z:\, z \in P_t,\, z_p \preceq_s z} \alpha(z, t)$   (29)
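The recursion $W(z_p, t) = W(s(z_p, P_t), t) + w(z_p, t)$ computes all cumulative weights in one pass over the patterns in ascending (suffix-first) order. A minimal sketch, under the assumption that the longest-proper-suffix links have already been built (`suffix_link` maps a pattern to $s(z, P_t)$, or `None` for $\bot$):

```python
def cumulative_weights(patterns_ascending, w, suffix_link):
    """W(z, t) for a fixed position t, per eq. (25)/(26): accumulate w along
    longest-proper-suffix links, visiting shorter suffixes first."""
    W = {(): 0.0}  # the empty pattern carries no feature weight
    for z in patterns_ascending:
        s = suffix_link.get(z)
        W[z] = (W[s] if s is not None else 0.0) + w.get(z, 0.0)
    return W
```

Visiting suffixes first guarantees that $W(s(z, P_t), t)$ is already available when $z$ is processed, which is exactly why the algorithms below iterate in ascending order when filling $W$.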

1.2.5 The Backward Variables

Here we rewrite the definition of $\beta(z_p, t)$ given in (22) using the $f''_i$ defined in (11):

    $\beta(z_p, t) = \sum_{z:\, z \in Y_{t+1:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+1}^{T+1} f'_i(x, \bot_{t-|z_p|} + z_p + z, t')\, \lambda_i\Bigr) = \sum_{z:\, z \in Y_{t+1:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+1}^{T+1} f'_i(x, z_p + z, t')\, \lambda_i\Bigr)$   (30)

In order to calculate these using dynamic programming, we also define $\delta(z_p, t)$ for a pattern $z_p \in P_t$ and a position $t$ as follows:

    $\delta(z_p, t) = \begin{cases} \displaystyle\sum_{z \in Y_{t+1:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+1}^{T+1} f'_i(x, z, t')\, \lambda_i\Bigr) & \text{if } z_p = \epsilon \\[2ex] \displaystyle\sum_{z \in Y_{t+1:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+1}^{T+1} f'_i(x, z_p + z, t')\, \lambda_i\Bigr) - \sum_{z \in Y_{t+1:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+1}^{T+1} f'_i(x, s(z_p, P_t) + z, t')\, \lambda_i\Bigr) & \text{otherwise} \end{cases}$   (31)

When we define $\beta$ and $\delta$ like this, we can say the following:

    $\delta(z_p, t) = \begin{cases} \displaystyle\sum_{z:\, z \in P_{t+1},\, z_{1:|z|-1} = z_p} \beta(z, t+1)\, \exp\bigl(W(z, t+1)\bigr) & \text{if } z_p = \epsilon \\[2ex] \displaystyle\sum_{z:\, z \in P_{t+1},\, z_{1:|z|-1} = z_p} \Bigl(\beta(z, t+1)\, \exp\bigl(W(z, t+1)\bigr) - \beta(s(z, P_{t+1}), t+1)\, \exp\bigl(W(s(z, P_{t+1}), t+1)\bigr)\Bigr) & \text{otherwise} \end{cases}$   (32)

    $\beta(z_p, t) = \sum_{z:\, z \in P_t,\, z \preceq_s z_p} \delta(z, t)$   (33)

Here we present the derivation of (32) for $z_p \ne \epsilon$, which is not straightforward. First, we rewrite the right-hand side of (31) by splitting off the label $z' \in Y_{t+1}$ at the position $t+1$:

    $\sum_{z' \in Y_{t+1}} \sum_{z \in Y_{t+2:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+2}^{T+1} f'_i(x, z_p + z' + z, t')\, \lambda_i\Bigr) \exp\Bigl(\sum_{i=1}^{M} f'_i(x, z_p + z', t+1)\, \lambda_i\Bigr) - \sum_{z' \in Y_{t+1}} \sum_{z \in Y_{t+2:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+2}^{T+1} f'_i(x, s(z_p, P_t) + z' + z, t')\, \lambda_i\Bigr) \exp\Bigl(\sum_{i=1}^{M} f'_i(x, s(z_p, P_t) + z', t+1)\, \lambda_i\Bigr)$   (34)

Using equation (18), we can rewrite (34) as follows:

    $\sum_{z' \in Y_{t+1}} \sum_{z \in Y_{t+2:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+2}^{T+1} f'_i(x, S(z_p + z', P_{t+1}) + z, t')\, \lambda_i\Bigr) \exp\Bigl(\sum_{i=1}^{M} f''_i(x, S(z_p + z', P_{t+1}), t+1)\, \lambda_i\Bigr) - \sum_{z' \in Y_{t+1}} \sum_{z \in Y_{t+2:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+2}^{T+1} f'_i(x, S(s(z_p, P_t) + z', P_{t+1}) + z, t')\, \lambda_i\Bigr) \exp\Bigl(\sum_{i=1}^{M} f''_i(x, S(s(z_p, P_t) + z', P_{t+1}), t+1)\, \lambda_i\Bigr)$   (35)

Here we can say the following:

    $s(z_p + z', P_{t+1}) \preceq_s s(z_p, P_t) + z'$   (36)

Assume (36) is not true. By the definition of the longest proper suffix in (4), $s(z_p + z', P_{t+1}) \prec_s z_p + z'$ and $s(z_p, P_t) + z' \prec_s z_p + z'$. From (3) and the assumption that (36) is not true, $s(z_p, P_t) + z' \prec_s s(z_p + z', P_{t+1})$. Let $z = s(z_p + z', P_{t+1})$. Since $z \in P_{t+1}$ and $z \ne \epsilon$, we can derive $z_{1:|z|-1} \in P_t$ from (17), and $s(z_p, P_t) \prec_s z_{1:|z|-1} \prec_s z_p$ from (2). This contradicts the definition of the longest proper suffix in (4). From the definition of the longest proper suffix given in (4), we can also say that there is no $z \in P_{t+1}$ that satisfies $s(z_p + z', P_{t+1}) \prec_s z \prec_s z_p + z'$. From the above argument, we can say the following:

    $S(s(z_p, P_t) + z', P_{t+1}) = s(z_p + z', P_{t+1})$   (37)

When $z_p + z' \notin P_{t+1}$, we can say the following from (4) and (5):

    $S(z_p + z', P_{t+1}) = s(z_p + z', P_{t+1})$   (38)

So in (35), for $z_p + z' \notin P_{t+1}$, we can cancel out the left and right sides of the subtraction and rewrite it as follows:

    $\sum_{\substack{z':\, z' \in P_{t+1}, \\ z'_{1:|z'|-1} = z_p}} \sum_{z \in Y_{t+2:T+1}} \Bigl[\exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+2}^{T+1} f'_i(x, z' + z, t')\, \lambda_i\Bigr) \exp\Bigl(\sum_{i=1}^{M} f''_i(x, z', t+1)\, \lambda_i\Bigr) - \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=t+2}^{T+1} f'_i(x, s(z', P_{t+1}) + z, t')\, \lambda_i\Bigr) \exp\Bigl(\sum_{i=1}^{M} f''_i(x, s(z', P_{t+1}), t+1)\, \lambda_i\Bigr)\Bigr]$   (39)

Applying the definition of $\beta$ given in (30) and that of $W$ given in (24), we can derive equation (32).

Now the $\alpha$'s at a position $t$ can be calculated using the $\gamma$'s at the position $t-1$, and the $\gamma$'s using the $\alpha$'s at the same position; the $\beta$'s at a position $t$ can be calculated using the $\delta$'s at the position $t+1$, and the $\delta$'s using the $\beta$'s at the same position. With $\alpha(l_{BOS}, 0) = 1$ and $\delta(\epsilon, T+1) = 1$, we can therefore calculate all the $\alpha$'s, $\beta$'s, $\gamma$'s and $\delta$'s using dynamic programming. We can then calculate the $\theta$'s using the $\alpha$'s and $\beta$'s, and then the $\sigma$'s using the $\theta$'s. The normalization factor $Z(x)$ can be expressed as $\gamma(\epsilon, T+1)$, because:

    $\gamma(\epsilon, T+1) = \sum_{z:\, z \in Y_{0:T+1}} \exp\Bigl(\sum_{i=1}^{M} \sum_{t'=1}^{T+1} f''_i(x, z_{1:t'}, t')\, \lambda_i\Bigr) = \sum_{y} Z_x(y) = Z(x)$   (40)

So we can now calculate the expectation of each feature function as follows:

    $E_P[f_i \mid x] = \sum_{t=1}^{T+1} \sigma(z^i, t) / Z(x)$   (41)
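The final step (41) is a straightforward accumulation once the $\sigma$'s and $Z(x)$ are available; a hedged Python sketch (hypothetical data layout: `features` maps a feature index to its label sequence $z^i$, and `sigma` is a dict keyed by (pattern, position)):

```python
def expectations(features, sigma, Z, T):
    """E[f_i | x] = sum over positions t = 1..T+1 of sigma(z^i, t) / Z(x), per eq. (41)."""
    E = {}
    for i, z_i in features.items():
        E[i] = sum(sigma.get((z_i, t), 0.0) for t in range(1, T + 2)) / Z
    return E
```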

1.2.6 The algorithm to calculate the variables

Here we present the algorithm that calculates the variables introduced in the previous section efficiently. First we enumerate the patterns that belong to $P_t$ at the positions $t = 1, \dots, T+1$. We also enumerate the feature functions whose values equal $1$ at the position $t$ if a label sequence contains the pattern $z_p$ at that position, and make a set $F_{z_p,t}$ which has them as elements. This procedure can be described as in Algorithm 1.

Algorithm 1 Enumerate patterns
  for t = 1 to T+1 do
    F ← {f_k | b_k(x, t) = 1}
    for all f_i ∈ F do
      F_{z^i,t} ← F_{z^i,t} ∪ {f_i}
      for j = 0 to |z^i| − 1 do
        P_{t−j} ← P_{t−j} ∪ {z^i_{1:|z^i|−j}}

We can easily link a pattern $z \ne \epsilon \in P_t$ to its corresponding prefix pattern, which we here define as $z_{1:|z|-1} \in P_{t-1}$. We can also link a pattern $z \in P_t$ to its longest proper suffix $s(z, P_t)$ by storing the pattern sets in tries, using the reversed label sequences of the patterns as keys. We can sort the patterns in an order such that $z^1$ appears before $z^2$ when $z^1 \prec_s z^2$. We call this order ascending order, and its reverse descending order. Since the sorting order of the patterns, their prefix patterns and their longest proper suffixes are invariable over iterations, we only have to execute this procedure once and store the result.

We execute Algorithm 2 for each iteration to calculate the expectations of the feature functions. Assume that all $w(z, t)$ and the other numeric variables are initialized to $0$. The complexity of this procedure is

    $O\Bigl(\sum_{t=1}^{T+1} \sum_{z_p \in P_t} |F_{z_p,t}| + \sum_{t=1}^{T+1} |P_t|\Bigr)$   (42)

for an iteration on a sequence. The $\gamma$'s and $\delta$'s are temporary variables, and old ones can be discarded.

1.3 Inference

We will describe the procedure to infer the labels using the estimated parameters. First, the label sequence which we want to infer can be described as follows:

    $y^* = \arg\max_{y} \frac{1}{Z(x)} \exp\Bigl(\sum_{t} \sum_{i} \lambda_i f_i(x, y, t)\Bigr)$
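Algorithm 1 can be sketched directly in Python. This is a hedged approximation, not the paper's implementation: `features` is assumed to be a list of (observation-test, label-sequence) pairs, and the pattern sets and feature buckets are plain dictionaries rather than tries:

```python
from collections import defaultdict

def enumerate_patterns(x, T, features):
    """Algorithm 1: build P[t] (pattern sets) and F[(z, t)] (feature buckets)."""
    P = defaultdict(set)   # P[t]: patterns at position t
    F = defaultdict(list)  # F[(z, t)]: features firing for pattern z at t
    for t in range(1, T + 2):
        for b, z in features:
            if b(x, t):
                F[(z, t)].append((b, z))
                # every prefix of z is a pattern at the matching earlier position
                for j in range(len(z)):
                    P[t - j].add(z[: len(z) - j])
    return P, F
```

In a full implementation the pattern sets would additionally be stored in tries keyed by reversed label sequences, so that prefix-pattern and longest-proper-suffix links can be followed in constant time.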

Algorithm 2 Sum-Difference
  for t = 1 to T+1 do
    for all z ≠ ϵ ∈ P_t in ascending order do
      for all f_i ∈ F_{z,t} do
        w(z, t) ← w(z, t) + λ_i
      W(z, t) ← W(s(z, P_t), t) + w(z, t)
  γ(ϵ, 0) ← 1, γ(l_BOS, 0) ← 1
  for t = 1 to T+1 do
    for all z ≠ ϵ ∈ P_t in descending order do
      α(s(z, P_t), t) ← α(s(z, P_t), t) − γ(z_{1:|z|−1}, t−1)
      α(z, t) ← (α(z, t) + γ(z_{1:|z|−1}, t−1)) exp(W(z, t))
    α(ϵ, t) ← 0
    for all z ≠ ϵ ∈ P_t in descending order do
      γ(z, t) ← γ(z, t) + α(z, t)
      γ(s(z, P_t), t) ← γ(s(z, P_t), t) + γ(z, t)
  Z(x) ← γ(ϵ, T+1)
  δ(ϵ, T+1) ← 1
  for t = T+1 downto 1 do
    β(ϵ, t) ← δ(ϵ, t)
    for all z ≠ ϵ ∈ P_t in ascending order do
      β(z, t) ← β(s(z, P_t), t) + δ(z, t)
    for all z ≠ ϵ ∈ P_t do
      δ(z_{1:|z|−1}, t−1) ← δ(z_{1:|z|−1}, t−1) + β(z, t) exp(W(z, t)) − β(s(z, P_t), t) exp(W(s(z, P_t), t))
  for t = 1 to T+1 do
    for all z ≠ ϵ ∈ P_t in descending order do
      θ(z, t) ← α(z, t) β(z, t)
      σ(z, t) ← σ(z, t) + θ(z, t)
      σ(s(z, P_t), t) ← σ(s(z, P_t), t) + σ(z, t)
      for all f_i ∈ F_{z,t} do
        E[f_i] ← E[f_i] + σ(z, t)/Z(x)

    $= \arg\max_{y} \sum_{t} \sum_{i} \lambda_i f_i(x, y, t)$   (43)

The algorithm we present here has the complexity of

    $O\Bigl(\sum_{t=1}^{T+1} \sum_{i:\, b_i(x, t) = 1} 1 + L \sum_{t=1}^{T+1} \sum_{z_p \in P_t} 1\Bigr)$,

which is $O(L)$ times bigger than that of training for one iteration.

1.3.1 Inference Algorithm

We have to calculate the $W$'s for each pattern beforehand. This procedure, Algorithm 3, is the same as the one that we used in the Sum-Difference algorithm.

Algorithm 3 Calculate weights
  for t = 1 to T+1 do
    for all z ≠ ϵ ∈ P_t in ascending order do
      for all f_i ∈ F_{z,t} do
        w(z, t) ← w(z, t) + λ_i
      W(z, t) ← W(s(z, P_t), t) + w(z, t)

We define $p(z_p, t)$ and $q(z_p, t)$ for a pattern $z_p \in P_t$ as follows:

    $p(z_p, t) = \max_{\substack{z:\, z \in P_{t-1},\, z_{p,1:|z_p|-1} \preceq_s z, \\ \forall z' \in P_t:\, z_p \prec_s z' \implies z'_{1:|z'|-1} \not\preceq_s z}} p(z, t-1)\, \exp\bigl(W(z_p, t)\bigr)$   (44)

    $q(z_p, t) = \arg\max_{\substack{z:\, z \in P_{t-1},\, z_{p,1:|z_p|-1} \preceq_s z, \\ \forall z' \in P_t:\, z_p \prec_s z' \implies z'_{1:|z'|-1} \not\preceq_s z}} p(z, t-1)$   (45)

When we define $p$ and $q$ like this, we can calculate them using Algorithm 4. After that, we can obtain the optimum label sequence $y^*$ using Algorithm 5.
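The Viterbi-style step behind (44)-(45) is a maximization over compatible predecessor patterns with a back-pointer. A hedged sketch, with the side conditions of (44) abstracted into a hypothetical `compatible(z, z_prev)` predicate rather than spelled out:

```python
import math

def viterbi_step(z, t, P_prev, p, W, compatible):
    """One step of eq. (44)-(45): the best predecessor score times exp(W(z, t)),
    and the argmax predecessor pattern (the back-pointer q)."""
    best, best_prev = -math.inf, None
    for z_prev in P_prev:
        if compatible(z, z_prev) and p[(z_prev, t - 1)] > best:
            best, best_prev = p[(z_prev, t - 1)], z_prev
    return best * math.exp(W[(z, t)]), best_prev
```

Algorithm 4 avoids evaluating this maximum independently for every pattern by propagating running maxima ($r$, $u$) along the suffix links of $P_{t-1}$, which is where the stated complexity bound comes from.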

Algorithm 4 Inference 1
  p(ϵ, 0) ← 1, p(l_BOS, 0) ← 1
  for t = 1 to T+1 do
    for all z ≠ ϵ ∈ P_t in descending order do
      if this is the first iteration or the last label in z has changed then
        z′ ← the first z″ ∈ P_{t−1} in descending order
        for all z″ ∈ P_{t−1} do
          r(z″, t−1) ← p(z″, t−1)
          u(z″, t−1) ← z″
      end if
      while z′ ≠ z_{1:|z|−1} do
        if r(z′, t−1) > r(s(z′, P_{t−1}), t−1) then
          r(s(z′, P_{t−1}), t−1) ← r(z′, t−1)
          u(s(z′, P_{t−1}), t−1) ← u(z′, t−1)
        end if
        z′ ← next z″ ∈ P_{t−1} in descending order
      end while
      p(z, t) ← r(z_{1:|z|−1}, t−1) exp(W(z, t))
      q(z, t) ← u(z_{1:|z|−1}, t−1)

Algorithm 5 Inference 2
  z ← arg max_{z ∈ P_{T+1}} p(z, T+1)
  for t = T+1 downto 1 do
    y_t ← z_{|z|}
    z ← q(z, t)
  y_0 ← l_BOS
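The backtrace of Algorithm 5 can be sketched in Python (a hedged illustration: `p` and `q` are assumed to have been filled in by the forward pass of Algorithm 4, keyed by (pattern, position) tuples):

```python
def backtrace(p, q, P_final, T, l_bos):
    """Algorithm 5: follow back-pointers q from the best final pattern to
    recover the best label sequence y_0 .. y_{T+1}."""
    z = max(P_final, key=lambda z: p[(z, T + 1)])
    y = {}
    for t in range(T + 1, 0, -1):
        y[t] = z[-1]      # the last label of the current pattern is y_t
        z = q[(z, t)]     # move to the best predecessor pattern
    y[0] = l_bos
    return [y[t] for t in range(T + 2)]
```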


More information

Rational Filter Wavelets*

Rational Filter Wavelets* Ž Journal of Matheatical Analysis and Applications 39, 744 1999 Article ID jaa19996550, available online at http:wwwidealibraryco on Rational Filter Wavelets* Kuang Zheng and Cui Minggen Departent of Matheatics,

More information

Donald Fussell. October 28, Computer Science Department The University of Texas at Austin. Point Masses and Force Fields.

Donald Fussell. October 28, Computer Science Department The University of Texas at Austin. Point Masses and Force Fields. s Vector Moving s and Coputer Science Departent The University of Texas at Austin October 28, 2014 s Vector Moving s Siple classical dynaics - point asses oved by forces Point asses can odel particles

More information

A1. Find all ordered pairs (a, b) of positive integers for which 1 a + 1 b = 3

A1. Find all ordered pairs (a, b) of positive integers for which 1 a + 1 b = 3 A. Find all ordered pairs a, b) of positive integers for which a + b = 3 08. Answer. The six ordered pairs are 009, 08), 08, 009), 009 337, 674) = 35043, 674), 009 346, 673) = 3584, 673), 674, 009 337)

More information

Detection and Estimation Theory

Detection and Estimation Theory ESE 54 Detection and Estiation Theory Joseph A. O Sullivan Sauel C. Sachs Professor Electronic Systes and Signals Research Laboratory Electrical and Systes Engineering Washington University 11 Urbauer

More information

Probability Distributions

Probability Distributions Probability Distributions In Chapter, we ephasized the central role played by probability theory in the solution of pattern recognition probles. We turn now to an exploration of soe particular exaples

More information

Chaotic Coupled Map Lattices

Chaotic Coupled Map Lattices Chaotic Coupled Map Lattices Author: Dustin Keys Advisors: Dr. Robert Indik, Dr. Kevin Lin 1 Introduction When a syste of chaotic aps is coupled in a way that allows the to share inforation about each

More information

Computational and Statistical Learning Theory

Computational and Statistical Learning Theory Coputational and Statistical Learning Theory Proble sets 5 and 6 Due: Noveber th Please send your solutions to learning-subissions@ttic.edu Notations/Definitions Recall the definition of saple based Radeacher

More information

Ufuk Demirci* and Feza Kerestecioglu**

Ufuk Demirci* and Feza Kerestecioglu** 1 INDIRECT ADAPTIVE CONTROL OF MISSILES Ufuk Deirci* and Feza Kerestecioglu** *Turkish Navy Guided Missile Test Station, Beykoz, Istanbul, TURKEY **Departent of Electrical and Electronics Engineering,

More information

E0 370 Statistical Learning Theory Lecture 6 (Aug 30, 2011) Margin Analysis

E0 370 Statistical Learning Theory Lecture 6 (Aug 30, 2011) Margin Analysis E0 370 tatistical Learning Theory Lecture 6 (Aug 30, 20) Margin Analysis Lecturer: hivani Agarwal cribe: Narasihan R Introduction In the last few lectures we have seen how to obtain high confidence bounds

More information

Supporting information for Self-assembly of multicomponent structures in and out of equilibrium

Supporting information for Self-assembly of multicomponent structures in and out of equilibrium Supporting inforation for Self-assebly of ulticoponent structures in and out of equilibriu Stephen Whitela 1, Rebecca Schulan 2, Lester Hedges 1 1 Molecular Foundry, Lawrence Berkeley National Laboratory,

More information

COS 424: Interacting with Data. Written Exercises

COS 424: Interacting with Data. Written Exercises COS 424: Interacting with Data Hoework #4 Spring 2007 Regression Due: Wednesday, April 18 Written Exercises See the course website for iportant inforation about collaboration and late policies, as well

More information

Estimating Parameters for a Gaussian pdf

Estimating Parameters for a Gaussian pdf Pattern Recognition and achine Learning Jaes L. Crowley ENSIAG 3 IS First Seester 00/0 Lesson 5 7 Noveber 00 Contents Estiating Paraeters for a Gaussian pdf Notation... The Pattern Recognition Proble...3

More information

Feedforward Networks

Feedforward Networks Feedforward Neural Networks - Backpropagation Feedforward Networks Gradient Descent Learning and Backpropagation CPSC 533 Fall 2003 Christian Jacob Dept.of Coputer Science,University of Calgary Feedforward

More information

CSE525: Randomized Algorithms and Probabilistic Analysis May 16, Lecture 13

CSE525: Randomized Algorithms and Probabilistic Analysis May 16, Lecture 13 CSE55: Randoied Algoriths and obabilistic Analysis May 6, Lecture Lecturer: Anna Karlin Scribe: Noah Siegel, Jonathan Shi Rando walks and Markov chains This lecture discusses Markov chains, which capture

More information

Supplementary Material for Fast and Provable Algorithms for Spectrally Sparse Signal Reconstruction via Low-Rank Hankel Matrix Completion

Supplementary Material for Fast and Provable Algorithms for Spectrally Sparse Signal Reconstruction via Low-Rank Hankel Matrix Completion Suppleentary Material for Fast and Provable Algoriths for Spectrally Sparse Signal Reconstruction via Low-Ran Hanel Matrix Copletion Jian-Feng Cai Tianing Wang Ke Wei March 1, 017 Abstract We establish

More information

FPTAS for optimizing polynomials over the mixed-integer points of polytopes in fixed dimension

FPTAS for optimizing polynomials over the mixed-integer points of polytopes in fixed dimension Matheatical Prograing anuscript No. (will be inserted by the editor) Jesús A. De Loera Rayond Heecke Matthias Köppe Robert Weisantel FPTAS for optiizing polynoials over the ixed-integer points of polytopes

More information

RELIABILITY AND CAPABILITY MODELING OF TECHNOLOGICAL SYSTEMS WITH BUFFER STORAGE

RELIABILITY AND CAPABILITY MODELING OF TECHNOLOGICAL SYSTEMS WITH BUFFER STORAGE RT&A # 0 (7 (Vol. 00, June RELIABILITY AND CAPABILITY MODELING OF TECHNOLOGICAL SYSTEMS WITH BUFFER STORAGE Aren S. Stepanyants, Valentina S. Victorova Institute of Control Science, Russian Acadey of Sciences

More information

Course 2BA1: Hilary Term 2007 Section 8: Quaternions and Rotations

Course 2BA1: Hilary Term 2007 Section 8: Quaternions and Rotations Course BA1: Hilary Term 007 Section 8: Quaternions and Rotations David R. Wilkins Copyright c David R. Wilkins 005 Contents 8 Quaternions and Rotations 1 8.1 Quaternions............................ 1 8.

More information

Chapter 16. Structured Probabilistic Models for Deep Learning

Chapter 16. Structured Probabilistic Models for Deep Learning Peng et al.: Deep Learning and Practice 1 Chapter 16 Structured Probabilistic Models for Deep Learning Peng et al.: Deep Learning and Practice 2 Structured Probabilistic Models way of using graphs to describe

More information

. The univariate situation. It is well-known for a long tie that denoinators of Pade approxiants can be considered as orthogonal polynoials with respe

. The univariate situation. It is well-known for a long tie that denoinators of Pade approxiants can be considered as orthogonal polynoials with respe PROPERTIES OF MULTIVARIATE HOMOGENEOUS ORTHOGONAL POLYNOMIALS Brahi Benouahane y Annie Cuyt? Keywords Abstract It is well-known that the denoinators of Pade approxiants can be considered as orthogonal

More information

Nonmonotonic Networks. a. IRST, I Povo (Trento) Italy, b. Univ. of Trento, Physics Dept., I Povo (Trento) Italy

Nonmonotonic Networks. a. IRST, I Povo (Trento) Italy, b. Univ. of Trento, Physics Dept., I Povo (Trento) Italy Storage Capacity and Dynaics of Nononotonic Networks Bruno Crespi a and Ignazio Lazzizzera b a. IRST, I-38050 Povo (Trento) Italy, b. Univ. of Trento, Physics Dept., I-38050 Povo (Trento) Italy INFN Gruppo

More information

Quantum Chemistry Exam 2 Take-home Solutions

Quantum Chemistry Exam 2 Take-home Solutions Cheistry 60 Fall 07 Dr Jean M Standard Nae KEY Quantu Cheistry Exa Take-hoe Solutions 5) (0 points) In this proble, the nonlinear variation ethod will be used to deterine an approxiate solution for the

More information

11.3 Decoding Algorithm

11.3 Decoding Algorithm 11.3 Decoding Algorithm 393 For convenience, we have introduced π 0 and π n+1 as the fictitious initial and terminal states begin and end. This model defines the probability P(x π) for a given sequence

More information

Fixed-to-Variable Length Distribution Matching

Fixed-to-Variable Length Distribution Matching Fixed-to-Variable Length Distribution Matching Rana Ali Ajad and Georg Böcherer Institute for Counications Engineering Technische Universität München, Gerany Eail: raa2463@gail.co,georg.boecherer@tu.de

More information

Intelligent Systems: Reasoning and Recognition. Artificial Neural Networks

Intelligent Systems: Reasoning and Recognition. Artificial Neural Networks Intelligent Systes: Reasoning and Recognition Jaes L. Crowley MOSIG M1 Winter Seester 2018 Lesson 7 1 March 2018 Outline Artificial Neural Networks Notation...2 Introduction...3 Key Equations... 3 Artificial

More information

arxiv: v1 [math.nt] 14 Sep 2014

arxiv: v1 [math.nt] 14 Sep 2014 ROTATION REMAINDERS P. JAMESON GRABER, WASHINGTON AND LEE UNIVERSITY 08 arxiv:1409.411v1 [ath.nt] 14 Sep 014 Abstract. We study properties of an array of nubers, called the triangle, in which each row

More information

Expectation Maximization (EM)

Expectation Maximization (EM) Expectation Maximization (EM) The Expectation Maximization (EM) algorithm is one approach to unsupervised, semi-supervised, or lightly supervised learning. In this kind of learning either no labels are

More information

A Simplified Analytical Approach for Efficiency Evaluation of the Weaving Machines with Automatic Filling Repair

A Simplified Analytical Approach for Efficiency Evaluation of the Weaving Machines with Automatic Filling Repair Proceedings of the 6th SEAS International Conference on Siulation, Modelling and Optiization, Lisbon, Portugal, Septeber -4, 006 0 A Siplified Analytical Approach for Efficiency Evaluation of the eaving

More information

A note on the multiplication of sparse matrices

A note on the multiplication of sparse matrices Cent. Eur. J. Cop. Sci. 41) 2014 1-11 DOI: 10.2478/s13537-014-0201-x Central European Journal of Coputer Science A note on the ultiplication of sparse atrices Research Article Keivan Borna 12, Sohrab Aboozarkhani

More information

Analysis of Polynomial & Rational Functions ( summary )

Analysis of Polynomial & Rational Functions ( summary ) Analysis of Polynoial & Rational Functions ( suary ) The standard for of a polynoial function is ( ) where each of the nubers are called the coefficients. The polynoial of is said to have degree n, where

More information

CONTROL SYSTEMS, ROBOTICS AND AUTOMATION Vol. XV - Modeling of Discrete Event Systems - Stéphane Lafortune

CONTROL SYSTEMS, ROBOTICS AND AUTOMATION Vol. XV - Modeling of Discrete Event Systems - Stéphane Lafortune CONTROL SYSTEMS, ROBOTICS AND AUTOMATION Vol. XV - Modeling of Discrete Event Systes - Stéphane Lafortune MODELING OF DISCRETE EVENT SYSTEMS Stéphane Lafortune The University of Michigan, USA Keywords:

More information

P3.C8.COMPLEX NUMBERS

P3.C8.COMPLEX NUMBERS Recall: Within the real number system, we can solve equation of the form and b 2 4ac 0. ax 2 + bx + c =0, where a, b, c R What is R? They are real numbers on the number line e.g: 2, 4, π, 3.167, 2 3 Therefore,

More information

The Distribution of the Covariance Matrix for a Subset of Elliptical Distributions with Extension to Two Kurtosis Parameters

The Distribution of the Covariance Matrix for a Subset of Elliptical Distributions with Extension to Two Kurtosis Parameters journal of ultivariate analysis 58, 96106 (1996) article no. 0041 The Distribution of the Covariance Matrix for a Subset of Elliptical Distributions with Extension to Two Kurtosis Paraeters H. S. Steyn

More information

Feedforward Networks

Feedforward Networks Feedforward Networks Gradient Descent Learning and Backpropagation Christian Jacob CPSC 433 Christian Jacob Dept.of Coputer Science,University of Calgary CPSC 433 - Feedforward Networks 2 Adaptive "Prograing"

More information

Research Article Robust ε-support Vector Regression

Research Article Robust ε-support Vector Regression Matheatical Probles in Engineering, Article ID 373571, 5 pages http://dx.doi.org/10.1155/2014/373571 Research Article Robust ε-support Vector Regression Yuan Lv and Zhong Gan School of Mechanical Engineering,

More information

The Hilbert Schmidt version of the commutator theorem for zero trace matrices

The Hilbert Schmidt version of the commutator theorem for zero trace matrices The Hilbert Schidt version of the coutator theore for zero trace atrices Oer Angel Gideon Schechtan March 205 Abstract Let A be a coplex atrix with zero trace. Then there are atrices B and C such that

More information

Defect-Aware SOC Test Scheduling

Defect-Aware SOC Test Scheduling Defect-Aware SOC Test Scheduling Erik Larsson +, Julien Pouget*, and Zebo Peng + Ebedded Systes Laboratory + LIRMM* Departent of Coputer Science Montpellier 2 University Linköpings universitet CNRS Sweden

More information

On Hyper-Parameter Estimation in Empirical Bayes: A Revisit of the MacKay Algorithm

On Hyper-Parameter Estimation in Empirical Bayes: A Revisit of the MacKay Algorithm On Hyper-Paraeter Estiation in Epirical Bayes: A Revisit of the MacKay Algorith Chune Li 1, Yongyi Mao, Richong Zhang 1 and Jinpeng Huai 1 1 School of Coputer Science and Engineering, Beihang University,

More information

Support Vector Machine Classification of Uncertain and Imbalanced data using Robust Optimization

Support Vector Machine Classification of Uncertain and Imbalanced data using Robust Optimization Recent Researches in Coputer Science Support Vector Machine Classification of Uncertain and Ibalanced data using Robust Optiization RAGHAV PAT, THEODORE B. TRAFALIS, KASH BARKER School of Industrial Engineering

More information

Celal S. Konor Release 1.1 (identical to 1.0) 3/21/08. 1-Hybrid isentropic-sigma vertical coordinate and governing equations in the free atmosphere

Celal S. Konor Release 1.1 (identical to 1.0) 3/21/08. 1-Hybrid isentropic-sigma vertical coordinate and governing equations in the free atmosphere Celal S. Konor Release. (identical to.0) 3/2/08 -Hybrid isentropic-siga vertical coordinate governing equations in the free atosphere This section describes the equations in the free atosphere of the odel.

More information

Solutions of some selected problems of Homework 4

Solutions of some selected problems of Homework 4 Solutions of soe selected probles of Hoework 4 Sangchul Lee May 7, 2018 Proble 1 Let there be light A professor has two light bulbs in his garage. When both are burned out, they are replaced, and the next

More information

are equal to zero, where, q = p 1. For each gene j, the pairwise null and alternative hypotheses are,

are equal to zero, where, q = p 1. For each gene j, the pairwise null and alternative hypotheses are, Page of 8 Suppleentary Materials: A ultiple testing procedure for ulti-diensional pairwise coparisons with application to gene expression studies Anjana Grandhi, Wenge Guo, Shyaal D. Peddada S Notations

More information

Physically Based Modeling CS Notes Spring 1997 Particle Collision and Contact

Physically Based Modeling CS Notes Spring 1997 Particle Collision and Contact Physically Based Modeling CS 15-863 Notes Spring 1997 Particle Collision and Contact 1 Collisions with Springs Suppose we wanted to ipleent a particle siulator with a floor : a solid horizontal plane which

More information

Block designs and statistics

Block designs and statistics Bloc designs and statistics Notes for Math 447 May 3, 2011 The ain paraeters of a bloc design are nuber of varieties v, bloc size, nuber of blocs b. A design is built on a set of v eleents. Each eleent

More information