Learning to Jointly Translate and Predict Dropped Pronouns with a Shared Reconstruction Mechanism


Learning to Jointly Translate and Predict Dropped Pronouns with a Shared Reconstruction Mechanism

Longyue Wang (Tencent AI Lab) vinnylywang@tencent.com
Zhaopeng Tu (Tencent AI Lab) zptu@tencent.com
Andy Way (Dublin City University) andy.way@adaptcentre.ie
Qun Liu (Huawei Noah's Ark Lab) qun.liu@huawei.com

Zhaopeng Tu is the corresponding author of the paper. This work was conducted when Longyue Wang was studying and Qun Liu was working at the ADAPT Centre in the School of Computing at Dublin City University.

Abstract

Pronouns are frequently omitted in pro-drop languages, such as Chinese, generally leading to significant challenges with respect to the production of complete translations. Recently, Wang et al. (2018) proposed a novel reconstruction-based approach to alleviating dropped pronoun (DP) translation problems for neural machine translation models. In this work, we improve the original model from two perspectives. First, we employ a shared reconstructor to better exploit encoder and decoder representations. Second, we jointly learn to translate and predict DPs in an end-to-end manner, to avoid the errors propagated from an external DP prediction model. Experimental results show that our approach significantly improves both translation performance and DP prediction accuracy.

1 Introduction

Pronouns are important in natural languages as they imply rich discourse information. However, in pro-drop languages such as Chinese and Japanese, pronouns are frequently omitted when their referents can be pragmatically inferred from the context. When translating sentences from a pro-drop language into a non-pro-drop language (e.g. Chinese-to-English), translation models generally fail to translate invisible dropped pronouns (DPs). This phenomenon leads to various translation problems in terms of completeness, syntax and even semantics of translations. A number of approaches have been investigated for DP translation (Le Nagard and Koehn, 2010; Xiang et al., 2013; Wang et al., 2016, 2018). Wang et al. (2018) is a pioneering work to model DP translation for neural machine translation (NMT) models.
They employ two separate reconstructors (Tu et al., 2017) to respectively reconstruct the encoder and decoder representations back to the DP-annotated source sentence. The annotation of DPs is provided by an external prediction model, which is trained on the parallel corpus using automatically learned alignment information (Wang et al., 2016). Although this model achieved significant improvements, there nonetheless exist two drawbacks: 1) there is no interaction between the two separate reconstructors, which misses the opportunity to exploit useful relations between encoder and decoder representations; and 2) the external DP prediction model only has an accuracy of 66% in F1-score, which propagates numerous errors to the translation model.

In this work, we propose to improve the original model from two perspectives. First, we use a shared reconstructor to read hidden states from both the encoder and decoder. Second, we integrate a DP predictor into NMT to jointly learn to translate and predict DPs. Incorporating these as two auxiliary loss terms can guide both the encoder and decoder states to learn critical information relevant to DPs. Experimental results on a large-scale Chinese-English subtitle corpus show that the two modifications accumulatively improve translation performance, and the best result is +1.5 BLEU points better than that reported by Wang et al. (2018). In addition, the jointly learned DP prediction model significantly outperforms its external counterpart by 9% in F1-score.

2 Background

As shown in Figure 1, Wang et al. (2018) introduced two independent reconstructors with their own parameters, which reconstruct the DP-annotated source sentence from the encoder and decoder hidden states, respectively.

Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 2997-3002, Brussels, Belgium, October 31 - November 4, 2018. © 2018 Association for Computational Linguistics.

The central

idea underpinning their approach is to guide the corresponding hidden states to embed the recalled source-side DP information, and subsequently to help the NMT model generate the missing pronouns with these enhanced hidden representations.

Figure 1: Architecture of separate reconstructors.

Prediction  | F1-score | Example
DP Position | 88%      | 你烤的 #DP# 吗?
DP Words    | 66%      | 你烤的它吗?

Table 1: Evaluation of external models on predicting the positions of DPs ("DP Position") and the exact words of DPs ("DP Words").

The DPs can be automatically annotated for training and test data using two different strategies (Wang et al., 2016). In the training phase, where the target sentence is available, we annotate DPs for the source sentence using alignment information. These annotated source sentences can be used to build a neural-based DP predictor, which can in turn be used to annotate test sentences, since the target sentence is not available during the testing phase. As shown in Table 1, Wang et al. (2016, 2018) explored predicting the exact DP words¹, the accuracy of which is only 66% in F1-score. By analyzing the translation outputs, we found that 16.2% of errors are newly introduced and caused by errors from the DP predictor. Fortunately, the accuracy of predicting DP positions (DPPs) is much higher, which provides the chance to alleviate the error propagation problem. Intuitively, we can learn to generate DPs at the predicted positions using a jointly trained DP predictor, which is fed with informative representations in the reconstructor.

¹ Unless otherwise indicated, in this paper, the terms DP and DP word are identical.

3 Approach

3.1 Shared Reconstructor

Recent work shows that NMT models can benefit from sharing a component across different tasks and languages. Taking multi-language translation as an example, Firat et al. (2016) share an attention model across languages, while Dong et al. (2015) share an encoder.
Our work is most similar to that of Zoph and Knight (2016) and Anastasopoulos and Chiang (2018), which share a decoder and two separate attention models to read from two different sources. In contrast, we share information at the level of reconstructed frames. The architecture of our proposed shared reconstruction model is shown in Figure 2(a).

Formally, the reconstructor reads from both the encoder and decoder hidden states, as well as the DP-annotated source sentence, and outputs a reconstruction score. It uses two separate attention models to reconstruct the annotated source sentence x̂ = {x̂_1, x̂_2, ..., x̂_T} word by word, and the reconstruction score is computed by

    R(x̂ | h^enc, h^dec) = ∏_{t=1}^{T} g_r(x̂_{t-1}, h_t^rec, ĉ_t^enc, ĉ_t^dec)

where h_t^rec is the hidden state in the reconstructor, computed by Equation (1):

    h_t^rec = f_r(x̂_{t-1}, h_{t-1}^rec, ĉ_t^enc, ĉ_t^dec)    (1)

Here g_r(·) and f_r(·) are respectively softmax and activation functions for the reconstructor. The context vectors ĉ_t^enc and ĉ_t^dec are the weighted sums of h^enc and h^dec, respectively, as in Equations (2) and (3):

    ĉ_t^enc = ∑_{j=1}^{J} α̂_{t,j}^enc h_j^enc    (2)
    ĉ_t^dec = ∑_{i=1}^{I} α̂_{t,i}^dec h_i^dec    (3)

Note that the weights α̂^enc and α̂^dec are calculated by two separate attention models. We propose two attention strategies, which differ as to whether the two attention models interact or not.

Independent Attention calculates the two weight matrices independently, as in Equations (4) and (5):

    α̂_t^enc = ATT_enc(x̂_{t-1}, h_{t-1}^rec, h^enc)    (4)
    α̂_t^dec = ATT_dec(x̂_{t-1}, h_{t-1}^rec, h^dec)    (5)

where ATT_enc(·) and ATT_dec(·) are two separate attention models with their own parameters.
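The shared reconstructor of Equations (1)-(5) can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' released implementation: the helper names (`attend`, `reconstruction_step`), the tanh stand-in for f_r, the dot-product attention form, and all toy dimensions are assumptions made for the sketch. Two separate attention models produce the contexts ĉ_t^enc and ĉ_t^dec, which feed a single reconstructor state used to score each word of the DP-annotated source sentence.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(query, states, W):
    # simple bilinear attention stand-in: score each state against the query
    scores = states @ (W @ query)            # (num_states,)
    alpha = softmax(scores)                  # attention weights, sum to 1
    context = alpha @ states                 # weighted sum of the states
    return alpha, context

def reconstruction_step(x_prev, h_prev, h_enc, h_dec, params):
    # two SEPARATE attention models, one over encoder and one over decoder states
    _, c_enc = attend(h_prev, h_enc, params["W_enc"])
    _, c_dec = attend(h_prev, h_dec, params["W_dec"])
    # f_r: update the shared reconstructor hidden state (tanh stand-in for a GRU)
    h_rec = np.tanh(params["U"] @ np.concatenate([x_prev, h_prev, c_enc, c_dec]))
    # g_r: softmax over the source vocabulary
    probs = softmax(params["V"] @ h_rec)
    return h_rec, probs

d, V = 4, 6                                  # toy hidden size and vocabulary size
params = {
    "W_enc": rng.normal(size=(d, d)),
    "W_dec": rng.normal(size=(d, d)),
    "U": rng.normal(size=(d, 4 * d)),
    "V": rng.normal(size=(V, d)),
}
h_enc = rng.normal(size=(5, d))              # J = 5 encoder hidden states
h_dec = rng.normal(size=(3, d))              # I = 3 decoder hidden states
emb = rng.normal(size=(V, d))                # toy source-word embeddings
x_hat = [1, 4, 2]                            # DP-annotated source word ids

# log R(x_hat | h_enc, h_dec): sum of log-probs of reconstructing each word
h_rec, x_prev = np.zeros(d), np.zeros(d)
log_R = 0.0
for w in x_hat:
    h_rec, probs = reconstruction_step(x_prev, h_rec, h_enc, h_dec, params)
    log_R += np.log(probs[w])
    x_prev = emb[w]
print(log_R)
```

With trained parameters, a higher log_R means the encoder and decoder states jointly retain more of the information needed to recover the DP-annotated source sentence; here the parameters are random, so only the mechanics are shown.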

Figure 2: Model architectures, in which the words in red are automatically annotated DPs and DPPs. (a) Shared reconstructor. (b) Shared reconstructor with joint prediction.

Interactive Attention feeds the context vector produced by one attention model to the other attention model. The intuition behind this is that the interaction between the two attention models can lead to a better exploitation of the encoder and decoder representations. As the interactive attention is directional, we have two options (Equations (6) and (7)), which modify either ATT_enc(·) or ATT_dec(·) while leaving the other one unchanged:

    enc→dec:  α̂_t^dec = ATT_dec(x̂_{t-1}, h_{t-1}^rec, h^dec, ĉ_t^enc)    (6)
    dec→enc:  α̂_t^enc = ATT_enc(x̂_{t-1}, h_{t-1}^rec, h^enc, ĉ_t^dec)    (7)

3.2 Joint Prediction of Dropped Pronouns

Inspired by recent successes of multi-task learning (Dong et al., 2015; Luong et al., 2016), we propose to jointly learn to translate and predict DPs (as shown in Figure 2(b)). To ease the learning difficulty, we leverage the information of DPPs predicted by an external model, which achieves an accuracy of 88% in F1-score. Accordingly, we transform the original DP prediction problem into DP word generation given the pre-predicted DP positions. Since the DPP-annotated source sentence serves as the reconstructed input, we introduce an additional DP-generation loss, which measures how well the DP is generated from the corresponding hidden state in the reconstructor. Let dp = {dp_1, dp_2, ..., dp_D} be the list of DPs in the annotated source sentence, and h^rec = {h_1^rec, h_2^rec, ..., h_D^rec} be the corresponding hidden states in the reconstructor. The generation probability is computed by

    P(dp | h^rec) = ∏_{d=1}^{D} P(dp_d | h_d^rec) = ∏_{d=1}^{D} g_p(dp_d | h_d^rec)    (8)

where g_p(·) is a softmax for the DP predictor.
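Equation (8) can be illustrated with a similarly minimal sketch. The function name, toy sizes, and the single projection matrix standing in for the DP predictor are all assumptions for illustration, not the released code: at each pre-predicted DP position, g_p applies a softmax over a small pronoun vocabulary to the corresponding reconstructor hidden state, and the auxiliary loss is the summed negative log-probability of the gold DP words.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dp_prediction_loss(h_rec, dp_positions, dp_words, W_p):
    """-log P(dp | h_rec), summed over the D annotated DP positions.

    h_rec:        (T, d) reconstructor hidden states over the source sentence
    dp_positions: indices of the #DP# placeholders (from the DPP predictor)
    dp_words:     gold DP word ids at those positions (from alignment)
    W_p:          (V_dp, d) projection onto the small pronoun vocabulary
    """
    loss = 0.0
    for pos, word in zip(dp_positions, dp_words):
        probs = softmax(W_p @ h_rec[pos])    # g_p: softmax over DP words
        loss -= np.log(probs[word])
    return loss

T, d, V_dp = 7, 4, 5                         # toy sizes; V_dp = pronoun inventory
h_rec = rng.normal(size=(T, d))
W_p = rng.normal(size=(V_dp, d))
loss = dp_prediction_loss(h_rec, dp_positions=[2, 5], dp_words=[0, 3], W_p=W_p)
print(loss)
```

In training, this term is simply added to the likelihood and reconstruction objectives of Equation (9); at test time, taking the argmax of the softmax at each predicted position yields the generated DP word.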
3.3 Training and Testing

We train both the encoder-decoder and the shared reconstructor together in a single end-to-end process, and the training objective is

    J(θ, γ, ψ) = argmax_{θ,γ,ψ} {  log L(y | x; θ)                  [likelihood]
                                 + log R(x̂ | h^enc, h^dec; θ, γ)    [reconstruction]
                                 + log P(dp | ĥ^rec; θ, γ, ψ)  }    [prediction]    (9)

where {θ, γ, ψ} are respectively the parameters associated with the encoder-decoder, the shared reconstructor, and the DP prediction model. The auxiliary reconstruction objective R(·) guides the related part of the parameter matrix θ to learn better latent representations, which are used to reconstruct the DPP-annotated source sentence. The auxiliary prediction loss P(·) guides the related parts of both the encoder-decoder and the reconstructor to learn better latent representations, which are used to predict the DPs in the source sentence. Following Tu et al. (2017) and Wang et al. (2018), we use the reconstruction score

as a reranking technique to select the best translation candidate from the generated n-best list at testing time. Different from Wang et al. (2018), we reconstruct the DPP-annotated source sentence, which is predicted by an external model.

#  | Model                                             | #Params | Speed | BLEU
Existing system (Wang et al., 2018)
1  | Baseline                                          | 86.7M   | 1.60K |
2  | Baseline (+DPs)                                   | 86.7M   | 1.59K |
3  | Separate-Recs (+DPs)                              | +73.8M  | 0.57K |
Our system
4  | Baseline (+DPPs)                                  | 86.7M   | 1.54K |
5  | Shared-Rec independent (+DPPs)                    | +86.6M  | 0.52K |
6  | Shared-Rec independent (+DPPs) + joint prediction | +87.9M  | 0.51K |
7  | Shared-Rec enc→dec (+DPPs) + joint prediction     | +91.9M  | 0.48K |
8  | Shared-Rec dec→enc (+DPPs) + joint prediction     | +89.9M  | 0.49K |

Table 2: Evaluation of translation performance for Chinese-English. "Baseline" is trained and evaluated on the original data, while "Baseline (+DPs)" and "Baseline (+DPPs)" are trained on the data annotated with DPs and DPPs, respectively. Training and decoding (beam size is 10) speeds are measured in words/second. Markers indicate statistically significant difference (p < 0.01) from "Baseline (+DPPs)" and "Separate-Recs (+DPs)", respectively.

4 Experiment

4.1 Setup

To compare our work with the results reported by previous work (Wang et al., 2018), we conducted experiments on their released Chinese-English TV Subtitle corpus.² The training, validation, and test sets contain 2.15M, 1.09K, and 1.15K sentence pairs, respectively. We used the case-insensitive 4-gram NIST BLEU metric (Papineni et al., 2002) for evaluation, and sign-test (Collins et al., 2005) to test for statistical significance. We implemented our models on the code repository released by Wang et al. (2018).³ We used the same configurations (e.g. vocabulary size = 30K, hidden size = 1000) and reproduced their reported results. It should be emphasized that we did not use the pre-train strategy as done in Wang et al. (2018), since we found training from scratch achieved better performance in the shared reconstructor setting.

² https://github.com/longyuewangdcu/tvsub
³ https://github.com/tuzhaopeng/nmt

4.2 Results

Table 2 shows the translation results.
It is clear that the proposed models significantly outperform the baselines in all cases, although there are considerable differences among the different variations.

Baselines (Rows 1-4): The three baselines (Rows 1, 2, and 4) differ regarding the training data used. Separate-Recs (+DPs) (Row 3) is the best model reported in Wang et al. (2018), which we employed as another strong baseline. The baseline trained on the DPP-annotated data ("Baseline (+DPPs)", Row 4) outperforms the other two counterparts, indicating that the error propagation problem does affect the performance of translating DPs. This suggests the necessity of jointly learning to translate and predict DPs.

Our Models (Rows 5-8): Our shared reconstructor (Row 5) not only outperforms the corresponding baseline (Row 4), but also surpasses its separate reconstructor counterpart (Row 3). Introducing a joint prediction objective (Row 6) achieves a further improvement in BLEU. These results verify that the shared reconstructor and jointly predicting DPs can accumulatively improve translation performance. Among the variations of shared reconstructors (Rows 6-8), we found that an interactive attention from encoder to decoder (Row 7) achieves the best performance, which outperforms our baseline (Row 4) and is +1.5 BLEU points

better than the best result reported by Wang et al. (2018) (Row 3). We attribute the superior performance of Shared-Rec enc→dec to the fact that the attention context over encoder representations embeds useful DP information, which can help to better attend to the representations of the corresponding pronouns on the decoder side. Similar to Wang et al. (2018), the proposed approach improves BLEU scores at the cost of decreased training and decoding speed, which is due to the large number of newly introduced parameters resulting from the incorporation of reconstructors into the NMT model.

4.3 Analysis

Models   | Precision | Recall | F1-score
External |           |        |
Joint    |           |        |

Table 3: Evaluation of DP prediction accuracy. The "External" model is separately trained on DP-annotated data with external neural methods (Wang et al., 2016), while the "Joint" model is jointly trained with the NMT model (Section 3.2).

Model                | Auto. | Man. | Δ
Separate-Recs (+DPs) |       |      |
Shared-Rec (+DPPs)   |       |      |

Table 5: Translation performance gap (Δ) between manually ("Man.") and automatically ("Auto.") labelling DPs/DPPs for input sentences in testing.

Effect of DPP Labelling Accuracy: For each sentence in testing, the DPs and DPPs are labelled automatically by two separate external prediction models, the accuracies of which are respectively 66% and 88% measured in F1-score. We investigate the best performance the models can achieve with manual labelling, which can be regarded as an oracle, as shown in Table 5. As seen, there still exists a significant gap in performance, and this could be narrowed by improving the accuracy of our DPP generator. In addition, our model shows a relatively smaller distance from the oracle performance ("Man."), indicating that the error propagation problem is alleviated to some extent.

DP Prediction Accuracy: As shown in Table 3, the jointly learned model significantly outperforms the external one by 9% in F1-score. We attribute this to the useful contextual information embedded in the reconstructor representations, which are used to generate the exact DP words.
Model                | Test
Baseline (+DPPs)     |
Separate-Recs (+DPs) |
Shared-Rec (+DPPs)   |

Table 4: Translation results when reconstruction is used in training only and not in testing.

Contribution Analysis: Table 4 lists translation results when the reconstruction model is used in training only. We can see that the proposed model outperforms both the strong baseline and the best model reported in Wang et al. (2018). This is encouraging, since no extra resources or computation are introduced into online decoding, which makes the approach highly practical, for example for translation in industry applications.

5 Conclusion

In this paper, we proposed effective approaches to translating DPs with NMT models: a shared reconstructor and jointly learning to translate and predict DPs. Through experiments we verified that 1) shared reconstruction is helpful to share knowledge between the encoder and decoder; and 2) joint learning of the DP prediction model indeed alleviates the error propagation problem by improving prediction accuracy. The two approaches accumulatively improve translation performance. The method is not restricted to the DP translation task and could potentially be applied to other sequence generation problems where additional source-side information could be incorporated. In future work we plan to: 1) build a fully end-to-end NMT model for DP translation, which does not depend on any external component (i.e. the DPP predictor); 2) exploit cross-sentence context (Wang et al., 2017) to further improve DP translation; and 3) investigate a new research strand that adapts our model in the inverse translation direction by learning to drop pronouns instead of recovering DPs.

Acknowledgments

The ADAPT Centre for Digital Content Technology is funded under the SFI Research Centres Programme (Grant 13/RC/2106) and is co-funded under the European Regional Development Fund. We thank the anonymous reviewers for their insightful comments.

References

Antonios Anastasopoulos and David Chiang. 2018. Tied multitask learning for neural speech translation. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 82-91, New Orleans, Louisiana, USA.

Michael Collins, Philipp Koehn, and Ivona Kucerova. 2005. Clause restructuring for statistical machine translation. In Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics, Ann Arbor, Michigan, USA.

Daxiang Dong, Hua Wu, Wei He, Dianhai Yu, and Haifeng Wang. 2015. Multi-task learning for multiple language translation. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics, Beijing, China.

Orhan Firat, Kyunghyun Cho, and Yoshua Bengio. 2016. Multi-way, multilingual neural machine translation with a shared attention mechanism. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, San Diego, California, USA.

Ronan Le Nagard and Philipp Koehn. 2010. Aiding pronoun translation with co-reference resolution. In Proceedings of the Joint 5th Workshop on Statistical Machine Translation and MetricsMATR, Uppsala, Sweden.

Minh-Thang Luong, Quoc V. Le, Ilya Sutskever, Oriol Vinyals, and Lukasz Kaiser. 2016. Multi-task sequence to sequence learning. In Proceedings of the International Conference on Learning Representations (ICLR).

Kishore Papineni, Salim Roukos, Todd Ward, and Wei-Jing Zhu. 2002. BLEU: A method for automatic evaluation of machine translation. In Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics, Philadelphia, Pennsylvania, USA.

Zhaopeng Tu, Yang Liu, Lifeng Shang, Xiaohua Liu, and Hang Li. 2017. Neural machine translation with reconstruction. In Proceedings of the 31st AAAI Conference on Artificial Intelligence, San Francisco, California, USA.

Longyue Wang, Zhaopeng Tu, Shuming Shi, Tong Zhang, Yvette Graham, and Qun Liu. 2018. Translating pro-drop languages with reconstruction models. In Proceedings of the 32nd AAAI Conference on Artificial Intelligence, New Orleans, Louisiana, USA.

Longyue Wang, Zhaopeng Tu, Andy Way, and Qun Liu. 2017. Exploiting cross-sentence context for neural machine translation. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, Copenhagen, Denmark.

Longyue Wang, Zhaopeng Tu, Xiaojun Zhang, Hang Li, Andy Way, and Qun Liu. 2016. A novel approach for dropped pronoun translation. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, San Diego, California, USA.

Bing Xiang, Xiaoqiang Luo, and Bowen Zhou. 2013. Enlisting the ghost: Modeling empty categories for machine translation. In Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics, Sofia, Bulgaria.

Barret Zoph and Kevin Knight. 2016. Multi-source neural translation. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 30-34, San Diego, California, USA.


More information

10. State Space Methods

10. State Space Methods . Sae Space Mehods. Inroducion Sae space modelling was briefly inroduced in chaper. Here more coverage is provided of sae space mehods before some of heir uses in conrol sysem design are covered in he

More information

CS224n: Natural Language Processing with Deep Learning 1 Lecture Notes: Part V Language Models, RNN, GRU and LSTM 2 Winter 2019

CS224n: Natural Language Processing with Deep Learning 1 Lecture Notes: Part V Language Models, RNN, GRU and LSTM 2 Winter 2019 CS224n: Naural Language Processing wih Deep Learning 1 Lecure Noes: Par V Language Models, RNN, GRU and LSTM 2 Winer 2019 1 Course Insrucors: Chrisopher Manning, Richard Socher 2 Auhors: Milad Mohammadi,

More information

SUPPLEMENTARY INFORMATION

SUPPLEMENTARY INFORMATION SUPPLEMENTARY INFORMATION DOI: 0.038/NCLIMATE893 Temporal resoluion and DICE * Supplemenal Informaion Alex L. Maren and Sephen C. Newbold Naional Cener for Environmenal Economics, US Environmenal Proecion

More information

A DELAY-DEPENDENT STABILITY CRITERIA FOR T-S FUZZY SYSTEM WITH TIME-DELAYS

A DELAY-DEPENDENT STABILITY CRITERIA FOR T-S FUZZY SYSTEM WITH TIME-DELAYS A DELAY-DEPENDENT STABILITY CRITERIA FOR T-S FUZZY SYSTEM WITH TIME-DELAYS Xinping Guan ;1 Fenglei Li Cailian Chen Insiue of Elecrical Engineering, Yanshan Universiy, Qinhuangdao, 066004, China. Deparmen

More information

A Dynamic Model of Economic Fluctuations

A Dynamic Model of Economic Fluctuations CHAPTER 15 A Dynamic Model of Economic Flucuaions Modified for ECON 2204 by Bob Murphy 2016 Worh Publishers, all righs reserved IN THIS CHAPTER, OU WILL LEARN: how o incorporae dynamics ino he AD-AS model

More information

Chapter 2. Models, Censoring, and Likelihood for Failure-Time Data

Chapter 2. Models, Censoring, and Likelihood for Failure-Time Data Chaper 2 Models, Censoring, and Likelihood for Failure-Time Daa William Q. Meeker and Luis A. Escobar Iowa Sae Universiy and Louisiana Sae Universiy Copyrigh 1998-2008 W. Q. Meeker and L. A. Escobar. Based

More information

DEPARTMENT OF STATISTICS

DEPARTMENT OF STATISTICS A Tes for Mulivariae ARCH Effecs R. Sco Hacker and Abdulnasser Haemi-J 004: DEPARTMENT OF STATISTICS S-0 07 LUND SWEDEN A Tes for Mulivariae ARCH Effecs R. Sco Hacker Jönköping Inernaional Business School

More information

Econ107 Applied Econometrics Topic 7: Multicollinearity (Studenmund, Chapter 8)

Econ107 Applied Econometrics Topic 7: Multicollinearity (Studenmund, Chapter 8) I. Definiions and Problems A. Perfec Mulicollineariy Econ7 Applied Economerics Topic 7: Mulicollineariy (Sudenmund, Chaper 8) Definiion: Perfec mulicollineariy exiss in a following K-variable regression

More information

14 Autoregressive Moving Average Models

14 Autoregressive Moving Average Models 14 Auoregressive Moving Average Models In his chaper an imporan parameric family of saionary ime series is inroduced, he family of he auoregressive moving average, or ARMA, processes. For a large class

More information

The Multiple Regression Model: Hypothesis Tests and the Use of Nonsample Information

The Multiple Regression Model: Hypothesis Tests and the Use of Nonsample Information Chaper 8 The Muliple Regression Model: Hypohesis Tess and he Use of Nonsample Informaion An imporan new developmen ha we encouner in his chaper is using he F- disribuion o simulaneously es a null hypohesis

More information

Random Walk with Anti-Correlated Steps

Random Walk with Anti-Correlated Steps Random Walk wih Ani-Correlaed Seps John Noga Dirk Wagner 2 Absrac We conjecure he expeced value of random walks wih ani-correlaed seps o be exacly. We suppor his conjecure wih 2 plausibiliy argumens and

More information

di Bernardo, M. (1995). A purely adaptive controller to synchronize and control chaotic systems.

di Bernardo, M. (1995). A purely adaptive controller to synchronize and control chaotic systems. di ernardo, M. (995). A purely adapive conroller o synchronize and conrol chaoic sysems. hps://doi.org/.6/375-96(96)8-x Early version, also known as pre-prin Link o published version (if available):.6/375-96(96)8-x

More information

Inventory Control of Perishable Items in a Two-Echelon Supply Chain

Inventory Control of Perishable Items in a Two-Echelon Supply Chain Journal of Indusrial Engineering, Universiy of ehran, Special Issue,, PP. 69-77 69 Invenory Conrol of Perishable Iems in a wo-echelon Supply Chain Fariborz Jolai *, Elmira Gheisariha and Farnaz Nojavan

More information

Basilio Bona ROBOTICA 03CFIOR 1

Basilio Bona ROBOTICA 03CFIOR 1 Indusrial Robos Kinemaics 1 Kinemaics and kinemaic funcions Kinemaics deals wih he sudy of four funcions (called kinemaic funcions or KFs) ha mahemaically ransform join variables ino caresian variables

More information

Forecasting Stock Exchange Movements Using Artificial Neural Network Models and Hybrid Models

Forecasting Stock Exchange Movements Using Artificial Neural Network Models and Hybrid Models Forecasing Sock Exchange Movemens Using Arificial Neural Nework Models and Hybrid Models Erkam GÜREEN and Gülgün KAYAKUTLU Isanbul Technical Universiy, Deparmen of Indusrial Engineering, Maçka, 34367 Isanbul,

More information

Multi-component Levi Hierarchy and Its Multi-component Integrable Coupling System

Multi-component Levi Hierarchy and Its Multi-component Integrable Coupling System Commun. Theor. Phys. (Beijing, China) 44 (2005) pp. 990 996 c Inernaional Academic Publishers Vol. 44, No. 6, December 5, 2005 uli-componen Levi Hierarchy and Is uli-componen Inegrable Coupling Sysem XIA

More information

Experiments on logistic regression

Experiments on logistic regression Experimens on logisic regression Ning Bao March, 8 Absrac In his repor, several experimens have been conduced on a spam daa se wih Logisic Regression based on Gradien Descen approach. Firs, he overfiing

More information

MANY FACET, COMMON LATENT TRAIT POLYTOMOUS IRT MODEL AND EM ALGORITHM. Dimitar Atanasov

MANY FACET, COMMON LATENT TRAIT POLYTOMOUS IRT MODEL AND EM ALGORITHM. Dimitar Atanasov Pliska Sud. Mah. Bulgar. 20 (2011), 5 12 STUDIA MATHEMATICA BULGARICA MANY FACET, COMMON LATENT TRAIT POLYTOMOUS IRT MODEL AND EM ALGORITHM Dimiar Aanasov There are many areas of assessmen where he level

More information

d 1 = c 1 b 2 - b 1 c 2 d 2 = c 1 b 3 - b 1 c 3

d 1 = c 1 b 2 - b 1 c 2 d 2 = c 1 b 3 - b 1 c 3 and d = c b - b c c d = c b - b c c This process is coninued unil he nh row has been compleed. The complee array of coefficiens is riangular. Noe ha in developing he array an enire row may be divided or

More information

References are appeared in the last slide. Last update: (1393/08/19)

References are appeared in the last slide. Last update: (1393/08/19) SYSEM IDEIFICAIO Ali Karimpour Associae Professor Ferdowsi Universi of Mashhad References are appeared in he las slide. Las updae: 0..204 393/08/9 Lecure 5 lecure 5 Parameer Esimaion Mehods opics o be

More information

An introduction to the theory of SDDP algorithm

An introduction to the theory of SDDP algorithm An inroducion o he heory of SDDP algorihm V. Leclère (ENPC) Augus 1, 2014 V. Leclère Inroducion o SDDP Augus 1, 2014 1 / 21 Inroducion Large scale sochasic problem are hard o solve. Two ways of aacking

More information

THE DISCRETE WAVELET TRANSFORM

THE DISCRETE WAVELET TRANSFORM . 4 THE DISCRETE WAVELET TRANSFORM 4 1 Chaper 4: THE DISCRETE WAVELET TRANSFORM 4 2 4.1 INTRODUCTION TO DISCRETE WAVELET THEORY The bes way o inroduce waveles is hrough heir comparison o Fourier ransforms,

More information

A NOTE ON THE STRUCTURE OF BILATTICES. A. Avron. School of Mathematical Sciences. Sackler Faculty of Exact Sciences. Tel Aviv University

A NOTE ON THE STRUCTURE OF BILATTICES. A. Avron. School of Mathematical Sciences. Sackler Faculty of Exact Sciences. Tel Aviv University A NOTE ON THE STRUCTURE OF BILATTICES A. Avron School of Mahemaical Sciences Sacler Faculy of Exac Sciences Tel Aviv Universiy Tel Aviv 69978, Israel The noion of a bilaice was rs inroduced by Ginsburg

More information

23.5. Half-Range Series. Introduction. Prerequisites. Learning Outcomes

23.5. Half-Range Series. Introduction. Prerequisites. Learning Outcomes Half-Range Series 2.5 Inroducion In his Secion we address he following problem: Can we find a Fourier series expansion of a funcion defined over a finie inerval? Of course we recognise ha such a funcion

More information

Explaining Total Factor Productivity. Ulrich Kohli University of Geneva December 2015

Explaining Total Factor Productivity. Ulrich Kohli University of Geneva December 2015 Explaining Toal Facor Produciviy Ulrich Kohli Universiy of Geneva December 2015 Needed: A Theory of Toal Facor Produciviy Edward C. Presco (1998) 2 1. Inroducion Toal Facor Produciviy (TFP) has become

More information

Application of a Stochastic-Fuzzy Approach to Modeling Optimal Discrete Time Dynamical Systems by Using Large Scale Data Processing

Application of a Stochastic-Fuzzy Approach to Modeling Optimal Discrete Time Dynamical Systems by Using Large Scale Data Processing Applicaion of a Sochasic-Fuzzy Approach o Modeling Opimal Discree Time Dynamical Sysems by Using Large Scale Daa Processing AA WALASZE-BABISZEWSA Deparmen of Compuer Engineering Opole Universiy of Technology

More information

A unit root test based on smooth transitions and nonlinear adjustment

A unit root test based on smooth transitions and nonlinear adjustment MPRA Munich Personal RePEc Archive A uni roo es based on smooh ransiions and nonlinear adjusmen Aycan Hepsag Isanbul Universiy 5 Ocober 2017 Online a hps://mpra.ub.uni-muenchen.de/81788/ MPRA Paper No.

More information

0.1 MAXIMUM LIKELIHOOD ESTIMATION EXPLAINED

0.1 MAXIMUM LIKELIHOOD ESTIMATION EXPLAINED 0.1 MAXIMUM LIKELIHOOD ESTIMATIO EXPLAIED Maximum likelihood esimaion is a bes-fi saisical mehod for he esimaion of he values of he parameers of a sysem, based on a se of observaions of a random variable

More information

1 Review of Zero-Sum Games

1 Review of Zero-Sum Games COS 5: heoreical Machine Learning Lecurer: Rob Schapire Lecure #23 Scribe: Eugene Brevdo April 30, 2008 Review of Zero-Sum Games Las ime we inroduced a mahemaical model for wo player zero-sum games. Any

More information

Probabilistic learning

Probabilistic learning Probabilisic learning Charles Elkan November 8, 2012 Imporan: These lecure noes are based closely on noes wrien by Lawrence Saul. Tex may be copied direcly from his noes, or paraphrased. Also, hese ypese

More information

What Ties Return Volatilities to Price Valuations and Fundamentals? On-Line Appendix

What Ties Return Volatilities to Price Valuations and Fundamentals? On-Line Appendix Wha Ties Reurn Volailiies o Price Valuaions and Fundamenals? On-Line Appendix Alexander David Haskayne School of Business, Universiy of Calgary Piero Veronesi Universiy of Chicago Booh School of Business,

More information

ACE 562 Fall Lecture 5: The Simple Linear Regression Model: Sampling Properties of the Least Squares Estimators. by Professor Scott H.

ACE 562 Fall Lecture 5: The Simple Linear Regression Model: Sampling Properties of the Least Squares Estimators. by Professor Scott H. ACE 56 Fall 005 Lecure 5: he Simple Linear Regression Model: Sampling Properies of he Leas Squares Esimaors by Professor Sco H. Irwin Required Reading: Griffihs, Hill and Judge. "Inference in he Simple

More information

Solutions to Exercises in Chapter 12

Solutions to Exercises in Chapter 12 Chaper in Chaper. (a) The leas-squares esimaed equaion is given by (b)!i = 6. + 0.770 Y 0.8 R R = 0.86 (.5) (0.07) (0.6) Boh b and b 3 have he expeced signs; income is expeced o have a posiive effec on

More information

Testing for a Single Factor Model in the Multivariate State Space Framework

Testing for a Single Factor Model in the Multivariate State Space Framework esing for a Single Facor Model in he Mulivariae Sae Space Framework Chen C.-Y. M. Chiba and M. Kobayashi Inernaional Graduae School of Social Sciences Yokohama Naional Universiy Japan Faculy of Economics

More information

Single and Double Pendulum Models

Single and Double Pendulum Models Single and Double Pendulum Models Mah 596 Projec Summary Spring 2016 Jarod Har 1 Overview Differen ypes of pendulums are used o model many phenomena in various disciplines. In paricular, single and double

More information

Lab #2: Kinematics in 1-Dimension

Lab #2: Kinematics in 1-Dimension Reading Assignmen: Chaper 2, Secions 2-1 hrough 2-8 Lab #2: Kinemaics in 1-Dimension Inroducion: The sudy of moion is broken ino wo main areas of sudy kinemaics and dynamics. Kinemaics is he descripion

More information

INTRODUCTION TO MACHINE LEARNING 3RD EDITION

INTRODUCTION TO MACHINE LEARNING 3RD EDITION ETHEM ALPAYDIN The MIT Press, 2014 Lecure Slides for INTRODUCTION TO MACHINE LEARNING 3RD EDITION alpaydin@boun.edu.r hp://www.cmpe.boun.edu.r/~ehem/i2ml3e CHAPTER 2: SUPERVISED LEARNING Learning a Class

More information

ACE 562 Fall Lecture 8: The Simple Linear Regression Model: R 2, Reporting the Results and Prediction. by Professor Scott H.

ACE 562 Fall Lecture 8: The Simple Linear Regression Model: R 2, Reporting the Results and Prediction. by Professor Scott H. ACE 56 Fall 5 Lecure 8: The Simple Linear Regression Model: R, Reporing he Resuls and Predicion by Professor Sco H. Irwin Required Readings: Griffihs, Hill and Judge. "Explaining Variaion in he Dependen

More information

PROC NLP Approach for Optimal Exponential Smoothing Srihari Jaganathan, Cognizant Technology Solutions, Newbury Park, CA.

PROC NLP Approach for Optimal Exponential Smoothing Srihari Jaganathan, Cognizant Technology Solutions, Newbury Park, CA. PROC NLP Approach for Opimal Exponenial Smoohing Srihari Jaganahan, Cognizan Technology Soluions, Newbury Park, CA. ABSTRACT Esimaion of smoohing parameers and iniial values are some of he basic requiremens

More information

CONFIDENCE LIMITS AND THEIR ROBUSTNESS

CONFIDENCE LIMITS AND THEIR ROBUSTNESS CONFIDENCE LIMITS AND THEIR ROBUSTNESS Rajendran Raja Fermi Naional Acceleraor laboraory Baavia, IL 60510 Absrac Confidence limis are common place in physics analysis. Grea care mus be aken in heir calculaion

More information

Solutions from Chapter 9.1 and 9.2

Solutions from Chapter 9.1 and 9.2 Soluions from Chaper 9 and 92 Secion 9 Problem # This basically boils down o an exercise in he chain rule from calculus We are looking for soluions of he form: u( x) = f( k x c) where k x R 3 and k is

More information

EKF SLAM vs. FastSLAM A Comparison

EKF SLAM vs. FastSLAM A Comparison vs. A Comparison Michael Calonder, Compuer Vision Lab Swiss Federal Insiue of Technology, Lausanne EPFL) michael.calonder@epfl.ch The wo algorihms are described wih a planar robo applicaion in mind. Generalizaion

More information

FITTING OF A PARTIALLY REPARAMETERIZED GOMPERTZ MODEL TO BROILER DATA

FITTING OF A PARTIALLY REPARAMETERIZED GOMPERTZ MODEL TO BROILER DATA FITTING OF A PARTIALLY REPARAMETERIZED GOMPERTZ MODEL TO BROILER DATA N. Okendro Singh Associae Professor (Ag. Sa.), College of Agriculure, Cenral Agriculural Universiy, Iroisemba 795 004, Imphal, Manipur

More information

6.003: Signals and Systems Lecture 20 November 17, 2011

6.003: Signals and Systems Lecture 20 November 17, 2011 6.3: Signals and Sysems Lecure November 7, 6.3: Signals and Sysems Applicaions of Fourier ransforms Filering Noion of a filer. LI sysems canno creae new frequencies. can only scale magniudes and shif phases

More information

EECS 2602 Winter Laboratory 3 Fourier series, Fourier transform and Bode Plots in MATLAB

EECS 2602 Winter Laboratory 3 Fourier series, Fourier transform and Bode Plots in MATLAB EECS 6 Winer 7 Laboraory 3 Fourier series, Fourier ransform and Bode Plos in MATLAB Inroducion: The objecives of his lab are o use MATLAB:. To plo periodic signals wih Fourier series represenaion. To obain

More information

Air Quality Index Prediction Using Error Back Propagation Algorithm and Improved Particle Swarm Optimization

Air Quality Index Prediction Using Error Back Propagation Algorithm and Improved Particle Swarm Optimization Air Qualiy Index Predicion Using Error Back Propagaion Algorihm and Improved Paricle Swarm Opimizaion Jia Xu ( ) and Lang Pei College of Compuer Science, Wuhan Qinchuan Universiy, Wuhan, China 461406563@qq.com

More information

Bias in Conditional and Unconditional Fixed Effects Logit Estimation: a Correction * Tom Coupé

Bias in Conditional and Unconditional Fixed Effects Logit Estimation: a Correction * Tom Coupé Bias in Condiional and Uncondiional Fixed Effecs Logi Esimaion: a Correcion * Tom Coupé Economics Educaion and Research Consorium, Naional Universiy of Kyiv Mohyla Academy Address: Vul Voloska 10, 04070

More information

20. Applications of the Genetic-Drift Model

20. Applications of the Genetic-Drift Model 0. Applicaions of he Geneic-Drif Model 1) Deermining he probabiliy of forming any paricular combinaion of genoypes in he nex generaion: Example: If he parenal allele frequencies are p 0 = 0.35 and q 0

More information

Sequential Learning of Classifiers for Structured Prediction Problems

Sequential Learning of Classifiers for Structured Prediction Problems Sequenial Learning of Classifiers for Srucured Predicion Problems Dan Roh Dep. of Compuer Science Univ. of Illinois a U-C danr@illinois.edu Kevin Small Dep. of Compuer Science Univ. of Illinois a U-C ksmall@illinois.edu

More information

Removing Useless Productions of a Context Free Grammar through Petri Net

Removing Useless Productions of a Context Free Grammar through Petri Net Journal of Compuer Science 3 (7): 494-498, 2007 ISSN 1549-3636 2007 Science Publicaions Removing Useless Producions of a Conex Free Grammar hrough Peri Ne Mansoor Al-A'ali and Ali A Khan Deparmen of Compuer

More information

Reliability of Technical Systems

Reliability of Technical Systems eliabiliy of Technical Sysems Main Topics Inroducion, Key erms, framing he problem eliabiliy parameers: Failure ae, Failure Probabiliy, Availabiliy, ec. Some imporan reliabiliy disribuions Componen reliabiliy

More information

STA 114: Statistics. Notes 2. Statistical Models and the Likelihood Function

STA 114: Statistics. Notes 2. Statistical Models and the Likelihood Function STA 114: Saisics Noes 2. Saisical Models and he Likelihood Funcion Describing Daa & Saisical Models A physicis has a heory ha makes a precise predicion of wha s o be observed in daa. If he daa doesn mach

More information

An Auto-Regressive Model for Improved Low Delay Distributed Video Coding

An Auto-Regressive Model for Improved Low Delay Distributed Video Coding An Auo-Regressive Model for Improved Low Delay Disribued Video Coding Yongbing Zhang* a, Debin Zhao a, Siwei Ma b, Ronggang Wang c, Wen Gao b a Deparmen of Compuer Science, Harbin Insiue of Technology,

More information

Robotics I. April 11, The kinematics of a 3R spatial robot is specified by the Denavit-Hartenberg parameters in Tab. 1.

Robotics I. April 11, The kinematics of a 3R spatial robot is specified by the Denavit-Hartenberg parameters in Tab. 1. Roboics I April 11, 017 Exercise 1 he kinemaics of a 3R spaial robo is specified by he Denavi-Harenberg parameers in ab 1 i α i d i a i θ i 1 π/ L 1 0 1 0 0 L 3 0 0 L 3 3 able 1: able of DH parameers of

More information

Latent Spaces and Matrix Factorization

Latent Spaces and Matrix Factorization Compuaional Linguisics Laen Spaces and Marix Facorizaion Dierich Klakow FR 4.7 Allgemeine Linguisik (Compuerlinguisik) Universiä des Saarlandes Summer 0 Goal Goal: rea documen clusering and word clusering

More information

Hardware-Software Co-design of Slimmed Optical Neural Networks

Hardware-Software Co-design of Slimmed Optical Neural Networks Hardware-Sofware Co-design of Slimmed Opical Neural Neworks Zheng Zhao 1, Derong Liu 1, Meng Li 1, Zhoufeng Ying 1, Lu Zhang 2, Biying Xu 1, Bei Yu 2, Ray Chen 1, David Pan 1 The Universiy of Texas a Ausin

More information

Multi-scale 2D acoustic full waveform inversion with high frequency impulsive source

Multi-scale 2D acoustic full waveform inversion with high frequency impulsive source Muli-scale D acousic full waveform inversion wih high frequency impulsive source Vladimir N Zubov*, Universiy of Calgary, Calgary AB vzubov@ucalgaryca and Michael P Lamoureux, Universiy of Calgary, Calgary

More information

Stability and Bifurcation in a Neural Network Model with Two Delays

Stability and Bifurcation in a Neural Network Model with Two Delays Inernaional Mahemaical Forum, Vol. 6, 11, no. 35, 175-1731 Sabiliy and Bifurcaion in a Neural Nework Model wih Two Delays GuangPing Hu and XiaoLing Li School of Mahemaics and Physics, Nanjing Universiy

More information

Notes on online convex optimization

Notes on online convex optimization Noes on online convex opimizaion Karl Sraos Online convex opimizaion (OCO) is a principled framework for online learning: OnlineConvexOpimizaion Inpu: convex se S, number of seps T For =, 2,..., T : Selec

More information

Online Convex Optimization Example And Follow-The-Leader

Online Convex Optimization Example And Follow-The-Leader CSE599s, Spring 2014, Online Learning Lecure 2-04/03/2014 Online Convex Opimizaion Example And Follow-The-Leader Lecurer: Brendan McMahan Scribe: Sephen Joe Jonany 1 Review of Online Convex Opimizaion

More information

EE100 Lab 3 Experiment Guide: RC Circuits

EE100 Lab 3 Experiment Guide: RC Circuits I. Inroducion EE100 Lab 3 Experimen Guide: A. apaciors A capacior is a passive elecronic componen ha sores energy in he form of an elecrosaic field. The uni of capaciance is he farad (coulomb/vol). Pracical

More information

Zürich. ETH Master Course: L Autonomous Mobile Robots Localization II

Zürich. ETH Master Course: L Autonomous Mobile Robots Localization II Roland Siegwar Margaria Chli Paul Furgale Marco Huer Marin Rufli Davide Scaramuzza ETH Maser Course: 151-0854-00L Auonomous Mobile Robos Localizaion II ACT and SEE For all do, (predicion updae / ACT),

More information