Tensorial Recurrent Neural Networks for Longitudinal Data Analysis
Mingyuan Bai, Boyang Zhang and Junbin Gao

arXiv v1 [cs.LG] 1 Aug 2017

Abstract—Traditional recurrent neural networks assume vectorized data as inputs. However, many data from modern science and technology come with certain structures, such as tensorial time series data. To apply recurrent neural networks to this type of data, a vectorisation process is necessary, and such a vectorisation loses the precise information carried by the spatial or longitudinal dimensions. In addition, vectorized data are not an optimal representation for learning from longitudinal data. In this paper we propose a new variant of tensorial neural networks which directly takes tensorial time series data as inputs. We call this new variant the Tensorial Recurrent Neural Network (TRNN). The proposed TRNN is based on the tensor Tucker decomposition.

I. INTRODUCTION

In recent years, interest in time series with sequential effects among the data has been growing constantly in both academia and industry. This interest is driven by developments in technology and social science, including but not limited to multimedia, social networks, and economic and political networks, especially the study of international relationships. Time series data acquired from many disciplines are not only large in volume in terms of time but also ever more complicated in structure, such as multi- and high-dimensional data. The rise of massive multi-dimensional data has led to new demands on Machine Learning (ML) systems to learn complex models, with millions to billions of parameters, for new types of data structures that promise adequate capacity to digest massive datasets and offer powerful predictive analytics thereupon. As tensorial data come with a special spatial structure, it is highly desirable to maintain this structural information in the learning process.
We have seen recent developments in deep learning architectures for multi-dimensional tensor data, extending conventional vector neural networks to structured data: the matrix neural network [1], [2], two independent works on tensorial neural networks [3], [4], networks for graph data [5], [6], and even neural networks for manifold-valued data [7]. Recurrent neural networks (RNNs), as a commonly applied tool in longitudinal data analysis, have been investigated constantly over the last couple of decades, with many successful applications such as language processing [8], speech recognition [9], and human action recognition [10], [11]. There are many different architectures for RNNs, such as the basic recurrent networks (Elman networks or Jordan networks), long short-term memory (LSTM), and the gated recurrent unit (GRU). The most recent developments of recurrent neural networks generally focus on the LSTM model. For example, the LSTM has been combined with convolutional neural networks (CNNs) for sequence representation learning [12]. However, the traditional LSTM model can only deal with vectorised data, which leads to the loss of spatial information for multidimensional time series. Our intention in this paper is to propose fully connected tensorial recurrent neural networks for tensorial longitudinal data. A recent paper has also considered tensorial structure in classical recurrent neural networks; however, its main purpose was to reduce the number of network parameters when vectorial data are very high-dimensional [13].

The rest of this paper is organized as follows. Section II introduces the basic recurrent neural network and two types of tensorial RNN, i.e. the tensorial LSTM (tLSTM) and the tensorial GRU (tGRU).

Mingyuan Bai and Junbin Gao are with the Discipline of Business Analytics, The University of Sydney Business School, The University of Sydney, NSW 2006, Australia (mbai8854@uni.sydney.edu.au, junbin.gao@sydney.edu.au). Boyang Zhang is with the School of Information Technologies, The University of Sydney, NSW 2006, Australia (bzha8220@uni.sydney.edu.au).
In Section III we derive the backpropagation algorithms for the proposed tLSTM and tGRU. In Section IV, experimental results are presented to evaluate the performance of the proposed models. Finally, conclusions and future work are summarized in Section V.

II. TENSORIAL RECURRENT NEURAL NETWORKS

The simple building block of an RNN is expressed by the following forward mapping in the Elman model [14]:

    h_t = σ_h(W_hx x_t + W_hh h_{t-1} + b_h),
    y_t = σ_y(W_hy h_t + b_y),    (1)

or in the Jordan model [15]:

    h_t = σ_h(W_hx x_t + W_hh y_{t-1} + b_h),
    y_t = σ_y(W_hy h_t + b_y).    (2)

The Jordan model further passes on the output information at time t to the next time t+1 as an input. In this note we focus on the Elman model (1). However, we consider the setting of tensorial longitudinal data in general, denoted by D = {X_t, Y_t}_{t=1}^T, where each independent datum X_t is a tensor of D ways (the tensor dimension) and the response Y_t could be a scalar, a vector, or more generally a tensor of the same dimension as X_t. The two most popular recurrent neural network architectures are the Long Short-Term Memory units (LSTMs) and the
Gated Recurrent Units (GRUs), which will be extended to tensor data below.

A. Tensorial LSTM (tLSTM)

LSTMs were introduced in [16] and further modernized by many people, e.g. [17]. Applications have demonstrated that LSTMs work tremendously well on a large variety of problems, and they are now widely used. Based on the classic LSTM, we propose the following tensorial LSTM:

    F_t = σ_g(H_{t-1} ×_1 W_f1 ×_2 W_f2 ⋯ ×_D W_fD + X_t ×_1 U_f1 ×_2 U_f2 ⋯ ×_D U_fD + B_f)    (3)
    I_t = σ_g(H_{t-1} ×_1 W_i1 ×_2 W_i2 ⋯ ×_D W_iD + X_t ×_1 U_i1 ×_2 U_i2 ⋯ ×_D U_iD + B_i)    (4)
    O_t = σ_g(H_{t-1} ×_1 W_o1 ×_2 W_o2 ⋯ ×_D W_oD + X_t ×_1 U_o1 ×_2 U_o2 ⋯ ×_D U_oD + B_o)    (5)
    Ĉ_t = σ_c(H_{t-1} ×_1 W_c1 ×_2 W_c2 ⋯ ×_D W_cD + X_t ×_1 U_c1 ×_2 U_c2 ⋯ ×_D U_cD + B_c)    (6)
    C_t = F_t ∘ C_{t-1} + I_t ∘ Ĉ_t    (7)
    H_t = O_t ∘ σ_h(C_t)    (8)

where the operator ∘ denotes the Hadamard (entry-wise) product, the W_·d and U_·d are matrices applied in the relevant mode to the hidden tensorial variables and the input tensorial variables via the tensor mode product [18], and all B are tensorial biases. Note that O_t is not the actual output of the tLSTM. In fact, O_t is jointly regulated by both I_t and C_t to produce the potential output from the hidden variable H_t as in (8). Depending on the type of the response data Y_t, we may apply an extra layer of neural network on top of H_t to convert the hidden tensor H_t to the shape/structure of Y_t. For notational simplicity, we denote this transformed output by Ô_t.

B. Tensorial GRU (tGRU)

The Gated Recurrent Unit (GRU) was introduced in [19] as a slightly more dramatic variation on the LSTM. As in the GRU, in our proposed tensorial GRU the forget gate F_t and the input gate I_t are combined into a single update gate, so the tGRU is simpler than the tLSTM above:

    R_t = σ_g(H_{t-1} ×_1 W_r1 ×_2 W_r2 ⋯ ×_D W_rD + X_t ×_1 U_r1 ×_2 U_r2 ⋯ ×_D U_rD + B_r)    (9)
    Z_t = σ_g(H_{t-1} ×_1 W_z1 ×_2 W_z2 ⋯ ×_D W_zD + X_t ×_1 U_z1 ×_2 U_z2 ⋯ ×_D U_zD + B_z)    (10)
    R̃_t = R_t ∘ H_{t-1}    (11)
    Ĥ_t = σ_h(R̃_t ×_1 W_h1 ×_2 W_h2 ⋯ ×_D W_hD + X_t ×_1 U_h1 ×_2 U_h2 ⋯ ×_D U_hD + B_h)    (12)
    H_t = Z_t ∘ H_{t-1} + (1 − Z_t) ∘ Ĥ_t.    (13)

Similarly to the tLSTM, we add an additional transform mapping the hidden variable H_t to match the response variable Y_t.

III. RECURRENT BP ALGORITHM

A. Loss Function

According to the way the data are presented, we propose three types of loss functions.
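The gate maps in (3)-(6) and (9)-(12) all share one Tucker-style multilinear computation. A minimal NumPy sketch of one such gate, assuming D = 2, a sigmoid σ_g, and illustrative sizes (the helper names and shapes are mine, not the paper's):

```python
import numpy as np

def mode_product(T, M, d):
    """Mode-d product T x_d M: contract axis d of T with the columns of M."""
    out = np.tensordot(M, T, axes=(1, d))  # M's row axis becomes axis 0
    return np.moveaxis(out, 0, d)          # move it back to position d

def multilinear(T, mats):
    """Apply one matrix per mode, as in H x_1 W_1 x_2 W_2 ... x_D W_D."""
    for d, M in enumerate(mats):
        T = mode_product(T, M, d)
    return T

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
h_shape, x_shape = (3, 4), (5, 6)  # hidden and input tensor shapes (D = 2)
W = [rng.standard_normal((h, h)) for h in h_shape]                   # hidden maps
U = [rng.standard_normal((h, x)) for h, x in zip(h_shape, x_shape)]  # input maps
B = np.zeros(h_shape)                                                # tensorial bias

H_prev = rng.standard_normal(h_shape)
X_t = rng.standard_normal(x_shape)

# One gate, e.g. the forget gate F_t of eq. (3)
F_t = sigmoid(multilinear(H_prev, W) + multilinear(X_t, U) + B)
print(F_t.shape)  # (3, 4)
```

Each U_d maps the d-th input mode (size x_d) to the d-th hidden mode (size h_d), so the two multilinear terms land in the same hidden shape before the entry-wise nonlinearity.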
1) Loss function for a single data series: The training data are presented as a single time series, and we apply the tLSTM or tGRU on the series and collect its outputs at each time point. The simple loss at each time t is defined as

    l_s^t = l(Y_t, Ô_t),

which is the building block for all the other overall loss functions. Please note that Ô_t is calculated through the recurrent network from the inputs {X_1, X_2, ..., X_t}, and l is a loss function such as the usual squared loss for regression or the cross-entropy loss for classification. The overall loss is defined as

    l_s = Σ_{t=1}^T l(Y_t, Ô_t).    (14)

2) Loss function for multiple data series with the same length/duration: Most of the time we use a recurrent network structure with a certain duration. In this case we assume that the training data consist of a number of training series D = {X_j1, ..., X_jT; Y_j}_{j=1}^N. The loss for the j-th case is only calculated at time T as

    l_j^T = l(Y_j, Ô_jT),

where Ô_jT is the last output of the tLSTM or tGRU from the input series X_j1, ..., X_jT. Hence the overall loss is

    l^T = Σ_{j=1}^N l(Y_j, Ô_jT).    (15)

Similarly to the single-series case, if the response is a series Y_j1, ..., Y_jT, then the loss can be revised as

    l^m = Σ_{j=1}^N Σ_{t=1}^T l(Y_jt, Ô_jt).    (16)

3) Loss function for panel data: In many application cases, particularly for panel data, the duration or period T_j of each series X_j1, ..., X_jT_j may differ, so the recurrent network runs through a different number of loops. Suppose the data are D = {X_j1, ..., X_jT_j; Y_j}_{j=1}^N; then the loss can be defined as

    l^{T_p} = Σ_{j=1}^N l(Y_j, Ô_jT_j),    (17)

or

    l^{mp} = Σ_{j=1}^N (1/T_j) Σ_{t=1}^{T_j} l(Y_jt, Ô_jt).    (18)

Loss function (14) is a special case of loss function (16) when N = 1. For the BP algorithm in the next subsection we focus on losses (15) and (16); losses (17) and (18) can be processed in a similar way.
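The panel-data loss (18) is the most general of the three: series of unequal length, a per-time squared loss, and a within-series average. A minimal sketch with synthetic data (the shapes, lengths, and the squared loss are illustrative assumptions):

```python
import numpy as np

def sq_loss(y, o):
    """Squared loss between a tensorial response and a network output."""
    return float(np.sum((y - o) ** 2))

rng = np.random.default_rng(2)
panel = []
for T_j in (3, 5, 7):                     # unequal durations T_j
    Y = rng.standard_normal((T_j, 2, 2))  # tensorial responses Y_jt
    O = rng.standard_normal((T_j, 2, 2))  # stand-ins for the outputs O_jt
    panel.append((Y, O))

# Eq. (18): sum over series j of the time-averaged per-step loss
l_mp = sum(
    (1.0 / Y.shape[0]) * sum(sq_loss(Y[t], O[t]) for t in range(Y.shape[0]))
    for Y, O in panel
)
print(l_mp >= 0.0)  # True: a sum of averaged squared losses
```

Dropping the 1/T_j factor and restricting the inner sum to t = T_j recovers (17); fixing all T_j equal recovers (16) and (15) respectively.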
B. BP Algorithm

The major difference between the proposed tensorial RNN (tRNN) and the vectorial RNN is that the vectorial linear mapping is replaced with a tensor multilinear mapping, i.e. the Tucker multiplication [18]. Let us denote the Tucker mapping by

    M_α = H_{t-1} ×_1 W_α1 ×_2 W_α2 ⋯ ×_D W_αD + X_t ×_1 U_α1 ×_2 U_α2 ⋯ ×_D U_αD + B_α,

where α can be either f, i, o, c, r, z or h. When α = h, we replace H_{t-1} with R̃_t as in (12).

First we introduce the following results from [20] without proof.

Lemma 1: Denote by M and H the total sizes of the tensors M_α and H_{t-1}, respectively. Then

    ∂M_α/∂H_{t-1} = W_αD ⊗ ⋯ ⊗ W_α1,    (19)

where the matricized form has been applied and ⊗ denotes the Kronecker product of matrices. Furthermore,

    ∂M_α(d)/∂W_αd = [(W_αD ⊗ ⋯ ⊗ W_α(d+1) ⊗ W_α(d−1) ⊗ ⋯ ⊗ W_α1) H_{t−1,(d)}^T] ⊗ I_{h_d×h_d},    (20)
    ∂M_α(d)/∂U_αd = [(U_αD ⊗ ⋯ ⊗ U_α(d+1) ⊗ U_α(d−1) ⊗ ⋯ ⊗ U_α1) X_{t,(d)}^T] ⊗ I_{x_d×x_d},    (21)
    ∂M_α/∂B_α = I_{M×M},    (22)

where the subscript (d) denotes the d-mode matricization of a tensor, and h_d and x_d are the dimensions of mode d of the tensors H_{t-1} and X_t, respectively.

First, let us derive the BP algorithm for each tLSTM step. The computation flow defined by equations (3)-(8) is shown in Fig. 1. Along time, C_t is forwarded to the next tLSTM step, while the hidden tensor H_t is carried onto the next step and also output to an extra layer to match the response Y_t at time t. Hence, in the BP algorithm at a tLSTM unit there can be two pieces of derivative information: one from the next tLSTM unit, denoted by ∂⁺l/∂H_t, and the other from the output loss l, denoted by ∂°l/∂H_t. The combined derivative information to be further backpropagated is

    ∂l/∂H_t = ∂⁺l/∂H_t + ∂°l/∂H_t.    (23)

We know that ∂°l/∂H_t = 0 if there is no response Y_t at time t. According to the flows from C_{t-1}, F_t, Ĉ_t, I_t and O_t to both C_t and H_t, and the chain rule, it is easy to read from the diagram that

    ∂l/∂C_{t-1} = (∂l/∂C_t + ∂l/∂H_t ∘ O_t ∘ σ_h′(C_t)) ∘ F_t,    (24)
    ∂l/∂F_t = (∂l/∂C_t + ∂l/∂H_t ∘ O_t ∘ σ_h′(C_t)) ∘ C_{t-1},    (25)
    ∂l/∂Ĉ_t = (∂l/∂C_t + ∂l/∂H_t ∘ O_t ∘ σ_h′(C_t)) ∘ I_t,    (26)
    ∂l/∂I_t = (∂l/∂C_t + ∂l/∂H_t ∘ O_t ∘ σ_h′(C_t)) ∘ Ĉ_t,    (27)
    ∂l/∂O_t = ∂l/∂H_t ∘ σ_h(C_t).    (28)

Let us introduce two operators: for any tensor M, denote its vectorization by m = Vec(M), and its inverse operator by ivec(m) = M.
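The matricized Jacobian structure behind (19) can be verified numerically for a 2-way tensor, where vec(H ×_1 W_1 ×_2 W_2) = (W_2 ⊗ W_1) vec(H) under column-major vectorisation. A small self-contained check (sizes are arbitrary test values):

```python
import numpy as np

rng = np.random.default_rng(3)
H = rng.standard_normal((3, 4))    # a 2-way "hidden tensor"
W1 = rng.standard_normal((5, 3))   # mode-1 factor
W2 = rng.standard_normal((6, 4))   # mode-2 factor

# For matrices, H x_1 W1 x_2 W2 = W1 @ H @ W2.T
lhs = (W1 @ H @ W2.T).flatten(order="F")       # vec of the Tucker map
rhs = np.kron(W2, W1) @ H.flatten(order="F")   # Kronecker form of (19)
print(np.allclose(lhs, rhs))  # True
```

The Fortran ("F") flattening order matters: the Kronecker identity holds for column-major vec, which is the convention used in the multilinear-algebra literature the lemma draws on.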
Then we have

    ∂l/∂H_{t-1} = ivec(Vec(∂l/∂F_t ∘ σ_g′(M_f))^T ∂M_f/∂H_{t-1})
        + ivec(Vec(∂l/∂Ĉ_t ∘ σ_c′(M_c))^T ∂M_c/∂H_{t-1})
        + ivec(Vec(∂l/∂I_t ∘ σ_g′(M_i))^T ∂M_i/∂H_{t-1})
        + ivec(Vec(∂l/∂O_t ∘ σ_g′(M_o))^T ∂M_o/∂H_{t-1}).    (29)

Similarly, at time t we have

    ∂l/∂W_αd = ivec(Vec(∂l/∂A_t ∘ σ′(M_α))^T ∂M_α(d)/∂W_αd),    (30)
    ∂l/∂U_αd = ivec(Vec(∂l/∂A_t ∘ σ′(M_α))^T ∂M_α(d)/∂U_αd),    (31)
    ∂l/∂B_α = ∂l/∂A_t ∘ σ′(M_α),    (32)

where (A_t, α) is (F_t, f), (Ĉ_t, c), (I_t, i) or (O_t, o). Finally, the overall derivatives for the parameters are obtained by summing these per-step derivatives over time.

Let us consider the loss functions defined in (15) and (16). First we note that ∂⁺l/∂H_T = 0. In the case of (16), ∂°l/∂H_t is calculated according to the given output layer and cost function for the response Y_t for each t ≤ T. In the case of (15), we only have the information ∂°l/∂H_t when t = T; otherwise it is 0. Hence (23) is calculated accordingly. We summarize the derivative BP algorithm for the tLSTM in Algorithm 1.
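The Hadamard-product chain-rule steps (24)-(28) can be sanity-checked against finite differences. A minimal sketch for (28), assuming σ_h = tanh, random test tensors, and a linear stand-in loss so that ∂l/∂H_t is a fixed tensor G (all of these choices are illustrative, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(6)
shape = (2, 3)
F, I, O = (rng.uniform(0.1, 0.9, shape) for _ in range(3))      # gate values
C_prev, C_hat, G = (rng.standard_normal(shape) for _ in range(3))

def loss(O_gate):
    C = F * C_prev + I * C_hat      # eq. (7)
    H = O_gate * np.tanh(C)         # eq. (8), sigma_h = tanh
    return np.sum(G * H)            # linear loss => dl/dH = G

C = F * C_prev + I * C_hat
analytic = G * np.tanh(C)           # eq. (28): dl/dO = dl/dH o sigma_h(C)

# Central finite differences in each entry of O
eps = 1e-6
numeric = np.zeros(shape)
for idx in np.ndindex(shape):
    Op, Om = O.copy(), O.copy()
    Op[idx] += eps
    Om[idx] -= eps
    numeric[idx] = (loss(Op) - loss(Om)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-5))  # True
```

The same pattern (perturb one tensor entry, difference the scalar loss) checks (24)-(27) by perturbing C_{t-1}, F_t, Ĉ_t or I_t instead of O_t.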
Fig. 1. tLSTM computation flow.

Algorithm 1: The derivative BP algorithm for tLSTM for a single training series.
Input: the training case {X_j1, X_j2, ..., X_jT; Y_jT} or {X_j1, X_j2, ..., X_jT; Y_j1, Y_j2, ..., Y_jT}.
Output: the derivatives with respect to all W_αd, U_αd and B_α, for α = f, c, i, o and d = 1, ..., D.
1: Set ∂⁺l/∂H_T = 0 and ∂l/∂C_T = 0, and calculate ∂°l/∂H_T according to the output layer for Y_jT at time T.
2: for t = T to 1 do
3:   Use (23) to calculate ∂l/∂H_t; for t < T, ∂°l/∂H_t = 0 if there is no response at t, otherwise it is calculated according to the output loss at t.
4:   Use (24)-(28) to calculate ∂l/∂F_t, ∂l/∂Ĉ_t, ∂l/∂I_t and ∂l/∂O_t.
5:   Use (20) and (21) to calculate ∂M_α(d)/∂W_αd and ∂M_α(d)/∂U_αd.
6:   Use (30)-(32) to calculate ∂l/∂W_αd, ∂l/∂U_αd and ∂l/∂B_α.
7:   Use (19) to calculate ∂M_α/∂H_{t-1} for α = f, c, i, o.
8:   Use (24) to update ∂l/∂C_{t-1} and (29) for ∂l/∂H_{t-1}.
9: end for
10: Sum the per-step derivatives over t and output them.

The BP algorithm for the tGRU can be derived in a similar way by inspecting the computation flow defined by equations (9)-(13), as shown in Fig. 2. Similarly, the backpropagated information from time t+1 is combined as in (23). Hence

    ∂l/∂Ĥ_t = ∂l/∂H_t ∘ (1 − Z_t),    (36)
    ∂l/∂Z_t = ∂l/∂H_t ∘ (H_{t-1} − Ĥ_t),    (37)
    ∂l/∂R̃_t = ivec(Vec(∂l/∂Ĥ_t ∘ σ_h′(M_h))^T ∂M_h/∂R̃_t), with ∂l/∂R_t = ∂l/∂R̃_t ∘ H_{t-1},    (38)
    ∂l/∂H_{t-1} = ivec(Vec(∂l/∂R_t ∘ σ_g′(M_r))^T ∂M_r/∂H_{t-1}) + ivec(Vec(∂l/∂Z_t ∘ σ_g′(M_z))^T ∂M_z/∂H_{t-1}) + ∂l/∂R̃_t ∘ R_t + ∂l/∂H_t ∘ Z_t.    (39)

The derivatives with respect to all three sets of parameters can be obtained from (30)-(32) with (A_t, α) being (R_t, r), (Z_t, z) or (Ĥ_t, h). The derivative BP algorithm for the tGRU is summarized in Algorithm 2.

Algorithm 2: The derivative BP algorithm for tGRU for a single training series.
Input: the training case {X_j1, X_j2, ..., X_jT; Y_jT} or {X_j1, X_j2, ..., X_jT; Y_j1, Y_j2, ..., Y_jT}.
Output: the derivatives with respect to all W_αd, U_αd and B_α, for α = r, z, h and d = 1, ..., D.
1: Set ∂⁺l/∂H_T = 0 and calculate ∂°l/∂H_T according to the output layer for Y_jT at time T.
2: for t = T to 1 do
3:   Use (23) to calculate ∂l/∂H_t; for t < T, ∂°l/∂H_t = 0 if there is no response at t, otherwise it is calculated according to the output loss at t.
4:   Use (19) to calculate ∂M_α/∂H_{t-1} for α = r, z, and ∂M_h/∂R̃_t.
5:   Use (36)-(38) to calculate ∂l/∂Ĥ_t, ∂l/∂Z_t, ∂l/∂R̃_t and ∂l/∂R_t.
6:   Use (20) and (21) to calculate ∂M_α(d)/∂W_αd and ∂M_α(d)/∂U_αd.
7:   Use (30)-(32) to calculate ∂l/∂W_αd, ∂l/∂U_αd and ∂l/∂B_α.
8:   Use (39) to update ∂l/∂H_{t-1}.
9: end for
10: Sum the per-step derivatives over t and output them.

IV. EXPERIMENTS

A. Data Description

To assess the performance of the tRNN on real-world data, we conduct an empirical study. In this study the data are
Fig. 2. tGRU computation flow.

collected from the Integrated Crisis Early Warning System (ICEWS)¹, the same weekly dataset used in the study of MLTR [21] on the relationships between 25 countries in four types of actions (material cooperation, material conflict, verbal cooperation and verbal conflict) from 2004 to mid-. Thus at any particular time point the data form a 3D tensor of dimensions 25 × 25 × 4; that is, each input X_t ∈ R^{25×25×4}. To explore the different types of patterns often seen in relational data and social networks, we organise the explanatory tensors X_t in the following ways. As done in [21], we construct the target tensor Y_t at time t as the lagged X_{t+1}. In total we construct an overall dataset {X_t, Y_t} of size 543, in which all X_t and Y_t are 3D tensors. Further, we take T = 7 as the period of the time series sections and use 90% of the data for training, as defined in the following two cases.

Case I: We use the tLSTM as a many-to-one recurrent model with the training dataset

    D_1 = {X_j, X_{j+1}, ..., X_{j+6}; Y_{j+6}}_{j=1}^{488},

and the remaining 55 data points are used for testing.

Case II: We use the tLSTM as a many-to-many recurrent model with the training dataset

    D_2 = {X_j, X_{j+1}, ..., X_{j+6}; Y_j, Y_{j+1}, ..., Y_{j+6}}_{j=1}^{488},

and the remaining 55 data points are used for testing.

B. Experiment Setting and Results

Our intention in these two experiments is to demonstrate quickly and conveniently how the tLSTM works; the tGRU gives similar results. In Case I we set the size of the hidden nodes to double the input size, and in Case II we use half the input size for the hidden nodes. We also use a regulariser on all the matrix coefficients W and U, adding the term

    λ (Σ_{all W} ‖W‖_F² + Σ_{all U} ‖U‖_F²)

to the loss function to obtain a regularised objective, where ‖·‖_F is the Frobenius norm for matrices and λ > 0 is a parameter trading off the loss against the regulariser. In our experiments we set λ empirically; this parameter should be optimised on a validation dataset. Fig. 3 shows the convergence trends for both cases in the training process over 1000 epochs. Test errors were computed for both Case I and Case II.

¹ http://
V. CONCLUSION

In this paper we introduced new recurrent neural networks for high-order tensor data. Two special recurrent structures, the tLSTM and the tGRU, were proposed with detailed BP algorithm derivations. Two simple experiments demonstrated the performance of the new recurrent neural networks. More experiments shall be conducted to demonstrate their efficiency and accuracy against existing neural networks. We also intend to explore more applications, such as video data analysis.

REFERENCES

[1] J. Gao, Y. Guo and Z. Wang, "Matrix neural networks," in Proceedings of the 14th International Symposium on Neural Networks (ISNN), Sapporo, Japan, 2017.
[2] C. Ionescu, O. Vantzos and C. Sminchisescu, "Matrix backpropagation for deep networks with structured layers," in Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2015.
[3] M. Bai, B. Zhang and J. Gao, "Tensorial neural networks and its application in longitudinal network data analysis," submitted to the 24th International Conference on Neural Information Processing (ICONIP).
[4] J.-T. Chien and Y.-T. Bao, "Tensor-factorized neural networks," IEEE Transactions on Neural Networks and Learning Systems, vol. PP. [Online]. Available: http://ieeexplore.ieee.org/document/
[5] Y. Li, D. Tarlow, M. Brockschmidt and R. Zemel, "Gated graph sequence neural networks," in Proceedings of the International Conference on Learning Representations (ICLR), 2016.
Fig. 3. Convergence trends for the two cases (objective vs. epochs: (a) Case I, (b) Case II).

[6] Y. Seo, M. Defferrard, P. Vandergheynst and X. Bresson, "Structured sequence modeling with graph convolutional recurrent networks," in Proceedings of the International Conference on Learning Representations.
[7] Z. Huang and L. V. Gool, "A Riemannian network for SPD matrix learning," in Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI).
[8] W. D. Mulder, S. Bethard and M.-F. Moens, "A survey on the application of recurrent neural networks to statistical language modeling," Computer Speech & Language, vol. 30, no. 1.
[9] A. Graves, A.-r. Mohamed and G. Hinton, "Speech recognition with deep recurrent neural networks," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[10] J. Liu, A. Shahroudy, D. Xu and G. Wang, "Spatio-temporal LSTM with trust gates for 3D human action recognition," in Proceedings of the European Conference on Computer Vision (ECCV), 2016.
[11] P. Zhang, C. Lan, J. Xing, W. Zeng, J. Xue and N. Zheng, "View adaptive recurrent neural networks for high performance human action recognition from skeleton data," arXiv preprint.
[12] Z. Gan, Y. Pu, R. Henao, C. Li, X. He and L. Carin, "Learning generic sentence representations using convolutional neural networks," arXiv preprint.
[13] C. Jose, M. Cissé and F. Fleuret, "Kronecker recurrent units," arXiv preprint.
[14] J. L. Elman, "Finding structure in time," Cognitive Science, vol. 14, no. 2.
[15] M. I. Jordan, "Serial Order: A Parallel Distributed Processing Approach," ser. Advances in Psychology: Neural-Network Models of Cognition. Elsevier, 1997, vol. 121, ch. 25.
[16] S. Hochreiter and J. Schmidhuber, "Long short-term memory," Neural Computation, vol. 9, no. 8.
[17] F. A. Gers, J. Schmidhuber and F. Cummins, "Learning to forget: Continual prediction with LSTM," Neural Computation, vol. 12, no. 10.
[18] T. G. Kolda and B. W. Bader, "Tensor decompositions and applications," SIAM Review, vol. 51, no. 3.
[19] K. Cho, B. van Merriënboer, Ç. Gülçehre, D. Bahdanau, F. Bougares, H. Schwenk and Y.
Bengio, "Learning phrase representations using RNN encoder-decoder for statistical machine translation," in Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). Doha, Qatar: Association for Computational Linguistics, Oct. 2014.
[20] T. G. Kolda, "Multilinear Operators for Higher-Order Decompositions," Sandia National Laboratories, technical report.
[21] P. D. Hoff, "Multilinear Tensor Regression for Longitudinal Relational Data," Ann. Appl. Stat., vol. 9, no. 3.
More informationForecasting Stock Exchange Movements Using Artificial Neural Network Models and Hybrid Models
Forecasing Sock Exchange Movemens Using Arificial Neural Nework Models and Hybrid Models Erkam GÜREEN and Gülgün KAYAKUTLU Isanbul Technical Universiy, Deparmen of Indusrial Engineering, Maçka, 34367 Isanbul,
More informationMatrix Versions of Some Refinements of the Arithmetic-Geometric Mean Inequality
Marix Versions of Some Refinemens of he Arihmeic-Geomeric Mean Inequaliy Bao Qi Feng and Andrew Tonge Absrac. We esablish marix versions of refinemens due o Alzer ], Carwrigh and Field 4], and Mercer 5]
More informationEnsamble methods: Bagging and Boosting
Lecure 21 Ensamble mehods: Bagging and Boosing Milos Hauskrech milos@cs.pi.edu 5329 Senno Square Ensemble mehods Mixure of expers Muliple base models (classifiers, regressors), each covers a differen par
More informationThe Asymptotic Behavior of Nonoscillatory Solutions of Some Nonlinear Dynamic Equations on Time Scales
Advances in Dynamical Sysems and Applicaions. ISSN 0973-5321 Volume 1 Number 1 (2006, pp. 103 112 c Research India Publicaions hp://www.ripublicaion.com/adsa.hm The Asympoic Behavior of Nonoscillaory Soluions
More informationStable block Toeplitz matrix for the processing of multichannel seismic data
Indian Journal of Marine Sciences Vol. 33(3), Sepember 2004, pp. 215-219 Sable block Toepliz marix for he processing of mulichannel seismic daa Kiri Srivasava* & V P Dimri Naional Geophysical Research
More information2.7. Some common engineering functions. Introduction. Prerequisites. Learning Outcomes
Some common engineering funcions 2.7 Inroducion This secion provides a caalogue of some common funcions ofen used in Science and Engineering. These include polynomials, raional funcions, he modulus funcion
More informationRobotics I. April 11, The kinematics of a 3R spatial robot is specified by the Denavit-Hartenberg parameters in Tab. 1.
Roboics I April 11, 017 Exercise 1 he kinemaics of a 3R spaial robo is specified by he Denavi-Harenberg parameers in ab 1 i α i d i a i θ i 1 π/ L 1 0 1 0 0 L 3 0 0 L 3 3 able 1: able of DH parameers of
More informationODEs II, Lecture 1: Homogeneous Linear Systems - I. Mike Raugh 1. March 8, 2004
ODEs II, Lecure : Homogeneous Linear Sysems - I Mike Raugh March 8, 4 Inroducion. In he firs lecure we discussed a sysem of linear ODEs for modeling he excreion of lead from he human body, saw how o ransform
More informationImproved Approximate Solutions for Nonlinear Evolutions Equations in Mathematical Physics Using the Reduced Differential Transform Method
Journal of Applied Mahemaics & Bioinformaics, vol., no., 01, 1-14 ISSN: 179-660 (prin), 179-699 (online) Scienpress Ld, 01 Improved Approimae Soluions for Nonlinear Evoluions Equaions in Mahemaical Physics
More informationChapter 1 Fundamental Concepts
Chaper 1 Fundamenal Conceps 1 Signals A signal is a paern of variaion of a physical quaniy, ofen as a funcion of ime (bu also space, disance, posiion, ec). These quaniies are usually he independen variables
More informationClass Meeting # 10: Introduction to the Wave Equation
MATH 8.5 COURSE NOTES - CLASS MEETING # 0 8.5 Inroducion o PDEs, Fall 0 Professor: Jared Speck Class Meeing # 0: Inroducion o he Wave Equaion. Wha is he wave equaion? The sandard wave equaion for a funcion
More informationEnsamble methods: Boosting
Lecure 21 Ensamble mehods: Boosing Milos Hauskrech milos@cs.pi.edu 5329 Senno Square Schedule Final exam: April 18: 1:00-2:15pm, in-class Term projecs April 23 & April 25: a 1:00-2:30pm in CS seminar room
More informationNotes on Kalman Filtering
Noes on Kalman Filering Brian Borchers and Rick Aser November 7, Inroducion Daa Assimilaion is he problem of merging model predicions wih acual measuremens of a sysem o produce an opimal esimae of he curren
More informationAdaptive Noise Estimation Based on Non-negative Matrix Factorization
dvanced cience and Technology Leers Vol.3 (ICC 213), pp.159-163 hp://dx.doi.org/1.14257/asl.213 dapive Noise Esimaion ased on Non-negaive Marix Facorizaion Kwang Myung Jeon and Hong Kook Kim chool of Informaion
More informationNature Neuroscience: doi: /nn Supplementary Figure 1. Spike-count autocorrelations in time.
Supplemenary Figure 1 Spike-coun auocorrelaions in ime. Normalized auocorrelaion marices are shown for each area in a daase. The marix shows he mean correlaion of he spike coun in each ime bin wih he spike
More informationSUPPLEMENTARY INFORMATION
SUPPLEMENTARY INFORMATION DOI: 0.038/NCLIMATE893 Temporal resoluion and DICE * Supplemenal Informaion Alex L. Maren and Sephen C. Newbold Naional Cener for Environmenal Economics, US Environmenal Proecion
More information14 Autoregressive Moving Average Models
14 Auoregressive Moving Average Models In his chaper an imporan parameric family of saionary ime series is inroduced, he family of he auoregressive moving average, or ARMA, processes. For a large class
More informationRandom Walk with Anti-Correlated Steps
Random Walk wih Ani-Correlaed Seps John Noga Dirk Wagner 2 Absrac We conjecure he expeced value of random walks wih ani-correlaed seps o be exacly. We suppor his conjecure wih 2 plausibiliy argumens and
More information04. Kinetics of a second order reaction
4. Kineics of a second order reacion Imporan conceps Reacion rae, reacion exen, reacion rae equaion, order of a reacion, firs-order reacions, second-order reacions, differenial and inegraed rae laws, Arrhenius
More informationSTRUCTURAL CHANGE IN TIME SERIES OF THE EXCHANGE RATES BETWEEN YEN-DOLLAR AND YEN-EURO IN
Inernaional Journal of Applied Economerics and Quaniaive Sudies. Vol.1-3(004) STRUCTURAL CHANGE IN TIME SERIES OF THE EXCHANGE RATES BETWEEN YEN-DOLLAR AND YEN-EURO IN 001-004 OBARA, Takashi * Absrac The
More informationRev. Téc. Ing. Univ. Zulia. Vol. 39, Nº 1, , 2016
Rev. Téc. Ing. Univ. Zulia. Vol. 39, Nº 1, 358-363, 216 doi:1.21311/1.39.1.41 Face Deecion and Recogniion Based on an Improved Adaboos Algorihm and Neural Nework Haoian Zhang*, Jiajia Xing, Muian Zhu,
More informationSimulation-Solving Dynamic Models ABE 5646 Week 2, Spring 2010
Simulaion-Solving Dynamic Models ABE 5646 Week 2, Spring 2010 Week Descripion Reading Maerial 2 Compuer Simulaion of Dynamic Models Finie Difference, coninuous saes, discree ime Simple Mehods Euler Trapezoid
More informationEECE 301 Signals & Systems Prof. Mark Fowler
EECE 31 Signals & Sysems Prof. Mark Fowler Noe Se #1 C-T Sysems: Convoluion Represenaion Reading Assignmen: Secion 2.6 of Kamen and Heck 1/11 Course Flow Diagram The arrows here show concepual flow beween
More information1 Review of Zero-Sum Games
COS 5: heoreical Machine Learning Lecurer: Rob Schapire Lecure #23 Scribe: Eugene Brevdo April 30, 2008 Review of Zero-Sum Games Las ime we inroduced a mahemaical model for wo player zero-sum games. Any
More informationTHE DISCRETE WAVELET TRANSFORM
. 4 THE DISCRETE WAVELET TRANSFORM 4 1 Chaper 4: THE DISCRETE WAVELET TRANSFORM 4 2 4.1 INTRODUCTION TO DISCRETE WAVELET THEORY The bes way o inroduce waveles is hrough heir comparison o Fourier ransforms,
More informationDemodulation of Digitally Modulated Signals
Addiional maerial for TSKS1 Digial Communicaion and TSKS2 Telecommunicaion Demodulaion of Digially Modulaed Signals Mikael Olofsson Insiuionen för sysemeknik Linköpings universie, 581 83 Linköping November
More informationStochastic Model for Cancer Cell Growth through Single Forward Mutation
Journal of Modern Applied Saisical Mehods Volume 16 Issue 1 Aricle 31 5-1-2017 Sochasic Model for Cancer Cell Growh hrough Single Forward Muaion Jayabharahiraj Jayabalan Pondicherry Universiy, jayabharahi8@gmail.com
More informationLecture 2-1 Kinematics in One Dimension Displacement, Velocity and Acceleration Everything in the world is moving. Nothing stays still.
Lecure - Kinemaics in One Dimension Displacemen, Velociy and Acceleraion Everyhing in he world is moving. Nohing says sill. Moion occurs a all scales of he universe, saring from he moion of elecrons in
More informationSection 3.5 Nonhomogeneous Equations; Method of Undetermined Coefficients
Secion 3.5 Nonhomogeneous Equaions; Mehod of Undeermined Coefficiens Key Terms/Ideas: Linear Differenial operaor Nonlinear operaor Second order homogeneous DE Second order nonhomogeneous DE Soluion o homogeneous
More informationA Generalized Recurrent Neural Architecture for Text Classification with Multi-Task Learning
Proceedings of he Tweny-Sixh Inernaional Join Conference on Arificial Inelligence (IJCAI-17) A Generalized Recurren Neural Archiecure for Tex Classificaion wih Muli-Task Learning Honglun Zhang 1, Liqiang
More informationA Framework for Efficient Document Ranking Using Order and Non Order Based Fitness Function
A Framework for Efficien Documen Ranking Using Order and Non Order Based Finess Funcion Hazra Imran, Adii Sharan Absrac One cenral problem of informaion rerieval is o deermine he relevance of documens
More informationt 2 B F x,t n dsdt t u x,t dxdt
Evoluion Equaions For 0, fixed, le U U0, where U denoes a bounded open se in R n.suppose ha U is filled wih a maerial in which a conaminan is being ranspored by various means including diffusion and convecion.
More informationTime series Decomposition method
Time series Decomposiion mehod A ime series is described using a mulifacor model such as = f (rend, cyclical, seasonal, error) = f (T, C, S, e) Long- Iner-mediaed Seasonal Irregular erm erm effec, effec,
More informationGMM - Generalized Method of Moments
GMM - Generalized Mehod of Momens Conens GMM esimaion, shor inroducion 2 GMM inuiion: Maching momens 2 3 General overview of GMM esimaion. 3 3. Weighing marix...........................................
More informationTwo Coupled Oscillators / Normal Modes
Lecure 3 Phys 3750 Two Coupled Oscillaors / Normal Modes Overview and Moivaion: Today we ake a small, bu significan, sep owards wave moion. We will no ye observe waves, bu his sep is imporan in is own
More information23.2. Representing Periodic Functions by Fourier Series. Introduction. Prerequisites. Learning Outcomes
Represening Periodic Funcions by Fourier Series 3. Inroducion In his Secion we show how a periodic funcion can be expressed as a series of sines and cosines. We begin by obaining some sandard inegrals
More informationBlock Diagram of a DCS in 411
Informaion source Forma A/D From oher sources Pulse modu. Muliplex Bandpass modu. X M h: channel impulse response m i g i s i Digial inpu Digial oupu iming and synchronizaion Digial baseband/ bandpass
More informationWEEK-3 Recitation PHYS 131. of the projectile s velocity remains constant throughout the motion, since the acceleration a x
WEEK-3 Reciaion PHYS 131 Ch. 3: FOC 1, 3, 4, 6, 14. Problems 9, 37, 41 & 71 and Ch. 4: FOC 1, 3, 5, 8. Problems 3, 5 & 16. Feb 8, 018 Ch. 3: FOC 1, 3, 4, 6, 14. 1. (a) The horizonal componen of he projecile
More informationDeep Convolutional Recurrent Network for Segmentation-free Offline Handwritten Japanese Text Recognition
2017 14h IAPR Inernaional Conference on Documen Analysis and Recogniion Deep Convoluional Recurren Nework for Segmenaion-free Offline Handwrien Japanese Tex Recogniion Nam-Tuan Ly Dep. of Compuer Science
More informationLecture 10 - Model Identification
Lecure - odel Idenificaion Wha is ssem idenificaion? Direc impulse response idenificaion Linear regression Regularizaion Parameric model ID nonlinear LS Conrol Engineering - Wha is Ssem Idenificaion? Experimen
More informationψ(t) = V x (0)V x (t)
.93 Home Work Se No. (Professor Sow-Hsin Chen Spring Term 5. Due March 7, 5. This problem concerns calculaions of analyical expressions for he self-inermediae scaering funcion (ISF of he es paricle in
More informationR t. C t P t. + u t. C t = αp t + βr t + v t. + β + w t
Exercise 7 C P = α + β R P + u C = αp + βr + v (a) (b) C R = α P R + β + w (c) Assumpions abou he disurbances u, v, w : Classical assumions on he disurbance of one of he equaions, eg. on (b): E(v v s P,
More informationAn EM based training algorithm for recurrent neural networks
An EM based raining algorihm for recurren neural neworks Jan Unkelbach, Sun Yi, and Jürgen Schmidhuber IDSIA,Galleria 2, 6928 Manno, Swizerland {jan.unkelbach,yi,juergen}@idsia.ch hp://www.idsia.ch Absrac.
More informationA Forward-Backward Splitting Method with Component-wise Lazy Evaluation for Online Structured Convex Optimization
A Forward-Backward Spliing Mehod wih Componen-wise Lazy Evaluaion for Online Srucured Convex Opimizaion Yukihiro Togari and Nobuo Yamashia March 28, 2016 Absrac: We consider large-scale opimizaion problems
More informationACE 562 Fall Lecture 8: The Simple Linear Regression Model: R 2, Reporting the Results and Prediction. by Professor Scott H.
ACE 56 Fall 5 Lecure 8: The Simple Linear Regression Model: R, Reporing he Resuls and Predicion by Professor Sco H. Irwin Required Readings: Griffihs, Hill and Judge. "Explaining Variaion in he Dependen
More informationShortcut Sequence Tagging
Shorcu Sequence Tagging Huijia Wu 1,3, Jiajun Zhang 1,2, and Chengqing Zong 1,2,3 1 Naional Laboraory of Paern Recogniion, Insiue of Auomaion, CAS 2 CAS Cener for Excellence in Brain Science and Inelligence
More informationTime Series Forecasting using CCA and Kohonen Maps - Application to Electricity Consumption
ESANN'2000 proceedings - European Symposium on Arificial Neural Neworks Bruges (Belgium), 26-28 April 2000, D-Faco public., ISBN 2-930307-00-5, pp. 329-334. Time Series Forecasing using CCA and Kohonen
More information