Appendix to Online Clustering with Experts


A Appendix to Online Clustering with Experts

Further discussion of experiments. Here we further discuss experimental results reported in the paper. Interestingly, we observe that OCE (and in particular Learn-$\alpha$) tracks the best expert much more effectively on all the real data sets than on the 5 Gaussian experiment. This is a simulated data set from a mixture of Gaussians that is fixed, and where $k$ is known to be 5. While OCE makes a favorable showing on the 3-experts experiment, in the 6-expert experiment some experts incur loss that is orders of magnitude lower than the remaining ones. We drilled down on the performance of the OCE algorithms with low $\alpha$ values, including Static-Expert, as well as Learn-$\alpha$, and observed that the means were simply hurt by the algorithms' uniform priors over experts. That is, in a few early iterations, OCE incurred costs from clusterings giving nontrivial weights to all the experts' predictions, so in those iterations costs could be orders of magnitude higher than those of the better experts. Moreover, as mentioned above, our regret bounds instead upper bound the loss.

Additional experimental details. The experts are 3 variants of the k-means# algorithm [3]; the variants differ in the number of centers they output, each outputting a different constant multiple of $k \log k$ centers (up to $3 k \log k$). Although OCE can start training from the first observation, via our analysis treating smaller window sizes (encountered at the beginning and the end of the sequence), in the experiments we used the first batch of points as input to all the clustering algorithms, and started training the OCE variants after that. The parameter $R$ was not tuned but was set to one fixed value for all data sets except Intrusion and Spambase, for which a different fixed value was used. Additional experimental results are provided in Section C.
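As a concrete illustration of the per-step quantities used in these experiments, the following sketch (my code, not the authors' implementation; function names and the toy scaling are mine) computes an expert's loss and the combined OCE loss in the form used by Definitions 3 and 4: each expert contributes its nearest center to the current point, and the algorithm's loss is the scaled squared norm of the weighted combination.

```python
import numpy as np

def expert_loss(x, centers, R):
    # Loss of one expert at time t: squared distance from the current
    # point x to the expert's nearest center, scaled by 1/(4 R^2).
    d2 = min(float(np.sum((x - c) ** 2)) for c in centers)
    return d2 / (4.0 * R ** 2)

def oce_loss(x, expert_centers, p, R):
    # Loss of the combined clustering: each expert contributes its
    # nearest center, weighted by the current distribution p.
    nearest = [min(cs, key=lambda c: float(np.sum((x - c) ** 2)))
               for cs in expert_centers]
    combined = sum(p_i * (x - c) for p_i, c in zip(p, nearest))
    return float(np.sum(combined ** 2)) / (4.0 * R ** 2)
```

By convexity of the squared norm, the combined loss never exceeds the weighted average of the expert losses, which is the direction exploited in the proofs of Section B.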

Anna Choromanska, Claire Monteleoni

B Additional proofs

First we provide some lemmas that will be used in subsequent proofs. These follow the approach in [6]. We use the short-hand notation $L(i,t)$ for $L(x_t, c_t^i)$, which is valid since our loss is symmetric with respect to its arguments.

Lemma 6.
$$-\sum_{t=1}^{T}\log\sum_{i=1}^{n} p_t(i)\,e^{-L(i,t)} = -\log\Big[\sum_{i_1,\dots,i_T} p_1(i_1)\,e^{-L(i_1,1)}\prod_{t=2}^{T} e^{-L(i_t,t)}\,P(i_t\mid i_{t-1})\Big]$$

Proof. First note that, following [5], we design the HMM such that we equate our loss function $L(i,t)$ with the negative log-likelihood of the observation given that expert $i$ is the current value of the hidden variable. In the unsupervised setting, the observation is $x_t$. Thus $L(i,t) = -\log P(x_t\mid a_i, x_1,\dots,x_{t-1})$. Therefore, we can expand the left hand side of the claim as follows.
$$-\sum_{t=1}^{T}\log\sum_{i=1}^{n} p_t(i)\,e^{-L(i,t)} = -\sum_{t=1}^{T}\log\sum_{i=1}^{n} p_t(i)\,P(x_t\mid a_i, x_1,\dots,x_{t-1}) = -\sum_{t=1}^{T}\log P(x_t\mid x_1,\dots,x_{t-1})$$
$$= -\log\Big[p(x_1)\prod_{t=2}^{T} P(x_t\mid x_1,\dots,x_{t-1})\Big] = -\log P(x_1,\dots,x_T) = -\log\Big[\sum_{i_1,\dots,i_T} P(x_1,\dots,x_T\mid i_1,\dots,i_T)\,P(i_1,\dots,i_T)\Big]$$
$$= -\log\Big[\sum_{i_1,\dots,i_T} p_1(i_1)\,P(x_1\mid i_1)\prod_{t=2}^{T} P(x_t\mid i_t, x_1,\dots,x_{t-1})\,P(i_t\mid i_{t-1})\Big] = -\log\Big[\sum_{i_1,\dots,i_T} p_1(i_1)\,e^{-L(i_1,1)}\prod_{t=2}^{T} e^{-L(i_t,t)}\,P(i_t\mid i_{t-1})\Big]$$

Lemma 7. $\frac{1}{n}\sum_{i=1}^{n} D(\theta_i^* \| \theta_i) = D(\alpha^* \| \alpha)$ when $\theta_{ij} = 1-\alpha$ for $i = j$ and $\theta_{ij} = \frac{\alpha}{n-1}$ for $i \neq j$, $\alpha \in [0,1]$, $\sum_{j=1}^{n}\theta_{ij} = 1$.

Proof.
$$\frac{1}{n}\sum_{i=1}^{n} D(\theta_i^*\|\theta_i) = \frac{1}{n}\sum_{i=1}^{n}\sum_{j=1}^{n}\theta^*_{ij}\log\frac{\theta^*_{ij}}{\theta_{ij}} = \frac{1}{n}\sum_{i=1}^{n}\Big[(1-\alpha^*)\log\frac{1-\alpha^*}{1-\alpha} + \sum_{j\neq i}\frac{\alpha^*}{n-1}\log\frac{\alpha^*/(n-1)}{\alpha/(n-1)}\Big]$$
$$= \frac{1}{n}\sum_{i=1}^{n}\Big[(1-\alpha^*)\log\frac{1-\alpha^*}{1-\alpha} + \alpha^*\log\frac{\alpha^*}{\alpha}\Big] = (1-\alpha^*)\log\frac{1-\alpha^*}{1-\alpha} + \alpha^*\log\frac{\alpha^*}{\alpha} = D(\alpha^*\|\alpha)$$

B.1 Proof of Theorem 1

Proof. Using Definitions 3 and 4, we can express the loss of the algorithm on a point $x_t$ as
$$L(x_t, \mathrm{clust}(t)) = \frac{\big\|\sum_{i=1}^{n} p_t(i)\,(x_t - c_t^i)\big\|^2}{4R^2}$$

Then the following chain of inequalities are equivalent to each other:
$$L(x_t,\mathrm{clust}(t)) \le -\log\sum_{i=1}^{n} p_t(i)\,e^{-L(x_t,c_t^i)}$$
$$\frac{\big\|\sum_{i} p_t(i)(x_t-c_t^i)\big\|^2}{4R^2} \le -\log\sum_{i} p_t(i)\,e^{-\frac{\|x_t-c_t^i\|^2}{4R^2}}$$
$$\sum_{i} p_t(i)\,e^{-\frac{\|x_t-c_t^i\|^2}{4R^2}} \le e^{-\frac{\|\sum_i p_t(i)(x_t-c_t^i)\|^2}{4R^2}} \quad (2)$$
Let $v_i = \frac{x_t - c_t^i}{2R}$. Since $\|x_t\|\le R$ and $\|c_t^i\|\le R$, each $v_i$ lies in the unit ball. Equation (2) is equivalent to
$$\sum_{i} p_t(i)\,e^{-\|v_i\|^2} \le e^{-\|\sum_i p_t(i)\,v_i\|^2}$$
This inequality holds by Jensen's theorem, since the function $f(v) = e^{-\|v\|^2}$ is concave on the domain containing the $v_i$.

B.2 Proof of Theorem 2

Proof. We can proceed by applying Theorem 1 to bound the cumulative loss of the algorithm and then use Lemma 6. As we proceed we follow the approach of the corresponding proof in [6]. In the Static-Expert setting the identity of the hidden expert never changes, so the sum over hidden-state sequences reduces to a sum over single experts:
$$L_T(\mathrm{alg}) = \sum_{t=1}^{T} L(x_t,\mathrm{clust}(t)) \le -\sum_{t=1}^{T}\log\sum_{i=1}^{n} p_t(i)\,e^{-L(i,t)} = -\log P(x_1,\dots,x_T)$$
$$= -\log\sum_{i=1}^{n} P(x_1,\dots,x_T\mid a_i)\,P(a_i) = -\log\sum_{i=1}^{n} p_1(i)\,P(x_1\mid a_i)\prod_{t=2}^{T}P(x_t\mid a_i,x_1,\dots,x_{t-1})$$
$$= -\log\sum_{i=1}^{n}\frac{1}{n}\prod_{t=1}^{T}e^{-L(i,t)} = -\log\frac{1}{n}\sum_{i=1}^{n}e^{-\sum_{t=1}^{T}L(i,t)} \le -\log\frac{1}{n}\,e^{-\sum_{t=1}^{T}L(a,t)} = L_T(a) + \log n$$
The last inequality holds for any $a$, so in particular for the best expert $a^*$.

B.3 Proof of Theorem 3

Proof. By applying first Theorem 1 and then Lemma 6, and following the proof of Theorem 3 (Main theorem) in [6], we obtain:
$$L_T(\mathrm{alg}) \le -\sum_{t=1}^{T}\log\sum_{i=1}^{n} p_t(i)\,e^{-L(i,t)}$$

$$= -\log\Big[\sum_{i_1,\dots,i_T} p_1(i_1)\prod_{t=2}^{T}P(i_t\mid i_{t-1})\prod_{t=1}^{T}e^{-L(i_t,t)}\Big] = -\log\Big[\sum_{i_1,\dots,i_T}P(i_1,\dots,i_T)\prod_{t=1}^{T}e^{-L(i_t,t)}\Big]$$
where $P(i_1,\dots,i_T) = p_1(i_1)\prod_{t=2}^{T}P(i_t\mid i_{t-1})$. Notice that also:
$$P(i_1,\dots,i_T) = p_1(i_1)\prod_{i=1}^{n}\prod_{j=1}^{n}(\theta_{ij})^{n_{ij}(i_1,\dots,i_T)}$$
where $n_{ij}(i_1,\dots,i_T)$ is the number of transitions from state $i$ to state $j$ in the sequence $i_1,\dots,i_T$, and $\sum_{j} n_{ij}(i_1,\dots,i_T)$ is the number of times the sequence was in state $i$, except at the final time-step. Thus $\sum_{j} n_{ij}(i_1,\dots,i_T) = (T-1)\,\hat\rho_i(i_1,\dots,i_T)$, where $\hat\rho_i(i_1,\dots,i_T)$ is the empirical estimate, from the sequence $i_1,\dots,i_T$, of the marginal probability of being in state $i$ at any time-step except the final one. It follows that:
$$n_{ij}(i_1,\dots,i_T) = (T-1)\,\hat\rho_i(i_1,\dots,i_T)\,\hat\rho_{j|i}(i_1,\dots,i_T)$$
where $\hat\rho_{j|i}(i_1,\dots,i_T) = n_{ij}(i_1,\dots,i_T)\big/\sum_{j} n_{ij}(i_1,\dots,i_T)$ is the empirical estimate of the probability of that particular state transition, on the basis of $i_1,\dots,i_T$. Thus:
$$P(i_1,\dots,i_T) = p_1(i_1)\prod_{i=1}^{n}\prod_{j=1}^{n}(\theta_{ij})^{(T-1)\hat\rho_i\hat\rho_{j|i}} = p_1(i_1)\,e^{(T-1)\sum_{i=1}^{n}\sum_{j=1}^{n}\hat\rho_i\hat\rho_{j|i}\log\theta_{ij}}$$
Thus:
$$L_T(\mathrm{alg}) \le -\log\Big[\sum_{i_1,\dots,i_T}P(i_1,\dots,i_T)\prod_{t=1}^{T}e^{-L(i_t,t)}\Big] = -\log\Big[\sum_{i_1,\dots,i_T}p_1(i_1)\,e^{(T-1)\sum_{i,j}\hat\rho_i\hat\rho_{j|i}\log\theta_{ij}}\prod_{t=1}^{T}e^{-L(i_t,t)}\Big]$$
Let $i_1^*,\dots,i_T^*$ correspond to the best segmentation of the sequence into $s$ segments, meaning the $s$-partitioning with minimal cumulative loss. Obviously then the hindsight-optimal (cumulative-loss minimizing) setting of the switching rate parameter $\alpha$, given $s$, is $\alpha^* = \frac{s}{T-1}$, and since we are in the fixed-share setting: $\theta_{ij} = 1-\alpha$ for $i = j$ and $\theta_{ij} = \frac{\alpha}{n-1}$ for $i \neq j$. We can continue as follows:
$$\le -\log\Big[p_1(i_1^*)\,e^{(T-1)\sum_{i,j}\hat\rho_i\hat\rho_{j|i}\log\theta_{ij}}\prod_{t=1}^{T}e^{-L(i_t^*,t)}\Big] = -\log p_1(i_1^*) - (T-1)\sum_{i,j}\hat\rho_i\hat\rho_{j|i}\log\theta_{ij} + \sum_{t=1}^{T}L(i_t^*,t)$$
$$= \sum_{t=1}^{T}L(i_t^*,t) + \log n - (T-1)\sum_{i,j:\,i=j}\hat\rho_i\hat\rho_{j|i}\log(1-\alpha) - (T-1)\sum_{i,j:\,i\neq j}\hat\rho_i\hat\rho_{j|i}\log\frac{\alpha}{n-1}$$
$$= \sum_{t=1}^{T}L(i_t^*,t) + \log n - (T-1)(1-\alpha^*)\log(1-\alpha) - (T-1)\,\alpha^*\log\frac{\alpha}{n-1}$$
$$= \sum_{t=1}^{T}L(i_t^*,t) + \log n - (T-1)(1-\alpha^*)\log(1-\alpha) - (T-1)\,\alpha^*\log\alpha + (T-1)\,\alpha^*\log(n-1)$$
$$= \sum_{t=1}^{T}L(i_t^*,t) + \log n + s\log(n-1) - (T-1)\big((1-\alpha^*)\log(1-\alpha) + \alpha^*\log\alpha\big)$$
$$= \sum_{t=1}^{T}L(i_t^*,t) + \log n + s\log(n-1) + (T-1)\big(H(\alpha^*) + D(\alpha^*\|\alpha)\big)$$
where the last step uses $(T-1)\alpha^* = s$ and the identity $-\big((1-\alpha^*)\log(1-\alpha) + \alpha^*\log\alpha\big) = H(\alpha^*) + D(\alpha^*\|\alpha)$.

B.4 Proof of Lemma 2

Proof. Given any $b$, it will suffice to provide a sequence such that any $b$-approximation algorithm cannot output $x_t$, the current point in the stream, as one of its centers. We will provide a counterexample in the setting where $k = 2$. Given any $b$, consider a sequence such that the stream before the current point consists entirely of $n_1$ data points located at some position $A$, and $n_2$ data points located at some position $B$, where $n_1 > n_2 > b$. Let $x_t$ be the current point in the stream, and let it be located at a position that lies on the line segment connecting $A$ and $B$, but closer to $A$. That is, let $\|A - x_t\| = a$ and $\|B - x_t\| = c$ such that $0 < a < \frac{n_2}{n_1+n_2}\,c$. This is reflected by the figure, which includes some additional points $D$ and $E$:

A --- D --- $x_t$ --- E --- B

where $A, D, x_t, E, B$ are lying on the same line. Let $D \in [A, x_t]$ and $E \in [x_t, B]$. For a particular location of $D$ and $E$, let $\|A-D\| = a_1$, $\|D-x_t\| = a_2$, $\|x_t-E\| = c_1$ and $\|E-B\| = c_2$, such that $a_1 + a_2 = a$ and $c_1 + c_2 = c$. We will first reason about the optimal $k$-means clustering of the stream including $x_t$. We will consider 2 cases:

1) Case 1: both optimal centers lie inside the interval $(A, x_t)$. Any such set cannot be optimal, since by mirror-reflecting the center closest to $x_t$ with respect to $x_t$ (such that it now lies in the interval $(x_t, B)$ and has the same distance to $x_t$ as before) we can decrease the cost. In particular, the cost of the points at $B$ will only decrease, leaving the cost of the points at $A$, plus the cost of $x_t$, unchanged.

2) Case 2: both optimal centers lie inside the interval $(x_t, B)$. In this case we can alternately mirror-reflect the center closest to $x_t$ with respect to $x_t$ and then with respect to $A$ (reducing the cost with each reflection) until it ends up in the interval $[A, x_t)$. The cost of the final solution is smaller than when both centers were lying in $(x_t, B)$, because while mirror-reflecting with respect to $x_t$, the cost of the points at $A$ can only decrease, leaving the cost of the points at $B$ and of the point $x_t$ unchanged, and while mirror-reflecting with respect to $A$, the cost of the point $x_t$ can only decrease, leaving the cost of the points at $A$ and at $B$ unchanged.

Thus the optimal set of centers (let us call them $C_1, C_2$) must be such that $C_1 \in [A, x_t]$ and $C_2 \in [x_t, B]$. The figure above reflects this situation and is sufficiently general; thus $D = C_1$ and $E = C_2$. We will now consider all possible locations of such centers $(C_1, C_2)$ and their costs:

1) $(A, x_t)$: cost $= n_2(c_1+c_2)^2$
2.1) $(A, E)$ where $E \in (x_t, B)$: cost $= (a_1+a_2)^2 + n_2 c_2^2$ when $c_1 \ge a$
2.2) $(A, E)$ where $E \in (x_t, B)$: cost $= c_1^2 + n_2 c_2^2$ when $c_1 < a$
3) $(A, B)$: cost $= (a_1+a_2)^2$
4) $(D, x_t)$ where $D \in (A, x_t)$: cost $= n_1 a_1^2 + n_2(c_1+c_2)^2$
5.1) $(D, E)$ where $D \in (A, x_t)$ and $E \in (x_t, B)$: cost $= n_1 a_1^2 + a_2^2 + n_2 c_2^2$ when $a_2 \le c_1$
5.2) $(D, E)$ where $D \in (A, x_t)$ and $E \in (x_t, B)$: cost $= n_1 a_1^2 + c_1^2 + n_2 c_2^2$ when $a_2 > c_1$
6) $(D, B)$ where $D \in (A, x_t)$: cost $= n_1 a_1^2 + a_2^2$
7) $(x_t, E)$ where $E \in (x_t, B)$: cost $= n_1(a_1+a_2)^2 + n_2 c_2^2$
8) $(x_t, B)$: cost $= n_1(a_1+a_2)^2$

Notice:
cost(1) > cost(2.1), thus the optimal configuration of centers cannot be as in case 1)
cost(7) > cost(8), thus the optimal configuration of centers cannot be as in case 7)
cost(4) > cost(5.1), thus the optimal configuration of centers cannot be as in case 4)
cost(8) > cost(6), thus the optimal configuration of centers cannot be as in case 8)
cost(5.2) > cost(2.2), thus the optimal configuration of centers cannot be as in case 5.2)
cost(5.1) > cost(6), thus the optimal configuration of centers cannot be as in case 5.1)
cost(2.1) > cost(3), thus the optimal configuration of centers cannot be as in case 2.1)

Consider case 2.2): cost $= c_1^2 + n_2 c_2^2 = c_1^2 + n_2(c-c_1)^2$ with $c_1 \in (0, a)$. Keeping in mind that $0 < a < \frac{n_2}{n_1+n_2}\,c$, it is easy to show that cost $> a^2 + n_2(c-a)^2 > n_1 a^2 > n_2 a^2 > a^2 = \mathrm{cost(case\ 3)}$. Thus the optimal configuration of centers cannot be as in case 2.2). Therefore, the only cases we are left to consider are cases 3 and 6. In both cases, one of the optimal centers lies at $B$. Let this center be $C_2$. Since $a < c$, the remaining points (those at $A$, and $x_t$) are assigned to the center $C_1$, whose location can be computed as follows (please see the figure below for notational simplicity):

A --- $C_1$ --- $x_t$ ⋯ B $= C_2$

Let $\|A - C_1\| = \theta$ and $\|A - x_t\| = a$ (as defined above).
Since we proved that $C_2$ must be fixed at $B$, the points at $B$ will contribute $0$ to the objective, and we can solve for $C_1$ by minimizing the cost from the points at $A$, plus the cost from $x_t$: $\min_\theta \{n_1\theta^2 + (a-\theta)^2\}$. The solution is $\theta = \frac{a}{n_1+1}$. Thus the total optimal cost is:
$$OPT = n_1\theta^2 + (a-\theta)^2 = \frac{n_1 a^2}{(n_1+1)^2} + \frac{n_1^2 a^2}{(n_1+1)^2} = \frac{n_1 a^2}{n_1+1}$$
We will now consider any $2$-clustering of the set of points at $A$, $x_t$, and $B$ when one cluster center is $x_t$. We can lower bound the $k$-means cost of any two-center set that contains $x_t$ as follows: $\mathrm{cost}(\{x_t, \hat c\}) \ge n_1 a^2$ for any $\hat c$; the minimal cost is achieved when the other center is located at $B$ (when one of the centers is located at $x_t$, the location of the other center that gives the smallest possible $k$-means cost can be either $A$ (case 1), $D$ (case 4), $E$ (case 7) or $B$ (case 8), where case 8 has the smallest cost among them). Violating the $b$-approximation assumption occurs when $\mathrm{cost}(\{x_t,\hat c\}) > b\cdot OPT$. Given the above, it would suffice to show $n_1 a^2 > b\cdot OPT$. That is:
$$n_1 a^2 > b\,\frac{n_1 a^2}{n_1+1} \iff b < n_1 + 1$$
It is easy to see this holds, as we chose $n_1 > b$ in the beginning. Therefore the $b$-approximation assumption is violated. $\square$

B.5 Proof of Lemma 4

Proof. For ease of notation, we denote by $\Phi^a_{\langle t-W, t\rangle}$ the $k$-means cost of algorithm $a$ on the data seen in the window $\langle t-W, t\rangle$ (omitting the argument, which is the set of centers output by algorithm $a$ at time $t$). Since $a$ is $b$-approximate, then:
$$\forall\, t\ge W:\quad \Phi^a_{\langle t-W,\,t\rangle} \le b\cdot OPT_{\langle t-W,\,t\rangle} \quad (3)$$
For any $W$ such that $W < T$, we can decompose $T$ such that $T = mW + r$ where $m \in \mathbb{N}$ and $r < W$. Notice then that

for any $j \in \{1,\dots,W\}$ the following chain of inequalities holds as a direct consequence of Equation (3):
$$\Phi^a_{\langle 1,\,W+j\rangle} + \Phi^a_{\langle W+j,\,2W+j\rangle} + \Phi^a_{\langle 2W+j,\,3W+j\rangle} + \dots + \Phi^a_{\langle (m-1)W+j,\,mW+j\rangle} + \Phi^a_{\langle mW+j,\,T\rangle}$$
$$\le b\,OPT_{\langle 1,\,W+j\rangle} + b\,OPT_{\langle W+j,\,2W+j\rangle} + \dots + b\,OPT_{\langle mW+j,\,T\rangle} \le b\,OPT_{\langle 1,\,T\rangle} \quad (4)$$
where the last inequality is a direct consequence of Lemma 3. Notice that a different value of $j$ refers to a different partitioning of the time span $\langle 1, T\rangle$. Figure 2 illustrates an example with $W = 3$.

Figure 2: Different partitionings of the time span $\langle 1, T\rangle$ into time windows, $W = 3$, with respect to different values of $j$.

Let $W_t$ refer to the data chunk seen in the time span $\langle \max(1, t-W+1),\, t\rangle$. We finish by showing that
$$\sum_{t}\Phi^a_{\langle t-W,\,t\rangle} \le \sum_{j=1}^{W}\Big\{\Phi^a_{\langle 1,\,W+j\rangle} + \Phi^a_{\langle W+j,\,2W+j\rangle} + \dots + \Phi^a_{\langle (m-1)W+j,\,mW+j\rangle} + \Phi^a_{\langle mW+j,\,T\rangle}\Big\} \le b\,W\,OPT_{\langle 1,\,T\rangle} \quad (5)$$
The left hand side of the first inequality in (5) sums the losses over only a subset of all the windows that are induced by partitioning the time span using all possible values of $j$. The final inequality follows by applying (4) for each value of $j$. To illustrate some of the ideas used in this proof, Figures 2 and 3 provide schematics. Figure 3 shows the windows over which the loss is computed on the left hand side of inequality (5), which is a subset of the set of all windows induced by all possible partitionings of the time span using all possible values of $j$, shown in Figure 2.
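The Lemma 3 step used in the last inequality of (4), namely that the per-window optimal costs of a disjoint partition sum to at most the optimal cost of the whole span, can be checked in a toy setting. The sketch below (my code, not the authors') uses $k = 1$, where the optimal cost has a closed form: the squared deviations about the window mean.

```python
import numpy as np

def opt_cost_k1(x):
    # Optimal 1-means (k = 1) cost: squared deviations about the mean.
    x = np.asarray(x, dtype=float)
    return float(np.sum((x - x.mean()) ** 2))

def partition_cost_k1(x, W, j):
    # Sum of per-window optimal costs for the partition with offset j:
    # windows (0, W+j], (W+j, 2W+j], ..., as in inequality (4).
    x = np.asarray(x, dtype=float)
    bounds = [0, W + j]
    while bounds[-1] < len(x):
        bounds.append(min(bounds[-1] + W, len(x)))
    return sum(opt_cost_k1(x[a:b]) for a, b in zip(bounds, bounds[1:]))
```

Restricting the whole-span optimal center to each window can only increase each window's cost, so the per-window optima sum to at most the whole-span optimum, for every offset $j$.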

Figure 3: Illustration of the time spans over which the loss is being computed in Equation (5), $W = 3$. Colors red, blue and black correspond to different partitionings, with respect to $j$, of the time span illustrated in Figure 2.

B.6 Proof of Lemma 5

Proof. The loss of expert $i$ at time $t$ is defined in Definition 3 as the scaled component of the $k$-means cost of algorithm $a_i$'s clustering at time $t$. That is,
$$\Phi_t(C_t^i) = \sum_{x\in W_t}\min_{c\in C_t^i}\|x-c\|^2 = \min_{c\in C_t^i}\|x_t-c\|^2 + \sum_{x\in W_t\setminus x_t}\min_{c\in C_t^i}\|x-c\|^2 = 4R^2\,L_t(a_i) + \sum_{x\in W_t\setminus x_t}\min_{c\in C_t^i}\|x-c\|^2$$
Therefore, since all terms in the sum are positive, $4R^2\,L_t(a_i) \le \Phi_t(C_t^i)$, and $L_t(a_i) \le \frac{\Phi^{a_i}_{\langle t-W,t\rangle}}{4R^2}$, where in the second inequality we substitute in our simplifying notation: $C_t^i$ is the set of clusters generated by algorithm $a_i$ at time $t$, and $\Phi^{a_i}_{\langle t-W,t\rangle}$ is algorithm $a_i$'s $k$-means cost on $W_t$. Now, summing over iterations, and applying Lemma 4, we obtain
$$\sum_{t} L_t(a_i) \le \frac{b\,W}{4R^2}\,OPT_{\langle 1,T\rangle}$$

B.7 Proof of Theorem 7

Proof. The theorem directly follows from Theorem 3, Lemma 4 and Lemma 5. Notice that both Lemma 4 and Lemma 5 hold in a more general setting in which, in each time window, the identity of the expert (the $b$-approximate algorithm) may change in an arbitrary way. However, we did not provide the generalized proofs of those lemmas since it would only complicate the notation.

B.8 Proof of Theorem 4, the Corollary, and Theorem 8

Lemma 8.
$$L^{\log}_T(\alpha) = L^{\log}_T(p_1,\theta) = -\log\Big[\sum_{i_1,\dots,i_T} p_1(i_1)\,e^{-L(i_1,1)}\prod_{t=2}^{T} e^{-L(i_t,t)}\,P(i_t\mid i_{t-1})\Big]$$
Proof. It follows directly from Lemma 6.

Lemma 9.
$$L^{\log}_T(\alpha) - L^{\log}_T(\alpha^*) = -\log\Big[\sum_{\vec s}Q(\vec s)\,\exp\Big((T-1)\sum_{i=1}^{n}\sum_{j=1}^{n}\hat\rho_i(\vec s)\,\hat\rho_{j|i}(\vec s)\log\frac{\theta_{ij}}{\theta^*_{ij}}\Big)\Big]$$
where $\vec s = i_1,\dots,i_T$ and $Q(\vec s)$ is the posterior probability over the choices of experts along the sequence, induced by the hindsight-optimal $\alpha^*$.

Proof. It follows directly from applying the proof of the corresponding lemma in [6] with a redefined $\ell(\vec s)$ such that $\ell(\vec s) = \prod_{t=1}^{T} e^{-L(i_t,t)}$.
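The Jensen step at the heart of the proof of Theorem 1 (Section B.1), $\sum_i p(i)\,e^{-\|v_i\|^2} \le e^{-\|\sum_i p(i)\,v_i\|^2}$, can be spot-checked numerically. In this sketch (my code; the sampling radius is my assumption), points are drawn inside the ball $\|v\|^2 \le 1/2$, on which $e^{-\|v\|^2}$ is certainly concave: its Hessian $e^{-\|v\|^2}(4vv^\top - 2I)$ is negative semidefinite exactly there.

```python
import numpy as np

def jensen_holds(trials=500, d=3, n=5, seed=1):
    # Check sum_i p_i exp(-||v_i||^2) <= exp(-||sum_i p_i v_i||^2) for
    # random points in the ball ||v||^2 <= 1/2 and random weights p.
    rng = np.random.default_rng(seed)
    for _ in range(trials):
        v = rng.normal(size=(n, d))
        v /= np.linalg.norm(v, axis=1, keepdims=True)
        v *= rng.uniform(0.0, 1.0 / np.sqrt(2.0), size=(n, 1))
        p = rng.dirichlet(np.ones(n))
        lhs = float(np.sum(p * np.exp(-np.sum(v * v, axis=1))))
        rhs = float(np.exp(-np.sum((p @ v) ** 2)))
        if lhs > rhs + 1e-12:
            return False
    return True
```

Since the ball is convex and $f$ is concave on it, Jensen's inequality gives $\mathbb{E}[f(v)] \le f(\mathbb{E}[v])$, which is exactly the checked inequality.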

Lemma 10.
$$L^{\log}_T(\alpha) - L^{\log}_T(\alpha^*) \le (T-1)\,\frac{1}{n}\sum_{i=1}^{n} D(\theta^*_i\|\theta_i) \le (T-1)\max_i D(\theta^*_i\|\theta_i)$$
Proof. It holds by Theorem 3 in [6].

Lemma 11.
$$L^{\log}_T(\alpha) \le L^{\log}_T(\alpha^*) + (T-1)\,D(\alpha^*\|\alpha)$$
Proof. It holds by Lemma 7 and Lemma 10.

Lemma 12.
$$L_T(\alpha) \le L^{\log}_T(\alpha^*) + (T-1)\,D(\alpha^*\|\alpha)$$
Proof. It holds by Lemma 11, since $L_T(\alpha) \le L^{\log}_T(\alpha)$ by Theorem 1.

Lemma 13.
$$L_T(\alpha) \le \frac{n\,b\,W}{4R^2}\,OPT_{\langle 1,T\rangle} + (T-1)\,D(\alpha^*\|\alpha)$$
Proof.
$$L^{\log}_T(\alpha) = -\sum_{t=1}^{T}\log\sum_{i=1}^{n} p_t(i)\,e^{-L(i,t)} = -\sum_{t=1}^{T}\log\sum_{i=1}^{n} p_t(i)\,e^{-\left\|\frac{x_t-c_t^i}{2R}\right\|^2}$$
Define, as previously, $v_i = \frac{x_t-c_t^i}{2R}$. By the argument in the proof of Lemma 5, $\|v_i\|^2 \le \frac{\eta_t^i}{4R^2}$, where $\eta_t^i$ is the $k$-means cost of the clustering of algorithm $a_i$ (the $i$-th expert) at time $t$. Thus we can continue as follows (we omit the conditioning on $\alpha$, since the chain holds for any $p_t$):
$$-\sum_{t}\log\sum_{i} p_t(i)\,e^{-\|v_i\|^2} \le -\sum_{t}\log\sum_{i} p_t(i)\,e^{-\frac{\eta_t^i}{4R^2}} \le -\sum_{t}\log e^{-\frac{\max_i\eta_t^i}{4R^2}} = \frac{1}{4R^2}\sum_{t=1}^{T}\max_i\eta_t^i \le \frac{1}{4R^2}\sum_{i=1}^{n}\sum_{t=1}^{T}\eta_t^i \le \frac{n\,b\,W}{4R^2}\,OPT_{\langle 1,T\rangle}$$
The last inequality follows from Lemma 4, as in the proof of Theorem 7. Thus finally:
$$L_T(\alpha) \le L^{\log}_T(\alpha^*) + (T-1)\,D(\alpha^*\|\alpha) \le \frac{n\,b\,W}{4R^2}\,OPT_{\langle 1,T\rangle} + (T-1)\,D(\alpha^*\|\alpha)$$
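Lemma 7 (Section B), which collapses the row-averaged KL divergence between fixed-share transition matrices to the binary divergence between the switching rates, can be verified numerically with a short script (mine, not the authors'):

```python
import numpy as np

def fixed_share_matrix(alpha, n):
    # theta_ij = 1 - alpha on the diagonal, alpha/(n-1) off the diagonal.
    P = np.full((n, n), alpha / (n - 1))
    np.fill_diagonal(P, 1.0 - alpha)
    return P

def mean_row_kl(P, Q):
    # (1/n) sum_i D(P_i || Q_i): KL divergence between corresponding rows.
    return float(np.mean(np.sum(P * np.log(P / Q), axis=1)))

def binary_kl(a, b):
    # D(a || b) for Bernoulli parameters a and b.
    return float(a * np.log(a / b) + (1 - a) * np.log((1 - a) / (1 - b)))
```

The factors $1/(n-1)$ cancel inside the log ratio of the off-diagonal terms, which is why the result does not depend on $n$.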

B.9 Proof of Theorem 5

Proof. The log-loss per time step of the top-level algorithm (for ease of notation we skip the index log), which updates its distribution over $\alpha$-experts (Fixed-Share($\alpha_j$) algorithms), is:
$$L^{top}(p^{top}_t, t) = -\log\sum_{j=1}^{m} p^{top}_t(j)\,e^{-L^{\log}(j,t)}\qquad\text{where}\qquad L^{\log}(j,t) = -\log\sum_{i=1}^{n} p_{j,t}(i)\,e^{-L(i,t)}$$
Thus:
$$L^{top}(p^{top}_t, t) = -\log\sum_{j=1}^{m} p^{top}_t(j)\sum_{i=1}^{n} p_{j,t}(i)\,e^{-L(i,t)}$$
The update is done via the Static-Expert algorithm. When running Learn-$\alpha(m)$, $m$ possible values $\alpha_j$ are tested. Following [5, 6], the probabilistic prediction of the algorithm is defined as:
$$\sum_{j=1}^{m} p^{top}_t(j)\sum_{i=1}^{n} p_{j,t}(i)\,e^{-L(i,t)}$$
Let us first show that this loss-prediction pair is $(1,1)$-realizable; thus it satisfies:
$$L^{top}(p^{top}_t,t) \le -\log\sum_{j=1}^{m} p^{top}_t(j)\,e^{-L^{\log}(j,t)}$$
Thus we have to prove that the following holds:
$$-\log\sum_{j=1}^{m} p^{top}_t(j)\sum_{i=1}^{n} p_{j,t}(i)\,e^{-L(i,t)} \le -\log\sum_{j=1}^{m} p^{top}_t(j)\,e^{-L^{\log}(j,t)}$$
which is equivalent to
$$-\log\sum_{j=1}^{m} p^{top}_t(j)\sum_{i=1}^{n} p_{j,t}(i)\,e^{-L(i,t)} \le -\log\sum_{j=1}^{m} p^{top}_t(j)\,e^{\log\sum_{i=1}^{n} p_{j,t}(i)\,e^{-L(i,t)}} = -\log\sum_{j=1}^{m} p^{top}_t(j)\sum_{i=1}^{n} p_{j,t}(i)\,e^{-L(i,t)}$$
and the last inequality holds (with equality). Now, since our log-loss-prediction pair is $(1,1)$-realizable, by the corresponding lemma in [9] we have:
$$L^{top}_T \le \min_{\{\alpha_j\}} L^{\log}_T(\alpha_j) + \log m$$

where $\{\alpha_j\}$ is the discretization of the $\alpha$ parameter that the Learn-$\alpha$ algorithm takes as input. Now, by applying the Corollary we get:
$$L^{\log}_T(\mathrm{alg}) \le L^{\log}_T(\alpha^*) + (T-1)\min_{\{\alpha_j\}} D(\alpha^*\|\alpha_j) + \log m$$

C Additional Experimental Details

To provide qualitative results on OCE's performance, in Figures 4-5 we show clustering analogs to learning curves from our experiments in the predictive setting. These curves generated the statistics reported in the table. We plot the batch $k$-means cost of each expert, and of the OCE algorithms, on all the data seen so far, versus $t$. While Fixed-Share algorithms with high values of $\alpha$ suffer large oscillations in cost, Learn-$\alpha$'s performance tracks, and often surpasses, that of the Fixed-Share algorithms. We also demonstrate the evolution of the weights maintained by the Fixed-Share and Learn-$\alpha$ OCE algorithms. In Figures 6-7 we show the evolution over time of the weights maintained by the OCE Fixed-Share algorithm over the experts (clustering algorithms). For smaller values of $\alpha$ we observe an inversion in the weight ordering between experts 4 and 5 at around iteration 48 for this particular experiment. For values of $\alpha$ closer to 1/2 and 1, there is a higher amount of shifting of weights among the experts. In Figure 8 we show the evolution over time of the weights maintained by the OCE Learn-$\alpha$ algorithm over the $\alpha$-experts (Fixed-Share algorithms run with different settings of the $\alpha$ parameter). The experiment was performed with 45 $\alpha$-experts (Fixed-Share algorithms with different values of the $\alpha$ parameter). Lower values of $\alpha$ received higher weights. One value of $\alpha$ (the lowest) receives an increasing share of the weight, which is consistent with the fact that the Static-Expert algorithm is used to update the weights over the $\alpha$-experts.
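The weight dynamics described here come from updates of the following form. This is a schematic reimplementation (mine, not the authors' code) of one Fixed-Share step over the clustering experts, and of one Static-Expert step of Learn-$\alpha$ over the $\alpha$-experts:

```python
import numpy as np

def fixed_share_update(p, losses, alpha):
    # One Fixed-Share step: exponential loss update, then probabilistic
    # sharing with switching rate alpha; p is the distribution over n experts.
    n = len(p)
    w = p * np.exp(-np.asarray(losses))
    shared = (1.0 - alpha) * w + alpha * (w.sum() - w) / (n - 1)
    return shared / shared.sum()

def learn_alpha_update(q, p_list, losses, alphas):
    # One Learn-alpha step: each alpha-expert is a Fixed-Share(alpha_j) run;
    # the top level updates q (Static-Expert) using each alpha-expert's
    # log-loss  -log sum_i p_j(i) exp(-L(i,t)).
    log_losses = np.array([-np.log(np.sum(p * np.exp(-np.asarray(losses))))
                           for p in p_list])
    q = q * np.exp(-log_losses)
    q /= q.sum()
    new_p = [fixed_share_update(p, losses, a) for p, a in zip(p_list, alphas)]
    return q, new_p
```

With a Static-Expert top level, weight can only concentrate, never be redistributed by switching, which matches the observation that the best single $\alpha$-expert accumulates an increasing share of the weight.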

(Plots omitted; legend: Expert 1, Expert 2, Expert 3, Static-Expert, Fixed-Share 1-4, Learn-$\alpha$.)

Figure 4: Clustering analogs to learning curves; $k$-means cost vs. $t$; top to bottom: Cloud, Spambase, Intrusion, Forest fires. Legend in upper left.

(Plots omitted.)

Figure 5: Clustering analogs to learning curves; $k$-means cost vs. $t$; top to bottom: Robot data, 5 Gaussians data, zooming in on the y-axis of 5 Gaussians. Legend in Figure 4.

(Plots omitted; legend: expert 1 through expert 6; panels show Fixed-Share on the Forest fires data for several low values of $\alpha$.)

Figure 6: Evolution of weights over the experts for Fixed-Share algorithms using different (low) values of the $\alpha$ parameter; Forest fires data, 6 experts. Legend in top left.

(Plots omitted.)

Figure 7: Evolution of weights over the experts for Fixed-Share algorithms using different values of the $\alpha$ parameter (top: close to 1/2; bottom: close to 1); Forest fires data, 6 experts. Legend in Figure 6.

(Plot omitted.)

Figure 8: Evolution of weights maintained by Learn-$\alpha$, over the $\alpha$-experts, in the 6-expert Forest fires experiment. The lowest values of $\alpha$ receive the highest weight.


More information

Epistemic Game Theory: Online Appendix

Epistemic Game Theory: Online Appendix Epsemc Game Theory: Onlne Appendx Edde Dekel Lucano Pomao Marcano Snscalch July 18, 2014 Prelmnares Fx a fne ype srucure T I, S, T, β I and a probably µ S T. Le T µ I, S, T µ, βµ I be a ype srucure ha

More information

Normal Random Variable and its discriminant functions

Normal Random Variable and its discriminant functions Noral Rando Varable and s dscrnan funcons Oulne Noral Rando Varable Properes Dscrnan funcons Why Noral Rando Varables? Analycally racable Works well when observaon coes for a corruped snle prooype 3 The

More information

Introduction ( Week 1-2) Course introduction A brief introduction to molecular biology A brief introduction to sequence comparison Part I: Algorithms

Introduction ( Week 1-2) Course introduction A brief introduction to molecular biology A brief introduction to sequence comparison Part I: Algorithms Course organzaon Inroducon Wee -2) Course nroducon A bref nroducon o molecular bology A bref nroducon o sequence comparson Par I: Algorhms for Sequence Analyss Wee 3-8) Chaper -3, Models and heores» Probably

More information

EEL 6266 Power System Operation and Control. Chapter 5 Unit Commitment

EEL 6266 Power System Operation and Control. Chapter 5 Unit Commitment EEL 6266 Power Sysem Operaon and Conrol Chaper 5 Un Commmen Dynamc programmng chef advanage over enumeraon schemes s he reducon n he dmensonaly of he problem n a src prory order scheme, here are only N

More information

Dynamic Team Decision Theory. EECS 558 Project Shrutivandana Sharma and David Shuman December 10, 2005

Dynamic Team Decision Theory. EECS 558 Project Shrutivandana Sharma and David Shuman December 10, 2005 Dynamc Team Decson Theory EECS 558 Proec Shruvandana Sharma and Davd Shuman December 0, 005 Oulne Inroducon o Team Decson Theory Decomposon of he Dynamc Team Decson Problem Equvalence of Sac and Dynamc

More information

Li An-Ping. Beijing , P.R.China

Li An-Ping. Beijing , P.R.China A New Type of Cpher: DICING_csb L An-Png Bejng 100085, P.R.Chna apl0001@sna.com Absrac: In hs paper, we wll propose a new ype of cpher named DICING_csb, whch s derved from our prevous sream cpher DICING.

More information

2/20/2013. EE 101 Midterm 2 Review

2/20/2013. EE 101 Midterm 2 Review //3 EE Mderm eew //3 Volage-mplfer Model The npu ressance s he equalen ressance see when lookng no he npu ermnals of he amplfer. o s he oupu ressance. I causes he oupu olage o decrease as he load ressance

More information

Second-Order Non-Stationary Online Learning for Regression

Second-Order Non-Stationary Online Learning for Regression Second-Order Non-Saonary Onlne Learnng for Regresson Nna Vas, Edward Moroshko, and Koby Crammer, Fellow, IEEE arxv:303040v cslg] Mar 03 Absrac he goal of a learner, n sandard onlne learnng, s o have he

More information

Robust and Accurate Cancer Classification with Gene Expression Profiling

Robust and Accurate Cancer Classification with Gene Expression Profiling Robus and Accurae Cancer Classfcaon wh Gene Expresson Proflng (Compuaonal ysems Bology, 2005) Auhor: Hafeng L, Keshu Zhang, ao Jang Oulne Background LDA (lnear dscrmnan analyss) and small sample sze problem

More information

An introduction to Support Vector Machine

An introduction to Support Vector Machine An nroducon o Suppor Vecor Machne 報告者 : 黃立德 References: Smon Haykn, "Neural Neworks: a comprehensve foundaon, second edon, 999, Chaper 2,6 Nello Chrsann, John Shawe-Tayer, An Inroducon o Suppor Vecor Machnes,

More information

On elements with index of the form 2 a 3 b in a parametric family of biquadratic elds

On elements with index of the form 2 a 3 b in a parametric family of biquadratic elds On elemens wh ndex of he form a 3 b n a paramerc famly of bquadrac elds Bora JadrevĆ Absrac In hs paper we gve some resuls abou prmve negral elemens p(c p n he famly of bcyclc bquadrac elds L c = Q ) c;

More information

The Finite Element Method for the Analysis of Non-Linear and Dynamic Systems

The Finite Element Method for the Analysis of Non-Linear and Dynamic Systems Swss Federal Insue of Page 1 The Fne Elemen Mehod for he Analyss of Non-Lnear and Dynamc Sysems Prof. Dr. Mchael Havbro Faber Dr. Nebojsa Mojslovc Swss Federal Insue of ETH Zurch, Swzerland Mehod of Fne

More information

Advanced Machine Learning & Perception

Advanced Machine Learning & Perception Advanced Machne Learnng & Percepon Insrucor: Tony Jebara SVM Feaure & Kernel Selecon SVM Eensons Feaure Selecon (Flerng and Wrappng) SVM Feaure Selecon SVM Kernel Selecon SVM Eensons Classfcaon Feaure/Kernel

More information

Online Appendix for. Strategic safety stocks in supply chains with evolving forecasts

Online Appendix for. Strategic safety stocks in supply chains with evolving forecasts Onlne Appendx for Sraegc safey socs n supply chans wh evolvng forecass Tor Schoenmeyr Sephen C. Graves Opsolar, Inc. 332 Hunwood Avenue Hayward, CA 94544 A. P. Sloan School of Managemen Massachuses Insue

More information

PHYS 705: Classical Mechanics. Canonical Transformation

PHYS 705: Classical Mechanics. Canonical Transformation PHYS 705: Classcal Mechancs Canoncal Transformaon Canoncal Varables and Hamlonan Formalsm As we have seen, n he Hamlonan Formulaon of Mechancs,, are ndeenden varables n hase sace on eual foong The Hamlon

More information

CHAPTER 5: MULTIVARIATE METHODS

CHAPTER 5: MULTIVARIATE METHODS CHAPER 5: MULIVARIAE MEHODS Mulvarae Daa 3 Mulple measuremens (sensors) npus/feaures/arbues: -varae N nsances/observaons/eamples Each row s an eample Each column represens a feaure X a b correspons o he

More information

ON THE WEAK LIMITS OF SMOOTH MAPS FOR THE DIRICHLET ENERGY BETWEEN MANIFOLDS

ON THE WEAK LIMITS OF SMOOTH MAPS FOR THE DIRICHLET ENERGY BETWEEN MANIFOLDS ON THE WEA LIMITS OF SMOOTH MAPS FOR THE DIRICHLET ENERGY BETWEEN MANIFOLDS FENGBO HANG Absrac. We denfy all he weak sequenal lms of smooh maps n W (M N). In parcular, hs mples a necessary su cen opologcal

More information

Pendulum Dynamics. = Ft tangential direction (2) radial direction (1)

Pendulum Dynamics. = Ft tangential direction (2) radial direction (1) Pendulum Dynams Consder a smple pendulum wh a massless arm of lengh L and a pon mass, m, a he end of he arm. Assumng ha he fron n he sysem s proporonal o he negave of he angenal veloy, Newon s seond law

More information

Econ107 Applied Econometrics Topic 5: Specification: Choosing Independent Variables (Studenmund, Chapter 6)

Econ107 Applied Econometrics Topic 5: Specification: Choosing Independent Variables (Studenmund, Chapter 6) Econ7 Appled Economercs Topc 5: Specfcaon: Choosng Independen Varables (Sudenmund, Chaper 6 Specfcaon errors ha we wll deal wh: wrong ndependen varable; wrong funconal form. Ths lecure deals wh wrong ndependen

More information

THEORETICAL AUTOCORRELATIONS. ) if often denoted by γ. Note that

THEORETICAL AUTOCORRELATIONS. ) if often denoted by γ. Note that THEORETICAL AUTOCORRELATIONS Cov( y, y ) E( y E( y))( y E( y)) ρ = = Var( y) E( y E( y)) =,, L ρ = and Cov( y, y ) s ofen denoed by whle Var( y ) f ofen denoed by γ. Noe ha γ = γ and ρ = ρ and because

More information

Computing Relevance, Similarity: The Vector Space Model

Computing Relevance, Similarity: The Vector Space Model Compung Relevance, Smlary: The Vecor Space Model Based on Larson and Hears s sldes a UC-Bereley hp://.sms.bereley.edu/courses/s0/f00/ aabase Managemen Sysems, R. Ramarshnan ocumen Vecors v ocumens are

More information

J i-1 i. J i i+1. Numerical integration of the diffusion equation (I) Finite difference method. Spatial Discretization. Internal nodes.

J i-1 i. J i i+1. Numerical integration of the diffusion equation (I) Finite difference method. Spatial Discretization. Internal nodes. umercal negraon of he dffuson equaon (I) Fne dfference mehod. Spaal screaon. Inernal nodes. R L V For hermal conducon le s dscree he spaal doman no small fne spans, =,,: Balance of parcles for an nernal

More information

Chapter 6: AC Circuits

Chapter 6: AC Circuits Chaper 6: AC Crcus Chaper 6: Oulne Phasors and he AC Seady Sae AC Crcus A sable, lnear crcu operang n he seady sae wh snusodal excaon (.e., snusodal seady sae. Complee response forced response naural response.

More information

Part II CONTINUOUS TIME STOCHASTIC PROCESSES

Part II CONTINUOUS TIME STOCHASTIC PROCESSES Par II CONTINUOUS TIME STOCHASTIC PROCESSES 4 Chaper 4 For an advanced analyss of he properes of he Wener process, see: Revus D and Yor M: Connuous marngales and Brownan Moon Karazas I and Shreve S E:

More information

Filtrage particulaire et suivi multi-pistes Carine Hue Jean-Pierre Le Cadre and Patrick Pérez

Filtrage particulaire et suivi multi-pistes Carine Hue Jean-Pierre Le Cadre and Patrick Pérez Chaînes de Markov cachées e flrage parculare 2-22 anver 2002 Flrage parculare e suv mul-pses Carne Hue Jean-Perre Le Cadre and Parck Pérez Conex Applcaons: Sgnal processng: arge rackng bearngs-onl rackng

More information

Online Supplement for Dynamic Multi-Technology. Production-Inventory Problem with Emissions Trading

Online Supplement for Dynamic Multi-Technology. Production-Inventory Problem with Emissions Trading Onlne Supplemen for Dynamc Mul-Technology Producon-Invenory Problem wh Emssons Tradng by We Zhang Zhongsheng Hua Yu Xa and Baofeng Huo Proof of Lemma For any ( qr ) Θ s easy o verfy ha he lnear programmng

More information

Midterm Exam. Thursday, April hour, 15 minutes

Midterm Exam. Thursday, April hour, 15 minutes Economcs of Grow, ECO560 San Francsco Sae Unvers Mcael Bar Sprng 04 Mderm Exam Tursda, prl 0 our, 5 mnues ame: Insrucons. Ts s closed boo, closed noes exam.. o calculaors of an nd are allowed. 3. Sow all

More information

2. SPATIALLY LAGGED DEPENDENT VARIABLES

2. SPATIALLY LAGGED DEPENDENT VARIABLES 2. SPATIALLY LAGGED DEPENDENT VARIABLES In hs chaper, we descrbe a sascal model ha ncorporaes spaal dependence explcly by addng a spaally lagged dependen varable y on he rgh-hand sde of he regresson equaon.

More information

(,,, ) (,,, ). In addition, there are three other consumers, -2, -1, and 0. Consumer -2 has the utility function

(,,, ) (,,, ). In addition, there are three other consumers, -2, -1, and 0. Consumer -2 has the utility function MACROECONOMIC THEORY T J KEHOE ECON 87 SPRING 5 PROBLEM SET # Conder an overlappng generaon economy le ha n queon 5 on problem e n whch conumer lve for perod The uly funcon of he conumer born n perod,

More information

Fitting a Conditional Linear Gaussian Distribution

Fitting a Conditional Linear Gaussian Distribution Fng a Condonal Lnear Gaussan Dsrbuon Kevn P. Murphy 28 Ocober 1998 Revsed 29 January 2003 1 Inroducon We consder he problem of fndng he maxmum lkelhood ML esmaes of he parameers of a condonal Gaussan varable

More information

Testing a new idea to solve the P = NP problem with mathematical induction

Testing a new idea to solve the P = NP problem with mathematical induction Tesng a new dea o solve he P = NP problem wh mahemacal nducon Bacground P and NP are wo classes (ses) of languages n Compuer Scence An open problem s wheher P = NP Ths paper ess a new dea o compare he

More information

Lecture 2 M/G/1 queues. M/G/1-queue

Lecture 2 M/G/1 queues. M/G/1-queue Lecure M/G/ queues M/G/-queue Posson arrval process Arbrary servce me dsrbuon Sngle server To deermne he sae of he sysem a me, we mus now The number of cusomers n he sysems N() Tme ha he cusomer currenly

More information

GENERATING CERTAIN QUINTIC IRREDUCIBLE POLYNOMIALS OVER FINITE FIELDS. Youngwoo Ahn and Kitae Kim

GENERATING CERTAIN QUINTIC IRREDUCIBLE POLYNOMIALS OVER FINITE FIELDS. Youngwoo Ahn and Kitae Kim Korean J. Mah. 19 (2011), No. 3, pp. 263 272 GENERATING CERTAIN QUINTIC IRREDUCIBLE POLYNOMIALS OVER FINITE FIELDS Youngwoo Ahn and Kae Km Absrac. In he paper [1], an explc correspondence beween ceran

More information

Machine Learning Linear Regression

Machine Learning Linear Regression Machne Learnng Lnear Regresson Lesson 3 Lnear Regresson Bascs of Regresson Leas Squares esmaon Polynomal Regresson Bass funcons Regresson model Regularzed Regresson Sascal Regresson Mamum Lkelhood (ML)

More information

On One Analytic Method of. Constructing Program Controls

On One Analytic Method of. Constructing Program Controls Appled Mahemacal Scences, Vol. 9, 05, no. 8, 409-407 HIKARI Ld, www.m-hkar.com hp://dx.do.org/0.988/ams.05.54349 On One Analyc Mehod of Consrucng Program Conrols A. N. Kvko, S. V. Chsyakov and Yu. E. Balyna

More information

Scattering at an Interface: Oblique Incidence

Scattering at an Interface: Oblique Incidence Course Insrucor Dr. Raymond C. Rumpf Offce: A 337 Phone: (915) 747 6958 E Mal: rcrumpf@uep.edu EE 4347 Appled Elecromagnecs Topc 3g Scaerng a an Inerface: Oblque Incdence Scaerng These Oblque noes may

More information

Including the ordinary differential of distance with time as velocity makes a system of ordinary differential equations.

Including the ordinary differential of distance with time as velocity makes a system of ordinary differential equations. Soluons o Ordnary Derenal Equaons An ordnary derenal equaon has only one ndependen varable. A sysem o ordnary derenal equaons consss o several derenal equaons each wh he same ndependen varable. An eample

More information

Density Matrix Description of NMR BCMB/CHEM 8190

Density Matrix Description of NMR BCMB/CHEM 8190 Densy Marx Descrpon of NMR BCMBCHEM 89 Operaors n Marx Noaon If we say wh one bass se, properes vary only because of changes n he coeffcens weghng each bass se funcon x = h< Ix > - hs s how we calculae

More information

[ ] 2. [ ]3 + (Δx i + Δx i 1 ) / 2. Δx i-1 Δx i Δx i+1. TPG4160 Reservoir Simulation 2018 Lecture note 3. page 1 of 5

[ ] 2. [ ]3 + (Δx i + Δx i 1 ) / 2. Δx i-1 Δx i Δx i+1. TPG4160 Reservoir Simulation 2018 Lecture note 3. page 1 of 5 TPG460 Reservor Smulaon 08 page of 5 DISCRETIZATIO OF THE FOW EQUATIOS As we already have seen, fne dfference appromaons of he paral dervaves appearng n he flow equaons may be obaned from Taylor seres

More information

Mechanics Physics 151

Mechanics Physics 151 Mechancs Physcs 5 Lecure 9 Hamlonan Equaons of Moon (Chaper 8) Wha We Dd Las Tme Consruced Hamlonan formalsm H ( q, p, ) = q p L( q, q, ) H p = q H q = p H = L Equvalen o Lagrangan formalsm Smpler, bu

More information

[Link to MIT-Lab 6P.1 goes here.] After completing the lab, fill in the following blanks: Numerical. Simulation s Calculations

[Link to MIT-Lab 6P.1 goes here.] After completing the lab, fill in the following blanks: Numerical. Simulation s Calculations Chaper 6: Ordnary Leas Squares Esmaon Procedure he Properes Chaper 6 Oulne Cln s Assgnmen: Assess he Effec of Sudyng on Quz Scores Revew o Regresson Model o Ordnary Leas Squares () Esmaon Procedure o he

More information

New M-Estimator Objective Function. in Simultaneous Equations Model. (A Comparative Study)

New M-Estimator Objective Function. in Simultaneous Equations Model. (A Comparative Study) Inernaonal Mahemacal Forum, Vol. 8, 3, no., 7 - HIKARI Ld, www.m-hkar.com hp://dx.do.org/.988/mf.3.3488 New M-Esmaor Objecve Funcon n Smulaneous Equaons Model (A Comparave Sudy) Ahmed H. Youssef Professor

More information

Relative controllability of nonlinear systems with delays in control

Relative controllability of nonlinear systems with delays in control Relave conrollably o nonlnear sysems wh delays n conrol Jerzy Klamka Insue o Conrol Engneerng, Slesan Techncal Unversy, 44- Glwce, Poland. phone/ax : 48 32 37227, {jklamka}@a.polsl.glwce.pl Keywor: Conrollably.

More information

Mechanics Physics 151

Mechanics Physics 151 Mechancs Physcs 5 Lecure 9 Hamlonan Equaons of Moon (Chaper 8) Wha We Dd Las Tme Consruced Hamlonan formalsm Hqp (,,) = qp Lqq (,,) H p = q H q = p H L = Equvalen o Lagrangan formalsm Smpler, bu wce as

More information

Hidden Markov Models Following a lecture by Andrew W. Moore Carnegie Mellon University

Hidden Markov Models Following a lecture by Andrew W. Moore Carnegie Mellon University Hdden Markov Models Followng a lecure by Andrew W. Moore Carnege Mellon Unversy www.cs.cmu.edu/~awm/uorals A Markov Sysem Has N saes, called s, s 2.. s N s 2 There are dscree meseps, 0,, s s 3 N 3 0 Hdden

More information

( ) [ ] MAP Decision Rule

( ) [ ] MAP Decision Rule Announcemens Bayes Decson Theory wh Normal Dsrbuons HW0 due oday HW o be assgned soon Proec descrpon posed Bomercs CSE 90 Lecure 4 CSE90, Sprng 04 CSE90, Sprng 04 Key Probables 4 ω class label X feaure

More information

Fall 2009 Social Sciences 7418 University of Wisconsin-Madison. Problem Set 2 Answers (4) (6) di = D (10)

Fall 2009 Social Sciences 7418 University of Wisconsin-Madison. Problem Set 2 Answers (4) (6) di = D (10) Publc Affars 974 Menze D. Chnn Fall 2009 Socal Scences 7418 Unversy of Wsconsn-Madson Problem Se 2 Answers Due n lecure on Thursday, November 12. " Box n" your answers o he algebrac quesons. 1. Consder

More information

An Optimal Control Approach to the Multi-agent Persistent Monitoring Problem

An Optimal Control Approach to the Multi-agent Persistent Monitoring Problem An Opmal Conrol Approach o he Mul-agen Perssen Monorng Problem Chrsos.G. Cassandras, Xuchao Ln and Xu Chu Dng Dvson of Sysems Engneerng and Cener for Informaon and Sysems Engneerng Boson Unversy, cgc@bu.edu,

More information

Chapter Lagrangian Interpolation

Chapter Lagrangian Interpolation Chaper 5.4 agrangan Inerpolaon Afer readng hs chaper you should be able o:. dere agrangan mehod of nerpolaon. sole problems usng agrangan mehod of nerpolaon and. use agrangan nerpolans o fnd deraes and

More information

Advanced Macroeconomics II: Exchange economy

Advanced Macroeconomics II: Exchange economy Advanced Macroeconomcs II: Exchange economy Krzyszof Makarsk 1 Smple deermnsc dynamc model. 1.1 Inroducon Inroducon Smple deermnsc dynamc model. Defnons of equlbrum: Arrow-Debreu Sequenal Recursve Equvalence

More information

ABSTRACT KEYWORDS. Bonus-malus systems, frequency component, severity component. 1. INTRODUCTION

ABSTRACT KEYWORDS. Bonus-malus systems, frequency component, severity component. 1. INTRODUCTION EERAIED BU-MAU YTEM ITH A FREQUECY AD A EVERITY CMET A IDIVIDUA BAI I AUTMBIE IURACE* BY RAHIM MAHMUDVAD AD HEI HAAI ABTRACT Frangos and Vronos (2001) proposed an opmal bonus-malus sysems wh a frequency

More information

Introduction to Boosting

Introduction to Boosting Inroducon o Boosng Cynha Rudn PACM, Prnceon Unversy Advsors Ingrd Daubeches and Rober Schapre Say you have a daabase of news arcles, +, +, -, -, +, +, -, -, +, +, -, -, +, +, -, + where arcles are labeled

More information

arxiv: v3 [cs.lg] 12 Jun 2018

arxiv: v3 [cs.lg] 12 Jun 2018 Small-loss bounds for onlne learnng wh paral nformaon Thodors Lykours Karhk Srdharan Éva Tardos arxv:1711.03639v3 [cs.lg] 12 Jun 2018 Absrac We consder he problem of adversaral (non-sochasc) onlne learnng

More information

This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore.

This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore. Ths documen s downloaded from DR-NTU, Nanyang Technologcal Unversy Lbrary, Sngapore. Tle A smplfed verb machng algorhm for word paron n vsual speech processng( Acceped verson ) Auhor(s) Foo, Say We; Yong,

More information

Performance Analysis for a Network having Standby Redundant Unit with Waiting in Repair

Performance Analysis for a Network having Standby Redundant Unit with Waiting in Repair TECHNI Inernaonal Journal of Compung Scence Communcaon Technologes VOL.5 NO. July 22 (ISSN 974-3375 erformance nalyss for a Nework havng Sby edundan Un wh ang n epar Jendra Sngh 2 abns orwal 2 Deparmen

More information