Context-Aware Clustering


Junsong Yuan, EECS Dept., Northwestern Univ., Evanston, IL, USA
Ying Wu, EECS Dept., Northwestern Univ., Evanston, IL, USA

Abstract

Most existing methods of semi-supervised clustering introduce supervision from outside, e.g., by manually labeling some data samples or introducing constraints into the clustering results. This paper studies an interesting problem: can the supervision come from inside, i.e., from the unsupervised training data themselves? If the data samples are not independent, we can capture the contextual information reflecting the dependency among the data samples, and use it as supervision to improve the clustering. This is called context-aware clustering. The investigation is substantialized on two scenarios: (1) clustering primitive visual features (e.g., SIFT features) with the help of spatial contexts, and (2) clustering 0-9 handwritten digits with the help of contextual patterns among different types of features. Our context-aware clustering can be well formulated in closed form, where the contextual information serves as a regularization term to balance the data fidelity in the original feature space against the influences of the contextual patterns. A nested-EM algorithm is proposed to obtain an efficient solution, which proves to converge. By exploring the dependent structure of the data samples, this method is completely unsupervised, as no outside supervision is introduced.

1. Introduction

Unsupervised clustering is largely settled by the distance metric that measures the dissimilarity or affinity between two data points. This can be regarded as the internal force driving the clustering. Typical examples include k-means clustering and spectral clustering. In practice, as it is generally quite difficult to choose the right distance metric in advance, we tend to learn a good metric by imposing supervision. Acting as a constraint, supervised information can be regarded as the external force that balances or adjusts the effect of the internal force. In this way, we can say the distance metric is tuned or learned.
Supervision is generally introduced from outside, e.g., by manually labeling some samples as constraints to perform constraint-based clustering [12], by performing co-training among multiple modalities [3], or by performing metric tuning [10]. Then, here is an interesting question: can the supervision come from inside, i.e., from the training data themselves? If so, the method is still unsupervised, and can be called self-supervised clustering. This is possible when the training data are not independent. The dependency among data is the contextual information. Let us take web-page grouping as an example. The links among web-pages provide information on dependency. We group web-pages not only based on whether they have similar contents (features) but also on whether they share similar linked pages (contexts). Contextual information brought by data dependency provides an important clue for data mining [8]. In computer vision research, many recent works have shown that contextual information can be utilized to resolve the ambiguities and uncertainties in many applications, including image search [6], recognition [9][1][2], metric learning [11], and image modeling [16]. If the data dependency can be well captured by contextual patterns, which describe the co-occurrences of specific types of data samples at a higher level, it is possible to use it as the supervision to improve clustering. Because the contextual information is discovered from the unsupervised training data themselves, we call such self-supervised clustering context-aware clustering. We substantialize it in two case studies, where (1) we cluster primitive visual features (e.g., SIFT features) for finding local spatial patterns in images, and (2) we cluster 0-9 handwritten digits with multiple features. By feeding back the contextual patterns, which characterize the co-occurrence statistics, as supervision, we can resolve ambiguous samples based on hints from their contexts. The novelty of our work lies in two aspects.
First of all, we give a closed-form formulation of context-aware clustering, where the contextual information serves as a regularization term in traditional k-means clustering. Secondly, due to the nice analytical properties of the new formulation, we present an efficient nested-EM algorithm for context-aware clustering, which proves to converge. Both simulation and real data validate the effectiveness of our method.

2. Context-Aware Clustering

2.1. Motivating example: clustering visual primitives

We illustrate our context-aware clustering in a case study of clustering visual primitives. Each visual primitive is denoted as v = (x, y, f), where (x, y) is its spatial location and f denotes the feature vector describing v. In general, f ∈ R^d can be any possible visual feature that characterizes a local image region, like color histograms or SIFT-like features [4][5]. An image is a collection of visual primitives, and we denote the visual primitive database as D_v = {v_i}_{i=1}^N. After clustering these visual primitives into words, we can label each v_i ∈ D_v with l(v_i) ∈ Ω, where Ω is the visual word lexicon of size |Ω| = M. The context of a visual primitive is its spatial neighbors in the image, i.e., those visual primitives that collocate with it (Fig. 1). For each visual primitive v_i ∈ D_v, we define its local spatial neighborhood, e.g., the K-nearest neighbors (K-NN) or ε-nearest neighbors (ε-NN), as its context group G_i = {v_i, v_i1, v_i2, ..., v_iK}. The context database is denoted by G = {G_i}_{i=1}^N. Once the visual primitives are labeled by Ω, the context database G can be transferred to a transaction database with N records, where each record t_i ∈ {0, 1}^M is a binary vector representation of G_i indicating which words appear in the group G_i. This transaction database is a sparse binary matrix T (M × N), where each column is a context transaction t_i. The entry t_ij = 1 indicates that the j-th transaction contains the i-th word, and t_ij = 0 otherwise. In the case of using a spatial K-NN to define the context group, we have Σ_{i=1}^M t_ij = K for all j = 1, ..., N, because each context group contains K visual primitives. An N × N sparse binary matrix Q can be used to describe the spatial context relations among the visual primitives, where q_ij = 1 denotes that v_i belongs to the context group of v_j, i.e., v_i ∈ G_j; and q_ij = 0 otherwise. The matrix Q is symmetric when using ε-NN to define the spatial neighbors, and an asymmetric matrix when using K-NN.
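The construction of Q and T described above can be sketched in a few lines. This is a minimal illustration, not the authors' code; the names, the brute-force K-NN search, and the choice to count only the K neighbors of each primitive (so that the columns of T sum to K) are our assumptions.

```python
import numpy as np

def context_transactions(labels, xy, K, M):
    """Build the spatial-context matrix Q (q_ij = 1 iff primitive i lies in
    the K-NN context group of primitive j) and the transaction database
    T = R Q, whose column j records which words appear around primitive j."""
    N = len(labels)
    # R: M x N one-hot word-label indicator matrix of the primitives.
    R = np.zeros((M, N))
    R[labels, np.arange(N)] = 1.0
    # Pairwise squared distances in the (x, y) plane; exclude self.
    d2 = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    # Q: N x N binary K-NN relation (column j marks the neighbors of j).
    Q = np.zeros((N, N))
    nn = np.argsort(d2, axis=0)[:K]
    Q[nn, np.arange(N)] = 1.0
    T = R @ Q  # M x N; each column sums to K
    return Q, T

# Tiny example: 4 primitives on a line, 2 words, K = 1 neighbor each.
labels = np.array([0, 1, 0, 1])
xy = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
Q, T = context_transactions(labels, xy, K=1, M=2)
```

Whether the center primitive itself is counted in its own group is an implementation choice; either convention works as long as it is used consistently when computing the Hamming distortions later.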
The context matrix Q plays a central role in our context-aware clustering, as it introduces extra relations among the data samples beyond those in the feature space. Besides spatial contexts, Q can represent any other possible contextual information among the N data samples. In Sec. 3.3, we give another example that applies contextual information from multiple features for clustering. Based on the word lexicon Ω (|Ω| = M), we can further define a phrase lexicon Ψ = {P_i}_{i=1}^M̃, where each phrase P_i ∈ Ψ is a contextual pattern composed of a collection of words, i.e., P_i ⊆ Ω. Compared with visual words, which label visual primitives v, visual phrases label transactions t. As visual phrases describe the spatial dependencies among visual words, they can be more meaningful patterns at a higher level [14]. For example, in Fig. 1, the existence of the visual phrase P = {a, b} shows that the two words a, b ∈ Ω co-occur frequently in local image regions and may form a meaningful visual pattern. Each P_j ∈ Ψ is represented by a binary vector ũ_j ∈ {0, 1}^M which describes its word composition, where ũ_j(i) = 1 indicates that the i-th word is contained in P_j. The matrix Ũ (M × M̃) is further applied to represent Ψ, where each column of Ũ is a ũ_j. Correspondingly, we use a real matrix U (d × M) to represent Ω, where each column is a feature vector representing a word prototype u_j ∈ R^d. All of our notations are listed in Table 1.

Table 1. Notations of symbols. Bold upper-case letters denote matrices and bold lower-case letters denote vectors.
  d        dimensionality of the feature vector
  N        number of visual primitives
  M        number of visual words
  M̃        number of visual phrases
  t        (M × 1) context transaction
  T        (M × N) the transaction database
  Q        (N × N) spatial context relations of the visual primitives
  u        (d × 1) prototype of a visual word
  ũ        (M × 1) prototype of a visual phrase
  U        (d × M) prototypes of the M visual words
  Ũ        (M × M̃) prototypes of the M̃ visual phrases
  R        (M × N) label matrix of the N primitives with M words
  R̃        (M̃ × N) label matrix of the N groups with M̃ phrases
  D        (M × N) distortion matrix of the N primitives with M words
  D̃        (M̃ × N) distortion matrix of the N groups with M̃ phrases

Figure 1. Illustration of spatial contexts: context group G, transaction t, and visual phrase P. The left figure denotes an image, and each rectangle denotes a visual primitive. We suppose the visual word lexicon contains 5 words, Ω = {a, b, c, d, e}, and each visual primitive is labeled by a word. The circle denotes a spatial context group generated by a visual primitive, e.g., the group G = {a, b, d, e} with transaction representation t = [1 1 0 1 1]^T over (a, b, c, d, e). The highlighted visual primitives are instances of the discovered visual phrase P = {a, b}, with ũ = [1 1 0 0 0]^T.

2.2. Problem Formulation

We first review k-means clustering and its solution by the EM algorithm. By performing traditional k-means clustering on a collection of visual primitives v_i ∈ D_v, the following mean-square distortion is minimized:

J_1 = Σ_{j=1}^N Σ_{i=1}^M r_ij ||f_j − u_i||^2 = tr(R^T D),   (1)

where f_j is the d × 1 feature vector and u_i is the center of a cluster (the prototype of a visual word); ||·|| denotes the Euclidean distance and tr(·) denotes the matrix trace; D (M × N) denotes the distance matrix, where d_ij = ||f_j − u_i||^2 is the distance between the j-th visual primitive and the i-th visual word prototype; R (M × N) denotes the label indicator matrix of the visual primitives, where r_ij = 1 if the j-th visual primitive is labeled with the i-th word, and r_ij = 0 otherwise. The standard EM algorithm can be performed to minimize the distortion in Eq. 1 by iteratively updating R (E-step) and D (M-step). By minimizing the objective function J_1, k-means clustering tries to maximize the data likelihood under a mixture-of-Gaussians distribution and assumes all observed samples v_i ∈ D_v are independent of one another:

Pr(D_v | Ω) = Π_{i=1}^N Pr(v_i | Ω).   (2)

However, such an independence assumption does not hold here, because the visual primitives have spatial dependencies on each other; thus they are not independent in the feature space. As a result, we need to take this spatial contextual information into consideration, and cannot cluster the visual primitives based on their features f_j alone. In order to consider both feature and contextual information for clustering, we propose a regularized objective function based on k-means:

J = Σ_{j=1}^N Σ_{i=1}^M r_ij ||f_j − u_i||^2 + λ Σ_{j=1}^N Σ_{i=1}^M̃ r̃_ij d_H(t_j, ũ_i)
  = tr(R^T D) + λ tr(R̃^T D̃),   (3)

where λ > 0 is a positive constant for regularization; r̃_ij is the binary label indicator of the transactions, with r̃_ij = 1 denoting that the j-th transaction is labeled with the i-th visual phrase, and r̃_ij = 0 otherwise. Similar to R, R̃ (M̃ × N) is a matrix describing the clustering results of the transactions t. For deterministic clustering, we have the following constraints on R and R̃ (each column contains a single 1):

Σ_{i=1}^M r_ij = 1, Σ_{i=1}^M̃ r̃_ij = 1, ∀ j = 1, ..., N.   (4)

d_H(t_j, ũ_i) denotes the Hamming distance between the two binary vectors, transaction t_j and context pattern ũ_i, where 1 is the M × 1 all-one vector:

d_H(t_j, ũ_i) = M − [ t_j^T ũ_i + (1 − t_j)^T (1 − ũ_i) ]
             = t_j^T 1 + ũ_i^T 1 − 2 t_j^T ũ_i.   (5)

Given the objective function in Eq. 3, with M, M̃ and λ as fixed parameters, our objectives are two-fold: (1) clustering all the visual primitives v_i into M classes (the word lexicon Ω) and simultaneously (2) clustering all the context transactions t_i ∈ T into M̃ classes (the phrase lexicon Ψ). The clustering results are represented by R and R̃ respectively. Since each visual primitive generates a spatial context group, we finally end up with two labels for every primitive: (1) the word label of the primitive itself and (2) the phrase label of the spatial group it generates. Compared with k-means clustering, which assumes a convex (e.g., Gaussian) shape for each cluster in the feature space, our regularization term can modify a cluster into an arbitrary shape by considering the influences from the higher phrase level. Similar to k-means clustering, this formulation is also a chicken-and-egg problem, where we cannot estimate D, D̃, R and R̃ simultaneously.

2.3. Iterative Solution: Nested-EM Algorithm

The objective function in Eq. 3 can be partitioned into two parts:

J = tr(R^T D) + λ tr(R̃^T D̃) = J_1 + J_2,

where J_1 = tr(R^T D) and J_2 = λ tr(R̃^T D̃) correspond to the quantization distortions of the visual primitives and of the context groups, respectively. Although it looks as if we could minimize J by minimizing J_1 and J_2 separately, e.g., through two independent EM processes, this is actually infeasible because J_1 and J_2 are coupled. By further analyzing J_1 and J_2, we find that although the visual primitive distortions D only depend on R, the context group distortions D̃ depend on both the visual primitive labels R and the context group labels R̃. Thus it is infeasible to minimize J_1 and J_2 separately, due to their correlation. In the following, we show how to decouple the dependency between J_1 and J_2, and propose our nested-EM algorithm.
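The regularized objective of Eq. 3, combining the squared Euclidean distortions of Eq. 1 with the Hamming distances of Eq. 5, can be evaluated directly in matrix form. The sketch below is illustrative only; the variable names are ours, and the quantities follow the definitions in Table 1.

```python
import numpy as np

def objective(F, U, R, T, Ut, Rt, lam):
    """Evaluate J = tr(R^T D) + lam * tr(Rt^T Dt) of Eq. 3.
    F: d x N features, U: d x M word prototypes, R: M x N word labels,
    T: M x N transactions, Ut: M x Mt phrase prototypes, Rt: Mt x N labels."""
    # D[i, j] = ||f_j - u_i||^2  (squared Euclidean distortion, Eq. 1)
    D = ((U[:, :, None] - F[:, None, :]) ** 2).sum(axis=0)
    # Dt[i, j] = d_H(t_j, ut_i) = t^T 1 + u^T 1 - 2 t^T u  (Eq. 5, binary vectors)
    Dt = T.sum(0)[None, :] + Ut.sum(0)[:, None] - 2.0 * (Ut.T @ T)
    # tr(R^T D) sums the distortion of each sample's assigned cluster.
    return np.trace(R.T @ D) + lam * np.trace(Rt.T @ Dt)
```

Note that tr(R^T D) is just the element-wise sum Σ_ij r_ij d_ij, so only the distortions of the assigned word and phrase of each sample contribute to J.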
Initialization:
1. Cluster all the visual primitives {v_i}_{i=1}^N into M classes, e.g., through k-means clustering, based on the Euclidean distance.
2. Obtain the visual word lexicon Ω (represented by U) and the distortion matrix D.
3. Cluster all the context groups {G_i}_{i=1}^N into M̃ classes based on the Hamming distance, and obtain the visual phrase lexicon Ψ (represented by Ũ), as well as the distortion matrix D̃.

E-step: The task is to label the visual primitives v_i and the context groups G_i with Ω and Ψ, namely to update R and R̃ given D and

D̃, where D and D̃ can be directly computed from U and Ũ respectively. Based on the analysis above, we need to optimize R (corresponding to J_1) and R̃ (corresponding to J_2) simultaneously to minimize J, because J_1 and J_2 are correlated. According to the Hamming distance in Eq. 5, we can derive the matrix form of the context group distortions:

D̃ = −2 Ũ^T T + 1_T T + Ũ^T 1_Ũ,

where 1_T is an M̃ × M all-one matrix and 1_Ũ is an M × N all-one matrix. Moreover, the transaction database T can be determined by T = RQ, because each transaction column can be obtained as t_j = Σ_{i=1}^N q_ij r_i (footnote 1), where q_ij is the binary indicator of whether primitive v_i belongs to the context group of v_j, and r_i denotes the i-th column of R, which describes the word label of v_i. Based on the above, we derive Eq. 3 as follows:

J(D, D̃, R, R̃)
= tr(R^T D) + λ tr(R̃^T D̃)   (6)
= tr(R^T D) + λ tr[R̃^T (−2 Ũ^T T + 1_T T + Ũ^T 1_Ũ)]   (7)
= tr(R^T D) + λ tr[R̃^T (−2 Ũ^T RQ + 1_T RQ + Ũ^T 1_Ũ)]
= tr(R^T D) + λ tr[R̃^T (−2 (Ũ − (1/2) 1_T^T)^T RQ + Ũ^T 1_Ũ)]
= tr(R^T D) − 2λ tr[R̃^T (Ũ − (1/2) 1_T^T)^T RQ] + λ tr(R̃^T Ũ^T 1_Ũ)   (8)
= tr(R^T D) − 2λ tr[Q^T R^T (Ũ − (1/2) 1_T^T) R̃] + λ tr(R̃^T Ũ^T 1_Ũ)   (9)
= tr(R^T D) − 2λ tr[R^T (Ũ − (1/2) 1_T^T) R̃ Q^T] + λ tr(R̃^T Ũ^T 1_Ũ)   (10)
= tr[R^T (D − 2λ (Ũ − (1/2) 1_T^T) R̃ Q^T)] + λ tr(R̃^T Ũ^T 1_Ũ),   (11)

where we apply three properties of the matrix trace: for square matrices A and B, we have (1) tr(A) = tr(A^T) (Eq. 9), (2) tr(AB) = tr(BA) (Eq. 10), and (3) tr(A) + tr(B) = tr(A + B) (Eq. 11).

(1) Strictly, t_j is a binary vector only if its group contains distinguishable primitives, i.e., each primitive in it belongs to a different word. However, our solution is generic and does not require t_j to be binary, as long as we apply the distortion measure of Eq. 5.

Based on the above analysis, we propose an E-step that iteratively updates R and R̃ to decrease J. Recall that R and R̃ are label indicator matrices constrained by Eq. 4.

1. We first fix R and update R̃. Based on Eq. 8, let

H = λ (−2 Ũ^T RQ + 1_T RQ + Ũ^T 1_Ũ);

then we have

J = tr(R^T D) + tr(R̃^T H) = J_1 + J_2.   (12)

Therefore we only need to minimize J_2 = tr(R̃^T H), as J_1 = tr(R^T D) is constant given R and U.
Because each column of R̃ contains a single 1 (Eq. 4), we update R̃ to minimize J_2 by the following criterion, for j = 1, 2, ..., N:

r̃_ij = 1 if i = arg min_k h_kj, and r̃_ij = 0 otherwise,   (13)

where h_kj is an element of H and r̃_ij is an element of R̃. H can be calculated from Q, Ũ and R, which are all given.

2. Similarly, we now fix R̃ and update R. Based on Eq. 11, let

H̃ = D − 2λ (Ũ − (1/2) 1_T^T) R̃ Q^T.

We get another representation of J:

J = tr(R^T H̃) + λ tr(R̃^T Ũ^T 1_Ũ) = J_3 + J_4,   (14)

where J_4 = λ tr(R̃^T Ũ^T 1_Ũ) is constant given R̃ and Ũ. Therefore only J_3 needs to be minimized. We update R to minimize J_3 as follows, for j = 1, ..., N:

r_ij = 1 if i = arg min_k h̃_kj, and r_ij = 0 otherwise,   (15)

where h̃_kj is an element of H̃ and r_ij is an element of R.

The E-step above is itself an EM-like process, because we need to update R and R̃ iteratively until J converges; the objective function J decreases monotonically at each of these steps.
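The two closed-form updates of Eqs. 13 and 15 can be sketched as follows. This is a minimal illustration under the matrix identities derived above, not the authors' code; the (Ũ − 1/2) offset folds the all-one term of the derivation into the R-update, and all names are ours.

```python
import numpy as np

def one_hot_argmin(H):
    """Indicator matrix with a single 1 per column at the row of minimum
    cost -- the update rule of Eqs. 13 and 15."""
    R = np.zeros_like(H)
    R[np.argmin(H, axis=0), np.arange(H.shape[1])] = 1.0
    return R

def nested_e_step(D, Q, Ut, R, Rt, lam, max_iter=100):
    """Alternate the two closed-form updates until the assignments stop
    changing.  D: M x N word distortions (fixed during the E-step),
    Q: N x N context relations, Ut: M x Mt phrase prototypes."""
    ones_T = np.ones((Ut.shape[1], Ut.shape[0]))  # Mt x M all-one matrix
    ones_U = np.ones((Ut.shape[0], D.shape[1]))   # M x N all-one matrix
    for _ in range(max_iter):
        # Fix R, update Rt:  H = lam * (-2 Ut^T R Q + 1_T R Q + Ut^T 1_U)
        H = lam * (-2.0 * Ut.T @ R @ Q + ones_T @ R @ Q + Ut.T @ ones_U)
        Rt_new = one_hot_argmin(H)
        # Fix Rt, update R:  Ht = D - 2 lam (Ut - 1/2) Rt Q^T
        Ht = D - 2.0 * lam * (Ut - 0.5) @ Rt_new @ Q.T
        R_new = one_hot_argmin(Ht)
        if np.array_equal(R_new, R) and np.array_equal(Rt_new, Rt):
            break
        R, Rt = R_new, Rt_new
    return R, Rt
```

Since the assignment spaces are finite and each update can only lower J, the inner loop must terminate.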

M-step: Knowing the labels of the visual primitives and the visual groups (R and R̃), we want to estimate better visual lexicons Ω and Ψ. From Eq. 3, D and D̃ are not interlaced, so U and Ũ can be optimized separately. We apply the following two steps:

1. Recalculate the cluster centroid for each visual word class {u_i}_{i=1}^M as in the traditional k-means algorithm, with the Euclidean distance. Update U and D to decrease J.
2. Recalculate the cluster centroid for each phrase {ũ_i}_{i=1}^M̃, with the Hamming distance (see the Appendix for the update details). Update Ũ and D̃ to decrease J.

Both steps guarantee that J decreases; therefore the whole M-step decreases J monotonically. Our method is called the nested-EM algorithm because there are two nested EM processes: the E-step is itself an EM process. We summarize the nested-EM algorithm in Alg. 1.

Algorithm 1: Nested-EM Algorithm
  input:  visual primitive database D_v = {v_i}, contextual relations Q; parameters M, M̃, λ
  output: visual word and phrase lexicons Ω and Ψ; clustering results R and R̃
  1  Init: (1) cluster the visual primitives to get Ω and U;
  2        (2) based on Ω, cluster the visual groups to get Ψ and Ũ;
  3  while J is decreasing do
  4      E-step: fix U and Ũ, update R and R̃:
  5          nested E-step: fix R, update R̃ (Eq. 13)
  6          nested M-step: fix R̃, update R (Eq. 15)
  7          if J is decreasing then
  8              go to the E-step
  9          else
  10             go to the M-step
  11     M-step: fix R and R̃, update U and Ũ separately.
  12 return U, Ũ, R, R̃.

Because the solution spaces of R and R̃ are discrete and finite, and J decreases monotonically at each step of our nested-EM algorithm, we have Theorem 1.

Theorem 1 (convergence of the nested-EM algorithm). The nested-EM algorithm converges in a finite number of steps.

3. Experiments

3.1. Simulation results

To illustrate the idea of our context-aware clustering, we synthesize a spatial dataset for simulation; a concrete example of such a spatial dataset is an image. All the samples have two representations: (1) the feature domain, f ∈ R^2, and (2) the spatial domain (x, y), as shown in Fig. 3 (a) and (b). In our case, we have 5 different types of visual primitives, each marked by a distinct symbol in Fig. 3. In the spatial domain, two of the types are generated together to form one co-occurrent contextual pattern, while the other three types form the other contextual pattern.
In the feature domain, each of the 5 clusters has a different number of samples, and the clusters are generated from Gaussian distributions with different means and variances. Based on the feature domain alone, clustering is a challenging problem because some of these Gaussian distributions heavily overlap. Our tasks are (1) clustering the visual primitives into words, and (2) recovering the two visual phrases, P_1 (the two co-occurring types) and P_2 (the other three types). We compare the performance of context-aware clustering under different choices of λ (λ = 0, λ = 400, and a larger λ) in Fig. 3 (c), (d) and (e), where λ = 0 gives the same results as k-means clustering. The major differences in the clustering results appear in the cluster that heavily overlaps two others in the feature space: most of its samples are still correctly labeled with the help of their spatial contexts. For example, although it is difficult to label a sample v located in an overlapped region of two clusters based on the feature space alone, we can resolve the ambiguity by observing the spatial context of v: if the word that co-occurs with one of the candidate labels in a discovered visual phrase is found in v's spatial context, then v should receive that label, because the discovered visual phrase supports it. Figure 3(f) shows the iterations of the nested-EM algorithm; each iteration corresponds to an individual E-step or M-step, until convergence. We decompose the objective function into J = J_1 + J_2, where J, J_1 and J_2 are the red, black and pink curves respectively. All three curves are normalized by J_max = J^0, the value of J at the initialization step. Compared to k-means clustering, which minimizes the distortion J_1 in the feature space only, our context-aware clustering sacrifices J_1 to gain a larger decrease of the distortion J_2 in the context space, which gives a smaller total distortion J. The error-rate curve (blue) shows the percentage of samples that are wrongly labeled at each step; we notice that it decreases consistently with the objective function J.
In terms of clustering errors, context-aware clustering (e = 0.12) performs significantly better than the k-means method (e = 0.25). The parameter λ balances the two clustering criteria: (1) clustering based on the visual features f (J_1) and (2) clustering based on the spatial contexts (J_2). The smaller the λ, the more faithfully the clustering results follow the feature space, where samples with similar features are grouped together. An extreme case is λ = 0, when no regularization is applied in Eq. 3 and the feedback from the contexts is ignored; in that case, context-aware clustering is identical to k-means clustering. On the other hand, a larger λ favors clustering results that support the discovered context patterns (e.g., visual phrases), so samples with similar contexts are more likely to be grouped together.

3.2. Image texton discovery

To validate whether the discovered visual phrases can really capture common spatial image patterns [13], we test on

a collection of texture images (please see the supplementary material for more results), and an example is presented in Fig. 2. Given an image, we first detect SIFT points [7] and treat the keypoints with scales between 1 and 2 as visual primitives: D_v = {v_i}. We apply spatial K-NN groups to build the group database G, with K = 10. We let λ = τ J_1^0 / J_2^0, where J_1^0 and J_2^0 are the initialization values of J_1 and J_2 respectively, and τ > 0 is the parameter that balances the distortions between the SIFT features (word level) and the contextual patterns (phrase level).

Figure 2. The 1st row shows 2 visual words (red and purple) formed through k-means clustering of the visual primitives (k = 2). From the 2nd to the 4th row, we show 2 visual phrases (red and purple) discovered through context-aware clustering: the initialization of the visual phrases, the result after the 1st full EM iteration, and the final results (19 full EM iterations). There are two types of near-regular textures in the image: one is the flower pattern on the clothes (with deformations) and the other is the regular texture at the bottom right. Note that k-means clustering of the visual primitives cannot distinguish the two different textures. Parameters used are M = 25, M̃ = 2, τ = 0.5.

For such an image, the nested-EM algorithm converges within 40 full EM iterations. It is interesting to notice that the discovered visual phrases have spatial structure, such as the flowers in Fig. 2. Fig. 2 also shows how our nested-EM algorithm corrects the imperfect clustering results iteratively, using the spatial contextual information as feedback. In comparison, conventional k-means clustering cannot obtain satisfactory results when clustering the visual primitives individually.

3.3. Multiple-view clustering

Multiple-view clustering is another typical application of our context-aware clustering algorithm.
In multiple-view clustering [15], each data sample v = {f_1, f_2, ..., f_c} is represented by c different types of features f_i. Our task is to cluster a collection of data samples D_v = {v_j}_{j=1}^N. A simple solution is to concatenate all features into one long feature vector f = [f_1 f_2 ... f_c] and then perform traditional clustering in the newly formed feature space. However, because the dependency information among the different types of features is not well utilized, such a simple solution cannot obtain satisfactory results. We select the multiple features data set from the UCI Machine Learning Repository for evaluation. This multi-class data set consists of handwritten numerals ('0'-'9') extracted from a collection of Dutch utility maps. Each class contains 200 data samples, and the data set has 2000 digits in total. Each digit is represented in terms of 6 features, and we select 3 of them for context-aware clustering: (1) 76 Fourier coefficients of the character shapes (fou); (2) 64 Karhunen-Loève coefficients (kar); and (3) 240 pixel averages in 2 × 3 windows (pix). A data sample v thus generates 3 primitives, f_fou, f_kar and f_pix, in the 3 feature spaces respectively. As a result, we obtain 6000 primitives in total and cluster them for the word lexicon. The original data sample v now corresponds to a context group G, which characterizes the co-occurrent dependency among the different types of features f_i; each v ∈ D_v generates a transaction t containing 3 words. We further cluster these n = 2000 transactions into phrases. In the initialization step, we build a word lexicon Ω_i (|Ω_i| = 10, i = fou, kar, pix) for the three types of features separately and obtain the final lexicon Ω = Ω_fou ∪ Ω_kar ∪ Ω_pix (|Ω| = 30). The phrase lexicon Ψ is then constructed based on Ω. By considering both the distortions from the 3 individual features and their contextual patterns, the objective function in Eq. 3 now becomes:

J = Σ_{i=1}^3 tr(R_i^T D_i) + λ tr(R̃^T D̃) = tr(R^T D) + λ tr(R̃^T D̃),

where R and D are matrices containing 3 diagonal blocks corresponding to the fou, kar and pix features respectively.
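The mapping from per-view word labels to context transactions described above can be sketched as follows. This is an illustrative sketch; the paper's exact preprocessing may differ, and the function and variable names are ours.

```python
import numpy as np

def multiview_transactions(word_labels, words_per_view=10):
    """Turn per-view word labels into context transactions.  word_labels is
    an N x V integer array: sample i is assigned word word_labels[i, v] in
    view v's own lexicon of size words_per_view.  The combined lexicon is
    the disjoint union of the view lexicons (|Omega| = V * words_per_view),
    and each sample yields one transaction containing V words: here the
    "context" of a word is the words the same sample produces in the
    other views."""
    N, V = word_labels.shape
    M = V * words_per_view
    T = np.zeros((M, N))
    for v in range(V):
        # Offset view v's labels into its own block of the combined lexicon.
        T[v * words_per_view + word_labels[:, v], np.arange(N)] = 1.0
    return T

# Two samples, three views (e.g. fou / kar / pix), 10 words per view:
labels = np.array([[0, 3, 9],
                   [0, 3, 8]])
T = multiview_transactions(labels)
# Each column holds exactly 3 words out of the 30-word combined lexicon.
```

With T in this form, the spatial relation matrix Q of Sec. 2.1 degenerates to the identity-like grouping in which each sample is its own context group over its three views.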
With a straightforward adjustment of the matrix sizes in Table 1, we can still apply the nested-EM algorithm of Alg. 1. Table 2 compares conventional k-means clustering with our context-aware clustering. Specifically, we run k-means

clustering on the 3 features individually, and also on the concatenation of the 3 features. In each case, k-means clustering is repeated several times and the best result, i.e., the one with minimum total distortion, is selected for comparison. In context-aware clustering, to balance the data fidelity in the feature space (J_1) against the influence of the contextual information (J_2), we select τ = 1, which results in λ = τ J_1^0 / J_2^0. From Table 2, we can see that although each individual feature has limited ability for clustering, the contextual pattern among them helps to improve the clustering results significantly. Also, our context-aware clustering performs better (error 13.5%) than simply concatenating all features for k-means clustering (error 17.9%).

Table 2. Multiple-feature clustering: comparison between traditional k-means clustering and context-aware clustering.
  method         | #features | #classes       | error
  k-means (fou)  | 76        | k = 10         | --
  k-means (kar)  | 64        | k = 10         | --
  k-means (pix)  | 240       | k = 10         | --
  k-means (all)  | 380       | k = 10         | 17.9%
  context-aware  | --        | M = 30; M̃ = 10 | 13.5%

4. Conclusion

We present in this paper a new formulation of self-supervised clustering, called context-aware clustering, and show how contextual information can be fed back to improve the clustering results. Two kinds of contextual information, (1) the spatial contexts of visual primitives and (2) the contextual patterns among different types of features, are applied to improve the clustering results in (1) image texton discovery and (2) multiple-view clustering of handwritten digits, respectively. Compared with traditional k-means clustering, our context-aware clustering considers the data (or feature) dependency at a higher level. Thus it not only obtains better clustering results, but can also reveal the hidden structures among the data samples. The proposed nested-EM algorithm is an efficient iterative solution for context-aware clustering and is proved to converge; it provides a general solution to context-aware clustering. Besides the spatial contexts and feature contexts used in our experiments, other types of contextual information can also be incorporated.
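As a companion to the Appendix that follows, the Hamming-distance centroid of a cluster of binary transactions reduces to an element-wise majority vote; a minimal sketch (illustrative names, not the authors' code):

```python
import numpy as np

def phrase_centroid(X):
    """Hamming-distance centroid of a cluster of binary transactions
    (the columns of X):  u_k = (sgn(2 * sum_i t_i^k - n) + 1) / 2,
    i.e. set bit k iff at least half of the n transactions contain
    word k (ties resolve to 1, since sgn(0) = 1)."""
    n = X.shape[1]
    s = 2 * X.sum(axis=1) - n
    return (np.where(s >= 0, 1, -1) + 1) // 2

# Three transactions over a 3-word lexicon, one per column:
X = np.array([[1, 1, 0],
              [0, 1, 0],
              [1, 1, 1]])
u = phrase_centroid(X)
# word 0 appears in 2/3 transactions, word 1 in 1/3, word 2 in 3/3,
# so the centroid keeps words 0 and 2:  u = [1, 0, 1]
```

The majority vote is the binary analogue of the mean update in k-means, which is why the M-step of Alg. 1 decreases J for the phrase clusters as well.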
Appendix

We discuss how to update the prototypes of the visual phrases (Ũ) in the M-step. Given a cluster of M × 1 transactions X = {t_i}_{i=1}^n, the target is to find their centroid ũ ∈ {0, 1}^M such that the total quantization distortion is minimized under the Hamming distance criterion of Eq. 5. The optimization problem is formulated as:

min_{ũ ∈ {0,1}^M} Σ_{i=1}^n [ M − (t_i^T ũ + (1 − t_i)^T (1 − ũ)) ].

Let ũ^k denote the k-th element of ũ. Handling the binary constraints ũ^k (1 − ũ^k) = 0 through Lagrange multipliers λ_k ≥ 0, we minimize the objective function

f(ũ, λ) = Σ_{i=1}^n Σ_{k=1}^M [ −2 t_i^k ũ^k + t_i^k + ũ^k − λ_k ũ^k (1 − ũ^k) ].

By applying ∂f(ũ, λ)/∂ũ^k = 0, ∂f(ũ, λ)/∂λ_k = 0, and λ_k ≥ 0, ∀k, we obtain

ũ^k = (1/2) [ (2 Σ_{i=1}^n t_i^k − n) / λ_k + 1 ], and λ_k = | 2 Σ_{i=1}^n t_i^k − n | ≥ 0.

Finally, we have

ũ^k = [ sgn(2 Σ_{i=1}^n t_i^k − n) + 1 ] / 2,

where sgn(x) = 1 if x ≥ 0 and sgn(x) = −1 if x < 0. In other words, the k-th bit of the centroid is set exactly when at least half of the n transactions in the cluster contain the k-th word.

Acknowledgment

This work was supported in part by National Science Foundation grants.

References

[1] J. Amores, N. Sebe, and P. Radeva. Context-based object-class recognition and retrieval by generalized correlograms. IEEE Trans. on Pattern Analysis and Machine Intelligence, 29(10), 2007.
[2] S. Belongie, J. Malik, and J. Puzicha. Shape matching and object recognition using shape contexts. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2002.
[3] A. Blum and T. Mitchell. Combining labeled and unlabeled data with co-training. In Proc. of Intl. Conf. on Machine Learning, 1998.
[4] O. Boiman and M. Irani. Similarity by composition. In Proc. of Neural Information Processing Systems, 2006.
[5] A. Frome, Y. Singer, and J. Malik. Image retrieval and classification using local distance functions. In Proc. of Neural Information Processing Systems, 2006.
[6] H. Jegou, H. Harzallah, and C. Schmid. A contextual dissimilarity measure for accurate and efficient image search. In Proc. IEEE Conf. on Computer Vision and Pattern Recognition, 2007.
[7] D. Lowe. Distinctive image features from scale-invariant keypoints. Intl. Journal of Computer Vision, 2004.
[8] Q. Mei, D. Xin, H. Cheng, J. Han, and C. Zhai. Generating semantic annotations for frequent patterns with context analysis. In Proc. ACM SIGKDD, 2006.
[9] E. Shechtman and M. Irani. Matching local self-similarities across images and videos. In Proc. IEEE Conf. on Computer Vision and Pattern Recognition, 2007.
[10] E. P. Xing, A. Y. Ng, M. I. Jordan, and S. Russell. Distance metric learning, with application to clustering with side-information. In Proc. of Neural Information Processing Systems, 2002.
[11] J. Ye, Z. Zhao, and H. Liu. Adaptive distance metric learning for clustering. In Proc. IEEE Conf. on Computer Vision and Pattern Recognition, 2007.

Figure 3. Context-aware clustering on the synthesized spatial data, and the comparison with the k-means algorithm: (a) feature domain; (b) spatial domain; (c) k-means clustering (k = 5); (d) context-aware clustering (λ = 400); (e) context-aware clustering with a larger λ; (f) performance curves (J/J_max, J_1/J_max, J_2/J_max, and the error rate, per iteration). Parameters used are M = 5 and M̃ = 2, with ε-NN search for the spatial groups. See the text for descriptions. Best seen in color.

[12] S. X. Yu and J. Shi. Segmentation given partial grouping constraints. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2004.
[13] J. Yuan and Y. Wu. Spatial random partition for common visual pattern discovery. In Proc. IEEE Intl. Conf. on Computer Vision, 2007.
[14] J. Yuan, Y. Wu, and M. Yang. Discovery of collocation patterns: from visual words to visual phrases. In Proc. IEEE Conf. on Computer Vision and Pattern Recognition, 2007.
[15] D. Zhou and C. J. Burges. Spectral clustering and transductive learning with multiple views. In Proc. Intl. Conf. on Machine Learning, 2007.
[16] S.-C. Zhu, C.-E. Guo, Y. Wang, and Z. Xu. What are textons? Intl. Journal of Computer Vision, 2005.


More information

1 Online Learning and Regret Minimization

1 Online Learning and Regret Minimization 2.997 Decision-Mking in Lrge-Scle Systems My 10 MIT, Spring 2004 Hndout #29 Lecture Note 24 1 Online Lerning nd Regret Minimiztion In this lecture, we consider the problem of sequentil decision mking in

More information

Non-Linear & Logistic Regression

Non-Linear & Logistic Regression Non-Liner & Logistic Regression If the sttistics re boring, then you've got the wrong numbers. Edwrd R. Tufte (Sttistics Professor, Yle University) Regression Anlyses When do we use these? PART 1: find

More information

Math 426: Probability Final Exam Practice

Math 426: Probability Final Exam Practice Mth 46: Probbility Finl Exm Prctice. Computtionl problems 4. Let T k (n) denote the number of prtitions of the set {,..., n} into k nonempty subsets, where k n. Argue tht T k (n) kt k (n ) + T k (n ) by

More information

p-adic Egyptian Fractions

p-adic Egyptian Fractions p-adic Egyptin Frctions Contents 1 Introduction 1 2 Trditionl Egyptin Frctions nd Greedy Algorithm 2 3 Set-up 3 4 p-greedy Algorithm 5 5 p-egyptin Trditionl 10 6 Conclusion 1 Introduction An Egyptin frction

More information

Quadratic Forms. Quadratic Forms

Quadratic Forms. Quadratic Forms Qudrtic Forms Recll the Simon & Blume excerpt from n erlier lecture which sid tht the min tsk of clculus is to pproximte nonliner functions with liner functions. It s ctully more ccurte to sy tht we pproximte

More information

8 Laplace s Method and Local Limit Theorems

8 Laplace s Method and Local Limit Theorems 8 Lplce s Method nd Locl Limit Theorems 8. Fourier Anlysis in Higher DImensions Most of the theorems of Fourier nlysis tht we hve proved hve nturl generliztions to higher dimensions, nd these cn be proved

More information

Hidden Markov Models

Hidden Markov Models Hidden Mrkov Models Huptseminr Mchine Lerning 18.11.2003 Referent: Nikols Dörfler 1 Overview Mrkov Models Hidden Mrkov Models Types of Hidden Mrkov Models Applictions using HMMs Three centrl problems:

More information

W. We shall do so one by one, starting with I 1, and we shall do it greedily, trying

W. We shall do so one by one, starting with I 1, and we shall do it greedily, trying Vitli covers 1 Definition. A Vitli cover of set E R is set V of closed intervls with positive length so tht, for every δ > 0 nd every x E, there is some I V with λ(i ) < δ nd x I. 2 Lemm (Vitli covering)

More information

Duality # Second iteration for HW problem. Recall our LP example problem we have been working on, in equality form, is given below.

Duality # Second iteration for HW problem. Recall our LP example problem we have been working on, in equality form, is given below. Dulity #. Second itertion for HW problem Recll our LP emple problem we hve been working on, in equlity form, is given below.,,,, 8 m F which, when written in slightly different form, is 8 F Recll tht we

More information

Chapter 4 Contravariance, Covariance, and Spacetime Diagrams

Chapter 4 Contravariance, Covariance, and Spacetime Diagrams Chpter 4 Contrvrince, Covrince, nd Spcetime Digrms 4. The Components of Vector in Skewed Coordintes We hve seen in Chpter 3; figure 3.9, tht in order to show inertil motion tht is consistent with the Lorentz

More information

Chapter 14. Matrix Representations of Linear Transformations

Chapter 14. Matrix Representations of Linear Transformations Chpter 4 Mtrix Representtions of Liner Trnsformtions When considering the Het Stte Evolution, we found tht we could describe this process using multipliction by mtrix. This ws nice becuse computers cn

More information

Jim Lambers MAT 169 Fall Semester Lecture 4 Notes

Jim Lambers MAT 169 Fall Semester Lecture 4 Notes Jim Lmbers MAT 169 Fll Semester 2009-10 Lecture 4 Notes These notes correspond to Section 8.2 in the text. Series Wht is Series? An infinte series, usully referred to simply s series, is n sum of ll of

More information

Notes on length and conformal metrics

Notes on length and conformal metrics Notes on length nd conforml metrics We recll how to mesure the Eucliden distnce of n rc in the plne. Let α : [, b] R 2 be smooth (C ) rc. Tht is α(t) (x(t), y(t)) where x(t) nd y(t) re smooth rel vlued

More information

Properties of Integrals, Indefinite Integrals. Goals: Definition of the Definite Integral Integral Calculations using Antiderivatives

Properties of Integrals, Indefinite Integrals. Goals: Definition of the Definite Integral Integral Calculations using Antiderivatives Block #6: Properties of Integrls, Indefinite Integrls Gols: Definition of the Definite Integrl Integrl Clcultions using Antiderivtives Properties of Integrls The Indefinite Integrl 1 Riemnn Sums - 1 Riemnn

More information

Math 131. Numerical Integration Larson Section 4.6

Math 131. Numerical Integration Larson Section 4.6 Mth. Numericl Integrtion Lrson Section. This section looks t couple of methods for pproimting definite integrls numericlly. The gol is to get good pproimtion of the definite integrl in problems where n

More information

A recursive construction of efficiently decodable list-disjunct matrices

A recursive construction of efficiently decodable list-disjunct matrices CSE 709: Compressed Sensing nd Group Testing. Prt I Lecturers: Hung Q. Ngo nd Atri Rudr SUNY t Bufflo, Fll 2011 Lst updte: October 13, 2011 A recursive construction of efficiently decodble list-disjunct

More information

Recitation 3: More Applications of the Derivative

Recitation 3: More Applications of the Derivative Mth 1c TA: Pdric Brtlett Recittion 3: More Applictions of the Derivtive Week 3 Cltech 2012 1 Rndom Question Question 1 A grph consists of the following: A set V of vertices. A set E of edges where ech

More information

HW3, Math 307. CSUF. Spring 2007.

HW3, Math 307. CSUF. Spring 2007. HW, Mth 7. CSUF. Spring 7. Nsser M. Abbsi Spring 7 Compiled on November 5, 8 t 8:8m public Contents Section.6, problem Section.6, problem Section.6, problem 5 Section.6, problem 7 6 5 Section.6, problem

More information

Driving Cycle Construction of City Road for Hybrid Bus Based on Markov Process Deng Pan1, a, Fengchun Sun1,b*, Hongwen He1, c, Jiankun Peng1, d

Driving Cycle Construction of City Road for Hybrid Bus Based on Markov Process Deng Pan1, a, Fengchun Sun1,b*, Hongwen He1, c, Jiankun Peng1, d Interntionl Industril Informtics nd Computer Engineering Conference (IIICEC 15) Driving Cycle Construction of City Rod for Hybrid Bus Bsed on Mrkov Process Deng Pn1,, Fengchun Sun1,b*, Hongwen He1, c,

More information

CBE 291b - Computation And Optimization For Engineers

CBE 291b - Computation And Optimization For Engineers The University of Western Ontrio Fculty of Engineering Science Deprtment of Chemicl nd Biochemicl Engineering CBE 9b - Computtion And Optimiztion For Engineers Mtlb Project Introduction Prof. A. Jutn Jn

More information

Riemann Sums and Riemann Integrals

Riemann Sums and Riemann Integrals Riemnn Sums nd Riemnn Integrls Jmes K. Peterson Deprtment of Biologicl Sciences nd Deprtment of Mthemticl Sciences Clemson University August 26, 203 Outline Riemnn Sums Riemnn Integrls Properties Abstrct

More information

Week 10: Line Integrals

Week 10: Line Integrals Week 10: Line Integrls Introduction In this finl week we return to prmetrised curves nd consider integrtion long such curves. We lredy sw this in Week 2 when we integrted long curve to find its length.

More information

Student Activity 3: Single Factor ANOVA

Student Activity 3: Single Factor ANOVA MATH 40 Student Activity 3: Single Fctor ANOVA Some Bsic Concepts In designed experiment, two or more tretments, or combintions of tretments, is pplied to experimentl units The number of tretments, whether

More information

Goals: Determine how to calculate the area described by a function. Define the definite integral. Explore the relationship between the definite

Goals: Determine how to calculate the area described by a function. Define the definite integral. Explore the relationship between the definite Unit #8 : The Integrl Gols: Determine how to clculte the re described by function. Define the definite integrl. Eplore the reltionship between the definite integrl nd re. Eplore wys to estimte the definite

More information

Chapters 4 & 5 Integrals & Applications

Chapters 4 & 5 Integrals & Applications Contents Chpters 4 & 5 Integrls & Applictions Motivtion to Chpters 4 & 5 2 Chpter 4 3 Ares nd Distnces 3. VIDEO - Ares Under Functions............................................ 3.2 VIDEO - Applictions

More information

2D1431 Machine Learning Lab 3: Reinforcement Learning

2D1431 Machine Learning Lab 3: Reinforcement Learning 2D1431 Mchine Lerning Lb 3: Reinforcement Lerning Frnk Hoffmnn modified by Örjn Ekeberg December 7, 2004 1 Introduction In this lb you will lern bout dynmic progrmming nd reinforcement lerning. It is ssumed

More information

State space systems analysis (continued) Stability. A. Definitions A system is said to be Asymptotically Stable (AS) when it satisfies

State space systems analysis (continued) Stability. A. Definitions A system is said to be Asymptotically Stable (AS) when it satisfies Stte spce systems nlysis (continued) Stbility A. Definitions A system is sid to be Asymptoticlly Stble (AS) when it stisfies ut () = 0, t > 0 lim xt () 0. t A system is AS if nd only if the impulse response

More information

Riemann Sums and Riemann Integrals

Riemann Sums and Riemann Integrals Riemnn Sums nd Riemnn Integrls Jmes K. Peterson Deprtment of Biologicl Sciences nd Deprtment of Mthemticl Sciences Clemson University August 26, 2013 Outline 1 Riemnn Sums 2 Riemnn Integrls 3 Properties

More information

CMDA 4604: Intermediate Topics in Mathematical Modeling Lecture 19: Interpolation and Quadrature

CMDA 4604: Intermediate Topics in Mathematical Modeling Lecture 19: Interpolation and Quadrature CMDA 4604: Intermedite Topics in Mthemticl Modeling Lecture 19: Interpoltion nd Qudrture In this lecture we mke brief diversion into the res of interpoltion nd qudrture. Given function f C[, b], we sy

More information

Week 10: Riemann integral and its properties

Week 10: Riemann integral and its properties Clculus nd Liner Algebr for Biomedicl Engineering Week 10: Riemnn integrl nd its properties H. Führ, Lehrstuhl A für Mthemtik, RWTH Achen, WS 07 Motivtion: Computing flow from flow rtes 1 We observe the

More information

Reinforcement Learning

Reinforcement Learning Reinforcement Lerning Tom Mitchell, Mchine Lerning, chpter 13 Outline Introduction Comprison with inductive lerning Mrkov Decision Processes: the model Optiml policy: The tsk Q Lerning: Q function Algorithm

More information

Estimation of Binomial Distribution in the Light of Future Data

Estimation of Binomial Distribution in the Light of Future Data British Journl of Mthemtics & Computer Science 102: 1-7, 2015, Article no.bjmcs.19191 ISSN: 2231-0851 SCIENCEDOMAIN interntionl www.sciencedomin.org Estimtion of Binomil Distribution in the Light of Future

More information

Advanced Calculus: MATH 410 Notes on Integrals and Integrability Professor David Levermore 17 October 2004

Advanced Calculus: MATH 410 Notes on Integrals and Integrability Professor David Levermore 17 October 2004 Advnced Clculus: MATH 410 Notes on Integrls nd Integrbility Professor Dvid Levermore 17 October 2004 1. Definite Integrls In this section we revisit the definite integrl tht you were introduced to when

More information

Chapter 5 : Continuous Random Variables

Chapter 5 : Continuous Random Variables STAT/MATH 395 A - PROBABILITY II UW Winter Qurter 216 Néhémy Lim Chpter 5 : Continuous Rndom Vribles Nottions. N {, 1, 2,...}, set of nturl numbers (i.e. ll nonnegtive integers); N {1, 2,...}, set of ll

More information

The Regulated and Riemann Integrals

The Regulated and Riemann Integrals Chpter 1 The Regulted nd Riemnn Integrls 1.1 Introduction We will consider severl different pproches to defining the definite integrl f(x) dx of function f(x). These definitions will ll ssign the sme vlue

More information

Handout: Natural deduction for first order logic

Handout: Natural deduction for first order logic MATH 457 Introduction to Mthemticl Logic Spring 2016 Dr Json Rute Hndout: Nturl deduction for first order logic We will extend our nturl deduction rules for sententil logic to first order logic These notes

More information

We will see what is meant by standard form very shortly

We will see what is meant by standard form very shortly THEOREM: For fesible liner progrm in its stndrd form, the optimum vlue of the objective over its nonempty fesible region is () either unbounded or (b) is chievble t lest t one extreme point of the fesible

More information

Numerical integration

Numerical integration 2 Numericl integrtion This is pge i Printer: Opque this 2. Introduction Numericl integrtion is problem tht is prt of mny problems in the economics nd econometrics literture. The orgniztion of this chpter

More information

We partition C into n small arcs by forming a partition of [a, b] by picking s i as follows: a = s 0 < s 1 < < s n = b.

We partition C into n small arcs by forming a partition of [a, b] by picking s i as follows: a = s 0 < s 1 < < s n = b. Mth 255 - Vector lculus II Notes 4.2 Pth nd Line Integrls We begin with discussion of pth integrls (the book clls them sclr line integrls). We will do this for function of two vribles, but these ides cn

More information

ADVANCEMENT OF THE CLOSELY COUPLED PROBES POTENTIAL DROP TECHNIQUE FOR NDE OF SURFACE CRACKS

ADVANCEMENT OF THE CLOSELY COUPLED PROBES POTENTIAL DROP TECHNIQUE FOR NDE OF SURFACE CRACKS ADVANCEMENT OF THE CLOSELY COUPLED PROBES POTENTIAL DROP TECHNIQUE FOR NDE OF SURFACE CRACKS F. Tkeo 1 nd M. Sk 1 Hchinohe Ntionl College of Technology, Hchinohe, Jpn; Tohoku University, Sendi, Jpn Abstrct:

More information

Chapter 3 Polynomials

Chapter 3 Polynomials Dr M DRAIEF As described in the introduction of Chpter 1, pplictions of solving liner equtions rise in number of different settings In prticulr, we will in this chpter focus on the problem of modelling

More information

Probabilistic Investigation of Sensitivities of Advanced Test- Analysis Model Correlation Methods

Probabilistic Investigation of Sensitivities of Advanced Test- Analysis Model Correlation Methods Probbilistic Investigtion of Sensitivities of Advnced Test- Anlysis Model Correltion Methods Liz Bergmn, Mtthew S. Allen, nd Dniel C. Kmmer Dept. of Engineering Physics University of Wisconsin-Mdison Rndll

More information

Applicable Analysis and Discrete Mathematics available online at

Applicable Analysis and Discrete Mathematics available online at Applicble Anlysis nd Discrete Mthemtics vilble online t http://pefmth.etf.rs Appl. Anl. Discrete Mth. 4 (2010), 23 31. doi:10.2298/aadm100201012k NUMERICAL ANALYSIS MEETS NUMBER THEORY: USING ROOTFINDING

More information

Chapter 3 MATRIX. In this chapter: 3.1 MATRIX NOTATION AND TERMINOLOGY

Chapter 3 MATRIX. In this chapter: 3.1 MATRIX NOTATION AND TERMINOLOGY Chpter 3 MTRIX In this chpter: Definition nd terms Specil Mtrices Mtrix Opertion: Trnspose, Equlity, Sum, Difference, Sclr Multipliction, Mtrix Multipliction, Determinnt, Inverse ppliction of Mtrix in

More information

Physics 116C Solution of inhomogeneous ordinary differential equations using Green s functions

Physics 116C Solution of inhomogeneous ordinary differential equations using Green s functions Physics 6C Solution of inhomogeneous ordinry differentil equtions using Green s functions Peter Young November 5, 29 Homogeneous Equtions We hve studied, especilly in long HW problem, second order liner

More information

Vyacheslav Telnin. Search for New Numbers.

Vyacheslav Telnin. Search for New Numbers. Vycheslv Telnin Serch for New Numbers. 1 CHAPTER I 2 I.1 Introduction. In 1984, in the first issue for tht yer of the Science nd Life mgzine, I red the rticle "Non-Stndrd Anlysis" by V. Uspensky, in which

More information

The final exam will take place on Friday May 11th from 8am 11am in Evans room 60.

The final exam will take place on Friday May 11th from 8am 11am in Evans room 60. Mth 104: finl informtion The finl exm will tke plce on Fridy My 11th from 8m 11m in Evns room 60. The exm will cover ll prts of the course with equl weighting. It will cover Chpters 1 5, 7 15, 17 21, 23

More information

Discrete Mathematics and Probability Theory Summer 2014 James Cook Note 17

Discrete Mathematics and Probability Theory Summer 2014 James Cook Note 17 CS 70 Discrete Mthemtics nd Proility Theory Summer 2014 Jmes Cook Note 17 I.I.D. Rndom Vriles Estimting the is of coin Question: We wnt to estimte the proportion p of Democrts in the US popultion, y tking

More information

Discrete Mathematics and Probability Theory Spring 2013 Anant Sahai Lecture 17

Discrete Mathematics and Probability Theory Spring 2013 Anant Sahai Lecture 17 EECS 70 Discrete Mthemtics nd Proility Theory Spring 2013 Annt Shi Lecture 17 I.I.D. Rndom Vriles Estimting the is of coin Question: We wnt to estimte the proportion p of Democrts in the US popultion,

More information

Here we study square linear systems and properties of their coefficient matrices as they relate to the solution set of the linear system.

Here we study square linear systems and properties of their coefficient matrices as they relate to the solution set of the linear system. Section 24 Nonsingulr Liner Systems Here we study squre liner systems nd properties of their coefficient mtrices s they relte to the solution set of the liner system Let A be n n Then we know from previous

More information

Zero-Sum Magic Graphs and Their Null Sets

Zero-Sum Magic Graphs and Their Null Sets Zero-Sum Mgic Grphs nd Their Null Sets Ebrhim Slehi Deprtment of Mthemticl Sciences University of Nevd Ls Vegs Ls Vegs, NV 89154-4020. ebrhim.slehi@unlv.edu Abstrct For ny h N, grph G = (V, E) is sid to

More information

Numerical Integration

Numerical Integration Chpter 5 Numericl Integrtion Numericl integrtion is the study of how the numericl vlue of n integrl cn be found. Methods of function pproximtion discussed in Chpter??, i.e., function pproximtion vi the

More information

The Properties of Stars

The Properties of Stars 10/11/010 The Properties of Strs sses Using Newton s Lw of Grvity to Determine the ss of Celestil ody ny two prticles in the universe ttrct ech other with force tht is directly proportionl to the product

More information

Strong Bisimulation. Overview. References. Actions Labeled transition system Transition semantics Simulation Bisimulation

Strong Bisimulation. Overview. References. Actions Labeled transition system Transition semantics Simulation Bisimulation Strong Bisimultion Overview Actions Lbeled trnsition system Trnsition semntics Simultion Bisimultion References Robin Milner, Communiction nd Concurrency Robin Milner, Communicting nd Mobil Systems 32

More information

Math 61CM - Solutions to homework 9

Math 61CM - Solutions to homework 9 Mth 61CM - Solutions to homework 9 Cédric De Groote November 30 th, 2018 Problem 1: Recll tht the left limit of function f t point c is defined s follows: lim f(x) = l x c if for ny > 0 there exists δ

More information

Acceptance Sampling by Attributes

Acceptance Sampling by Attributes Introduction Acceptnce Smpling by Attributes Acceptnce smpling is concerned with inspection nd decision mking regrding products. Three spects of smpling re importnt: o Involves rndom smpling of n entire

More information

Riemann is the Mann! (But Lebesgue may besgue to differ.)

Riemann is the Mann! (But Lebesgue may besgue to differ.) Riemnn is the Mnn! (But Lebesgue my besgue to differ.) Leo Livshits My 2, 2008 1 For finite intervls in R We hve seen in clss tht every continuous function f : [, b] R hs the property tht for every ɛ >

More information

Improper Integrals. Type I Improper Integrals How do we evaluate an integral such as

Improper Integrals. Type I Improper Integrals How do we evaluate an integral such as Improper Integrls Two different types of integrls cn qulify s improper. The first type of improper integrl (which we will refer to s Type I) involves evluting n integrl over n infinite region. In the grph

More information

Recitation 3: Applications of the Derivative. 1 Higher-Order Derivatives and their Applications

Recitation 3: Applications of the Derivative. 1 Higher-Order Derivatives and their Applications Mth 1c TA: Pdric Brtlett Recittion 3: Applictions of the Derivtive Week 3 Cltech 013 1 Higher-Order Derivtives nd their Applictions Another thing we could wnt to do with the derivtive, motivted by wht

More information

Calculus and linear algebra for biomedical engineering Week 11: The Riemann integral and its properties

Calculus and linear algebra for biomedical engineering Week 11: The Riemann integral and its properties Clculus nd liner lgebr for biomedicl engineering Week 11: The Riemnn integrl nd its properties Hrtmut Führ fuehr@mth.rwth-chen.de Lehrstuhl A für Mthemtik, RWTH Achen Jnury 9, 2009 Overview 1 Motivtion:

More information

Chapter 6 Notes, Larson/Hostetler 3e

Chapter 6 Notes, Larson/Hostetler 3e Contents 6. Antiderivtives nd the Rules of Integrtion.......................... 6. Are nd the Definite Integrl.................................. 6.. Are............................................ 6. Reimnn

More information

Riemann Integrals and the Fundamental Theorem of Calculus

Riemann Integrals and the Fundamental Theorem of Calculus Riemnn Integrls nd the Fundmentl Theorem of Clculus Jmes K. Peterson Deprtment of Biologicl Sciences nd Deprtment of Mthemticl Sciences Clemson University September 16, 2013 Outline Grphing Riemnn Sums

More information

13: Diffusion in 2 Energy Groups

13: Diffusion in 2 Energy Groups 3: Diffusion in Energy Groups B. Rouben McMster University Course EP 4D3/6D3 Nucler Rector Anlysis (Rector Physics) 5 Sept.-Dec. 5 September Contents We study the diffusion eqution in two energy groups

More information

Continuous Random Variables

Continuous Random Variables STAT/MATH 395 A - PROBABILITY II UW Winter Qurter 217 Néhémy Lim Continuous Rndom Vribles Nottion. The indictor function of set S is rel-vlued function defined by : { 1 if x S 1 S (x) if x S Suppose tht

More information

1 The Riemann Integral

1 The Riemann Integral The Riemnn Integrl. An exmple leding to the notion of integrl (res) We know how to find (i.e. define) the re of rectngle (bse height), tringle ( (sum of res of tringles). But how do we find/define n re

More information

The Algebra (al-jabr) of Matrices

The Algebra (al-jabr) of Matrices Section : Mtri lgebr nd Clculus Wshkewicz College of Engineering he lgebr (l-jbr) of Mtrices lgebr s brnch of mthemtics is much broder thn elementry lgebr ll of us studied in our high school dys. In sense

More information

Math 520 Final Exam Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008

Math 520 Final Exam Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008 Mth 520 Finl Exm Topic Outline Sections 1 3 (Xio/Dums/Liw) Spring 2008 The finl exm will be held on Tuesdy, My 13, 2-5pm in 117 McMilln Wht will be covered The finl exm will cover the mteril from ll of

More information

Theoretical foundations of Gaussian quadrature

Theoretical foundations of Gaussian quadrature Theoreticl foundtions of Gussin qudrture 1 Inner product vector spce Definition 1. A vector spce (or liner spce) is set V = {u, v, w,...} in which the following two opertions re defined: (A) Addition of

More information

Lecture Note 9: Orthogonal Reduction

Lecture Note 9: Orthogonal Reduction MATH : Computtionl Methods of Liner Algebr 1 The Row Echelon Form Lecture Note 9: Orthogonl Reduction Our trget is to solve the norml eution: Xinyi Zeng Deprtment of Mthemticl Sciences, UTEP A t Ax = A

More information

The steps of the hypothesis test

The steps of the hypothesis test ttisticl Methods I (EXT 7005) Pge 78 Mosquito species Time of dy A B C Mid morning 0.0088 5.4900 5.5000 Mid Afternoon.3400 0.0300 0.8700 Dusk 0.600 5.400 3.000 The Chi squre test sttistic is the sum of

More information

Spectral Regularization for Max-Margin Sequence Tagging

Spectral Regularization for Max-Margin Sequence Tagging Spectrl Regulriztion for Mx-Mrgin Sequence Tgging Aridn Quttoni Borj Blle Xvier Crrers Amir Globerson pq Universitt Politècnic de Ctluny now t p q McGill University p q The Hebrew University of Jeruslem

More information

N 0 completions on partial matrices

N 0 completions on partial matrices N 0 completions on prtil mtrices C. Jordán C. Mendes Arújo Jun R. Torregros Instituto de Mtemátic Multidisciplinr / Centro de Mtemátic Universidd Politécnic de Vlenci / Universidde do Minho Cmino de Ver

More information

A general framework for estimating similarity of datasets and decision trees: exploring semantic similarity of decision trees

A general framework for estimating similarity of datasets and decision trees: exploring semantic similarity of decision trees A generl frmework for estimting similrity of dtsets nd decision trees: exploring semntic similrity of decision trees Irene Ntoutsi Alexndros Klousis Ynnis Theodoridis Abstrct Decision trees re mong the

More information

Lecture 21: Order statistics

Lecture 21: Order statistics Lecture : Order sttistics Suppose we hve N mesurements of sclr, x i =, N Tke ll mesurements nd sort them into scending order x x x 3 x N Define the mesured running integrl S N (x) = 0 for x < x = i/n for

More information

Improper Integrals, and Differential Equations

Improper Integrals, and Differential Equations Improper Integrls, nd Differentil Equtions October 22, 204 5.3 Improper Integrls Previously, we discussed how integrls correspond to res. More specificlly, we sid tht for function f(x), the region creted

More information

Travelling Profile Solutions For Nonlinear Degenerate Parabolic Equation And Contour Enhancement In Image Processing

Travelling Profile Solutions For Nonlinear Degenerate Parabolic Equation And Contour Enhancement In Image Processing Applied Mthemtics E-Notes 8(8) - c IN 67-5 Avilble free t mirror sites of http://www.mth.nthu.edu.tw/ men/ Trvelling Profile olutions For Nonliner Degenerte Prbolic Eqution And Contour Enhncement In Imge

More information

MATH 115 FINAL EXAM. April 25, 2005

MATH 115 FINAL EXAM. April 25, 2005 MATH 115 FINAL EXAM April 25, 2005 NAME: Solution Key INSTRUCTOR: SECTION NO: 1. Do not open this exm until you re told to begin. 2. This exm hs 9 pges including this cover. There re 9 questions. 3. Do

More information

A Signal-Level Fusion Model for Image-Based Change Detection in DARPA's Dynamic Database System

A Signal-Level Fusion Model for Image-Based Change Detection in DARPA's Dynamic Database System SPIE Aerosense 001 Conference on Signl Processing, Sensor Fusion, nd Trget Recognition X, April 16-0, Orlndo FL. (Minor errors in published version corrected.) A Signl-Level Fusion Model for Imge-Bsed

More information

INTRODUCTION TO INTEGRATION

INTRODUCTION TO INTEGRATION INTRODUCTION TO INTEGRATION 5.1 Ares nd Distnces Assume f(x) 0 on the intervl [, b]. Let A be the re under the grph of f(x). b We will obtin n pproximtion of A in the following three steps. STEP 1: Divide

More information

Best Approximation. Chapter The General Case

Best Approximation. Chapter The General Case Chpter 4 Best Approximtion 4.1 The Generl Cse In the previous chpter, we hve seen how n interpolting polynomil cn be used s n pproximtion to given function. We now wnt to find the best pproximtion to given

More information

ICP Algorithm for Alignment of Stars from Astronomical Photographic Images

ICP Algorithm for Alignment of Stars from Astronomical Photographic Images Interntionl Conference on Computer Systems nd echnologies - CompSysech 010 ICP Algorithm for Alignment of Strs from Astronomicl Photogrphic Imges Alexnder Mrinov, Ndezhd Zltev Abstrct: his rticle proposes

More information

Data Assimilation. Alan O Neill Data Assimilation Research Centre University of Reading

Data Assimilation. Alan O Neill Data Assimilation Research Centre University of Reading Dt Assimiltion Aln O Neill Dt Assimiltion Reserch Centre University of Reding Contents Motivtion Univrite sclr dt ssimiltion Multivrite vector dt ssimiltion Optiml Interpoltion BLUE 3d-Vritionl Method

More information

Natural examples of rings are the ring of integers, a ring of polynomials in one variable, the ring

Natural examples of rings are the ring of integers, a ring of polynomials in one variable, the ring More generlly, we define ring to be non-empty set R hving two binry opertions (we ll think of these s ddition nd multipliction) which is n Abelin group under + (we ll denote the dditive identity by 0),

More information

Physics 202H - Introductory Quantum Physics I Homework #08 - Solutions Fall 2004 Due 5:01 PM, Monday 2004/11/15

Physics 202H - Introductory Quantum Physics I Homework #08 - Solutions Fall 2004 Due 5:01 PM, Monday 2004/11/15 Physics H - Introductory Quntum Physics I Homework #8 - Solutions Fll 4 Due 5:1 PM, Mondy 4/11/15 [55 points totl] Journl questions. Briefly shre your thoughts on the following questions: Of the mteril

More information

Elements of Matrix Algebra

Elements of Matrix Algebra Elements of Mtrix Algebr Klus Neusser Kurt Schmidheiny September 30, 2015 Contents 1 Definitions 2 2 Mtrix opertions 3 3 Rnk of Mtrix 5 4 Specil Functions of Qudrtic Mtrices 6 4.1 Trce of Mtrix.........................

More information

Chapter 0. What is the Lebesgue integral about?

Chapter 0. What is the Lebesgue integral about? Chpter 0. Wht is the Lebesgue integrl bout? The pln is to hve tutoril sheet ech week, most often on Fridy, (to be done during the clss) where you will try to get used to the ides introduced in the previous

More information

Partial Derivatives. Limits. For a single variable function f (x), the limit lim

Partial Derivatives. Limits. For a single variable function f (x), the limit lim Limits Prtil Derivtives For single vrible function f (x), the limit lim x f (x) exists only if the right-hnd side limit equls to the left-hnd side limit, i.e., lim f (x) = lim f (x). x x + For two vribles

More information

Reversals of Signal-Posterior Monotonicity for Any Bounded Prior

Reversals of Signal-Posterior Monotonicity for Any Bounded Prior Reversls of Signl-Posterior Monotonicity for Any Bounded Prior Christopher P. Chmbers Pul J. Hely Abstrct Pul Milgrom (The Bell Journl of Economics, 12(2): 380 391) showed tht if the strict monotone likelihood

More information

GRADE 4. Division WORKSHEETS

GRADE 4. Division WORKSHEETS GRADE Division WORKSHEETS Division division is shring nd grouping Division cn men shring or grouping. There re cndies shred mong kids. How mny re in ech shre? = 3 There re 6 pples nd go into ech bsket.

More information