Library-based coding: a representation for efficient video compression and retrieval. MIT Media Lab.
Nuno Vasconcelos and Andrew Lippman
MIT Media Lab, {nuno,lip}@media.mit.edu

Abstract

The ubiquity of networking and computational capacity associated with the new communications media unveil a universe of new requirements for image representations. Among such requirements is the ability of the representation used for coding to support higher-level tasks such as content-based retrieval. In this paper, we explore the relationships between probabilistic modeling and data compression to introduce a representation - library-based coding - which, by enabling retrieval in the compressed domain, satisfies this requirement. Because it contains an embedded probabilistic description of the source, this new representation allows the construction of good inference models without compromise of compression efficiency, leads to very efficient procedures for query and retrieval, and provides a framework for higher-level tasks such as the analysis and classification of video shots.

1 Introduction

The introduction of digital communications and inexpensive computation originated a shift from image representations based on very simple primitives (sinusoids) and operations (filtering and modulation) towards others based on more involved languages, capable of exploiting the statistical characteristics of video sources and the characteristics of the Human Visual System, or providing higher-level descriptions of scene content. While this shift allowed more efficient use of the available bandwidth, it has also uncovered a universe of new requirements for image representations. Because digital decoding requires computational capacity at the receiving end of the channel, it leads to smart information and entertainment appliances capable of two-way communication under the control of the user [8]. Digital decoders are intelligent, and capable of searching the network for the content that is "just right" for their users. Unfortunately, current digital representations, such as JPEG or MPEG, designed with the sole goal of achieving compression efficiency, are not helpful for this task.
Perhaps due to this, most of the recent effort in the area of content-based retrieval considers this task independently of the issue of bandwidth efficiency. Typical solutions consider a feature space for retrieval which does not overlap with the representation space used for compression, e.g. while compression is based on DCT [6] or wavelet basis functions [1], retrieval is based on color histograms [5] or texture features [7]. Such solutions imply that either the features are pre-computed and stored in addition to the compressed bitstreams - a process which is inefficient in terms of storage resources - or the bitstreams must be decoded and the features computed at the time of query - a procedure which is computationally very inefficient since all the work performed at the time of image encoding is useless for the task of retrieval. Even when the spaces used for retrieval and compression overlap and full image reconstruction is not required (e.g. when wavelet or DCT coefficients are used for retrieval), the process still suffers from inefficiencies: these coefficients are in general a sizeable portion of the bitstream and their decoding can, therefore, still be an expensive task, and it is generally not clear that the feature space associated with them is the most appropriate for statistical discrimination, or that it allows the construction of good models for statistical inference.

The fundamental goal of this work is to restore some of the production options or interactive potential of video by augmenting the representation used in coding and by exploiting the analysis used for compression as a retrieval aid. The notion is that the analysis used in making an efficient coder can potentially provide useful cues to the underlying action in the scene that may facilitate browsing, filtering, sorting or combining sets of moving picture sequences. For this we introduce a representation, library-based coding, which contains an embedded description of the picture content and allows the construction of good inference models without compromise of compression efficiency. This embedded description is compact and can be decoded independently of the bulk of the digital bitstream, leading to very efficient procedures for query and retrieval.
Furthermore, because the representation allows the construction of probabilistic models for statistical inference, it provides a framework for performing higher-level tasks such as analyzing and classifying video shots. Finally, because it is close to the representations currently used for video compression, it requires only slight alterations to the existing standards.

2 Embedded probabilistic descriptions

A representation capable of supporting content-based queries without full decoding should include a compact description of the statistical properties of the image source, decodable on its own. Ideally, one would want a complete probabilistic description, such as the probability density function (pdf) of the stochastic process from which the images were drawn, because that would allow retrieval to be based on statistical inferences. For example, given the probability densities associated with M sources, P(x|S_i), i = 1, \ldots, M, their relative probabilities of occurrence P(S_i), and a set of data x created from images to be classified as belonging to one of the classes, the task could be performed optimally by using a Maximum a Posteriori Probability (MAP) criterion and Bayes rule, i.e. pick the class i^* such that

  i^* = \arg\max_{i \in \{1,\ldots,M\}} P(S_i|x) = \arg\max_{i \in \{1,\ldots,M\}} P(x|S_i) P(S_i).   (1)

One possible way to achieve this would be to use a non-parametric description of the source, such as a histogram. However, non-parametric descriptions are not compact, implying a degradation of the efficiency of the representation in terms of compression, and it is not clear how they could be used as a part of the encoding procedure, i.e. not simply as overhead. Fortunately, the alternative of using parametric descriptors, in particular mixture densities, satisfies these two requisites, providing a much more efficient solution.

2.1 Parametric modeling, mixture densities, and the EM algorithm

A parametric probabilistic model capable of approximating any probability density is the class of mixture densities [9]. A mixture density has the form

  P(x) = \sum_{i=1}^{C} P(x|\omega_i) P(\omega_i)   (2)

where C is the number of probability classes, P(x|\omega_i) are the class-conditional densities, and P(\omega_i), i = 1, \ldots, C, the class probabilities (\sum_{i=1}^{C} P(\omega_i) = 1). The class-conditional densities can be any valid probability density functions, even though they are in most of the applications (and in the rest of this paper) assumed to be Gaussian. In this case, the mixture density becomes

  P(x) = \sum_{i=1}^{C} \frac{p_i}{\sqrt{(2\pi)^n |\Sigma_i|}} e^{-\frac{1}{2}(x - \mu_i)^T \Sigma_i^{-1}(x - \mu_i)}   (3)

where p_i = P(\omega_i). The mixture is then completely characterized by the parameters L^{(s)} = \{\mu_i^{(s)}, \Sigma_i^{(s)}, p_i^{(s)}, i = 1, \ldots, C\}.

The standard statistical tool for the estimation of mixture parameters is the Expectation-Maximization (EM) algorithm [3]. The EM algorithm treats the class assignments (i.e. which class is responsible for each sample) as hidden (non-observed) variables and, given a set of M independent and identically distributed samples x_m, m = 1, \ldots, M, finds the mixture parameters that maximize the data likelihood by iterating between the following steps, for i = 1, \ldots, C and m = 1, \ldots, M.

E-step:

  h_{im} = P(\omega_i|x_m) = \frac{P(x_m|\omega_i) p_i}{\sum_{k=1}^{C} P(x_m|\omega_k) p_k}   (4)

M-step:

  \mu_i^{new} = \frac{\sum_m h_{im} x_m}{\sum_m h_{im}}, \quad
  \Sigma_i^{new} = \frac{\sum_m h_{im} (x_m - \mu_i^{new})(x_m - \mu_i^{new})^T}{\sum_m h_{im}}, \quad
  p_i^{new} = \frac{1}{M} \sum_m h_{im}.   (5)
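As a concrete illustration, one EM iteration for a Gaussian mixture (the E-step and M-step above) might be written as follows. This is a minimal numpy sketch with illustrative variable names, not the authors' implementation.

```python
import numpy as np

def em_step(x, mu, sigma, p):
    """One EM iteration. x: (M, d) samples; mu: (C, d) means;
    sigma: (C, d, d) covariances; p: (C,) class probabilities."""
    M, d = x.shape
    C = mu.shape[0]
    # E-step: posterior responsibilities h[i, m] = P(class i | sample m).
    h = np.empty((C, M))
    for i in range(C):
        diff = x - mu[i]
        inv = np.linalg.inv(sigma[i])
        norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(sigma[i]))
        expo = -0.5 * np.einsum('md,de,me->m', diff, inv, diff)
        h[i] = p[i] * np.exp(expo) / norm
    h /= h.sum(axis=0, keepdims=True)          # normalize over classes
    # M-step: re-estimate means, covariances, and class probabilities.
    w = h.sum(axis=1)                          # (C,) total responsibility
    mu_new = (h @ x) / w[:, None]
    sigma_new = np.empty_like(sigma)
    for i in range(C):
        diff = x - mu_new[i]
        sigma_new[i] = (h[i, :, None, None]
                        * (diff[:, :, None] * diff[:, None, :])).sum(0) / w[i]
    p_new = w / M
    return mu_new, sigma_new, p_new
```

Iterating this step until the likelihood stops improving yields the mixture parameters L^{(s)} described above.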
Given an image to compress, the parameter set L^{(s)} can be estimated through the EM algorithm and included in the compressed bitstream, leading to a compact, stand-alone description that can be used to perform statistical inferences such as those mentioned above. However, while this would increase the retrieval ability of the representation, it would also compromise its compression efficiency, as this description would amount to pure overhead. The interesting missing link is that mixture parameters are very closely related to the codebooks which form the basis of a representation which is known to be optimal from a compression standpoint: vector quantization (VQ) [4].

2.2 Vector quantization

A vector quantizer Q is a mapping from a K-dimensional vector space of input samples to a finite set of reconstruction vectors, usually known as codevectors or codewords. The set of reconstruction vectors is generally designated by codebook. The K-dimensional input vector space is partitioned into a set C of N K-dimensional regions R_i, also known as partitions or cells, with a reconstruction vector y_i associated with each region. The non-linearity inherent to the operation of quantization makes it impossible to achieve a single, closed-form solution to the problem of optimal vector quantization. It is however possible to find two necessary conditions for optimality by decomposing the problem into two smaller ones: finding the optimal partition for a given codebook, and the optimal codebook for a given partition.
The optimal partition (encoder) for a fixed codebook (decoder) must satisfy the nearest-neighbor condition

  R_i \subset \{x : d(x, y_i) \leq d(x, y_j), \forall j \neq i\}   (6)

while the optimal codebook for a given partition must satisfy the generalized-centroid condition

  y_i = \arg\min_{y} E[d(x, y) | x \in R_i].   (7)

The most popular algorithm for vector quantizer design - the LBG algorithm [4] - iterates between these two conditions, which, given a training set T = \{t_1, \ldots, t_M\}, and assuming the mean-squared-error distortion metric, become, respectively,

  R_i = \{t \in T : \|t - y_i\|^2 < \|t - y_j\|^2, \forall j \neq i\}   (8)

and

  y_i = E[x | x \in R_i] = \frac{\sum_{j=1}^{M} t_j S_i(t_j)}{\sum_{j=1}^{M} S_i(t_j)}   (9)

where S_i(t_j) = 1 if t_j \in R_i, and S_i(t_j) = 0 otherwise.
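The LBG iteration between the two conditions above can be sketched as follows; a minimal numpy version under the mean-squared-error metric, with illustrative names rather than the paper's notation.

```python
import numpy as np

def lbg(train, codebook, iters=20):
    """LBG codebook design. train: (M, d) training vectors;
    codebook: (N, d) initial codewords; returns the refined codebook."""
    for _ in range(iters):
        # Nearest-neighbor condition: assign each training vector to the
        # closest codeword (squared Euclidean distance).
        d2 = ((train[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        assign = d2.argmin(axis=1)
        # Centroid condition: each codeword becomes the mean of its cell
        # (cells left empty keep their previous codeword).
        for i in range(codebook.shape[0]):
            members = train[assign == i]
            if len(members) > 0:
                codebook[i] = members.mean(axis=0)
    return codebook
```

Under the MSE metric this is the familiar k-means style alternation; other distortion measures change only the assignment and centroid rules.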
2.3 Relationship between mixture density estimation and vector quantization

To understand the relationship between vector quantization and estimation of mixture parameters, we start by re-writing the EM equations for the simpler case of equally likely classes (p_i = 1/C) and identity covariances.

E-step:

  h_{im} = \frac{P(x_m|\omega_i)}{\sum_{k=1}^{C} P(x_m|\omega_k)} = \frac{e^{-\frac{1}{2}\|x_m - \mu_i\|^2}}{\sum_{k=1}^{C} e^{-\frac{1}{2}\|x_m - \mu_k\|^2}}   (10)

M-step:

  \mu_i^{new} = \frac{\sum_m h_{im} x_m}{\sum_m h_{im}}.   (11)

The similarities with VQ design are made clear by the comparison of these expressions with equations 8 and 9. The only difference is that, in the VQ case, the h_{im}'s are thresholded after the E-step so that

  h'_{im} = 1 if h_{im} > h_{jm}, \forall j \neq i, and h'_{im} = 0 otherwise,   (12)

i.e. the training sample is assigned to the mixture component that has maximum a posteriori probability of having generated it. This transforms equation 10 into equation 8 [footnote 1], and consequently equation 11 into equation 9. Therefore, given a sample to encode, the optimal codeword to represent it is the mean of the mixture component which has maximum a posteriori probability of having generated the sample.

The results above can be extended to the generic Gaussian case with full covariances and class probabilities. In this case, the relationships between mixture density estimation and vector quantization are the same, but the VQ is optimal under the Mahalanobis distance, and has a constraint on its output entropy. I.e., given an input vector t, the optimal partition is the one for which

  R_j = \{t \in T : \|t - y_j\|^2_{\Sigma_j} - \log p_j < \|t - y_i\|^2_{\Sigma_i} - \log p_i, \forall i\}   (13)

where

  \|t - y_j\|^2_{\Sigma_j} = (t - y_j)^T \Sigma_j^{-1} (t - y_j).   (14)

The main conclusion is thus that vector quantization is simply EM estimation with MAP class assignments and, in practice, this means that the codebooks originated by VQ are a good approximation to the parameters of the mixture density that best describes the data. Or, if one designs the codebook with the EM algorithm and uses MAP decisions only after training is completed, the codebook will also provide an optimal estimate of the mixture parameters.
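The MAP thresholding of equation 12 is the single operation separating EM from VQ design; a tiny sketch of it, with illustrative names:

```python
import numpy as np

def map_threshold(h):
    """h: (C, M) soft EM responsibilities. Returns the hard 0/1
    assignments of VQ design: each sample (column) is assigned entirely
    to its maximum a posteriori component."""
    hard = np.zeros_like(h)
    hard[h.argmax(axis=0), np.arange(h.shape[1])] = 1.0
    return hard
```

Plugging these hard assignments into the M-step of equation 11 recovers the centroid rule of equation 9, which is the sense in which VQ is EM with MAP class assignments.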
Footnote 1: Notice that the denominator of equation 10 is simply a normalizing constant, equal for all h_{im}'s.

3 The library-based coder

The library-based coder builds on the similarities between EM and VQ design to obtain a representation that can be used to jointly address the issues of compression
and retrieval. For each frame, a codebook is designed and transmitted to the receiver. The frame is then encoded using VQ, and the quantization indexes transmitted as well. Because the codebook provides a probabilistic description of the source, it is all that needs to be decoded for the purposes of retrieval - the bulk of the data being decoded only when the frame is to be reconstructed. From the compression point of view, the scheme can be seen as a universal encoder, continuously adapted to the source probabilities. While, in theory, VQ has long been known to be the optimal compression scheme, in practice, because optimality is only achieved with large vector sizes and encoding complexity grows exponentially with vector size, vector quantizers have fallen short of providing the theoretically attainable performance. In fact, if block sizes and encoding complexity are to remain compatible with those of the current standards, codebooks will be limited to relatively small sizes, leading to reduced rates and poor image quality.

Figure 1: Block diagram of the library-based coder.

Due to this limitation, and the desire to keep the coding model as close to that of the current standards as possible, our implementation of the library-based coder is basically an extended version of MPEG, where the library is used as an additional predictor. In this setting, the library-based coder can be seen as retrieval-enabled MPEG, with the additional benefit of better prediction (through the library) during common events where block-matching fails. A block diagram of the complete coder is presented in figure 1. Each input frame is segmented into square blocks, which are then processed to minimize both temporal and spatial correlation. Two different prediction structures are used for temporal processing: the library-based predictor discussed above, and a conventional motion-compensated predictor.
By using the two prediction modes, it is possible to combine the higher efficiency of motion-compensated prediction in areas of translational or reduced-amplitude motion with the increased performance of library prediction in areas of non-translational motion, object occlusion, or where new objects are revealed. The encoding of the prediction error signal is similar to that used by MPEG-2 [6].
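A toy sketch of the per-block mode decision implied above: choose, for each block, whichever predictor leaves the smaller residual energy. The paper does not specify its actual rate-control or mode-selection rule, so everything here (function names, the energy criterion) is an illustrative assumption.

```python
import numpy as np

def choose_mode(block, mc_prediction, library):
    """block, mc_prediction: (d,) flattened pixel vectors;
    library: (C, d) codewords. Returns ('mc' or 'library', residual)."""
    mc_res = block - mc_prediction
    # Best library predictor: the codeword nearest to the block.
    idx = ((library - block) ** 2).sum(-1).argmin()
    lib_res = block - library[idx]
    # Keep the predictor with the smaller residual energy.
    if (mc_res ** 2).sum() <= (lib_res ** 2).sum():
        return 'mc', mc_res
    return 'library', lib_res
```

In a real coder the decision would also weigh the rate cost of signaling each mode, but the sketch captures the complementary roles of the two predictors.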
4 Content-based queries

Consider the task of finding the closest match in a database to a given query image. This task can be solved in several ways if the library-coded representation is used. The simplest of the solutions is probably to follow the route outlined in section 2. Assuming that the blocks of the query image, x_i, i = 1, \ldots, N, are independent samples of the same stochastic process, and that the images in the database are samples from M image sources S_k, then, using equation 1, the source with the highest probability of having generated the query image is

  s^* = \arg\max_{k \in \{1,\ldots,M\}} \prod_{i=1}^{N} P(x_i|S_k) P(S_k).   (15)

Given the library entries L^{(k)} = \{\mu_i^{(k)}, \Sigma_i^{(k)}, p_i^{(k)}, i = 1, \ldots, C\}, the conditional probabilities of equation 15 are computed through equation 3. In the absence of any prior knowledge about the relative source likelihoods, the term P(S_k) can be disregarded. In a more complete setting, prior probabilities can, however, be used to constrain the search. If, for example, the images in the database are annotated with text and the user specifies a preference for pictures containing "people", the retrieval engine can assign a high prior to all the images annotated with the "people" keyword, and a low prior to the remaining images, increasing the posterior likelihood of the images in the desired category. The point is that, unlike other types of features, because libraries are probabilistic descriptions of the source, they allow statistical reasoning and the construction of powerful search paradigms in the compressed domain.

A practical limitation of a solution based on equation 15 is that computing all the N conditional probabilities can still be an expensive task, which grows proportionally to the image size and implies decoding the query image if this is also originally compressed. An alternative, less expensive, solution consists in substituting the product of conditional probabilities by a function which measures the similarity between the probability densities associated with the query and the database images,

  s^* = \arg\max_{k \in \{1,\ldots,M\}} D[P(x|S_q), P(x|S_k)]   (16)

where S_q is the source of the query.
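Before turning to cheaper alternatives, the direct MAP rule of equation 15 might be sketched as follows: score the query's blocks under each database library's mixture and pick the best-scoring library. This is a minimal numpy sketch that assumes diagonal covariances for brevity; all names are illustrative.

```python
import numpy as np

def mixture_loglik(x, mu, var, p):
    """Log-likelihood of blocks x (M, d) under a diagonal-covariance
    Gaussian mixture with means mu (C, d), variances var (C, d),
    and component probabilities p (C,)."""
    d = x.shape[1]
    comp = []
    for i in range(mu.shape[0]):
        expo = -0.5 * ((x - mu[i]) ** 2 / var[i]).sum(-1)
        lognorm = -0.5 * (d * np.log(2 * np.pi) + np.log(var[i]).sum())
        comp.append(np.log(p[i]) + lognorm + expo)
    # Sum over mixture components (in log space), then over the
    # independent blocks: the log of the product in equation 15.
    return np.logaddexp.reduce(np.stack(comp), axis=0).sum()

def map_retrieve(query_blocks, libraries):
    """libraries: list of (mu, var, p) tuples, one per database image.
    Returns the index of the most likely source (uniform priors)."""
    scores = [mixture_loglik(query_blocks, *lib) for lib in libraries]
    return int(np.argmax(scores))
```

Working in log space avoids the underflow that the raw product over N blocks would cause.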
Even though any of the traditional similarity metrics, such as the Kullback-Leibler distance [2], could theoretically be used in equation 16, these metrics typically do not lead to simple closed-form expressions for Gaussian mixtures. We have, therefore, considered the following simpler metric, inspired by equation 15:

  s^* = \arg\max_{k \in \{1,\ldots,M\}} \prod_{i=1}^{C} P(\mu_i^{(q)}|S_k)   (17)

where the image blocks x_i are replaced by the means \mu_i^{(q)} of the Gaussians in the mixture associated with the query image. This reduces the number of conditional probabilities to be evaluated by approximately an order of magnitude, and allows searches that do not even require full decoding of the query image.
The metric above can be further simplified by noticing that, for a given x, the contribution of most of the Gaussians in equation 3 to P(x) will be negligible. In the extreme case of negligible overlap between the Gaussians in the mixture, at most one of these Gaussians will be responsible for the bulk of the probability. One can, therefore, use the approximation

  -\log P(\mu_i^{(q)}|S_k) \approx \min_{j \in \{1,\ldots,C\}} \{\frac{1}{2} \|\mu_i^{(q)} - \mu_j^{(k)}\|^2_{\Sigma_j^{(k)}} - \log p_j^{(k)}\}   (18)

obtaining

  s^* = \arg\min_{k \in \{1,\ldots,M\}} \sum_{i=1}^{C} \min_{j \in \{1,\ldots,C\}} \{\frac{1}{2} \|\mu_i^{(q)} - \mu_j^{(k)}\|^2_{\Sigma_j^{(k)}} - \log p_j^{(k)}\}.   (19)

Comparing equation 18 with the vector quantization expression of equation 13 once again makes explicit the connection between VQ and MAP probability estimation using mixture densities. Thus, under the assumption of separated Gaussians and the approximation of equation 17, the closest image to that used as a query is the one whose library is closest to the library associated with the query image in the traditional VQ sense.

5 Simulation Results

In this section we analyze the performance of the library-based coder. Because the compression efficiency of the coder was already studied in [10, 11], we now concentrate on its retrieval capabilities. Figure 2 depicts the distances between a query image and the images in a database of 700 frames containing various scenes taken from trailers of the movies "Terminal Velocity" and "East of Eden". Image 200 was used as a query example, and is presented inside a white frame. Notice that the measured distances agree with what should be expected from the retrieval system. The query image obviously has a null distance to itself. Next, the closest images are those in the same video shot and, for these images, the query distance increases gradually with the temporal distance to the query image. The following closest images are those in scenes with content similar to that of the query image, namely people floating in mid-air, and large areas of sky. Next come scenes which contain some degree of sky or water and, finally, at a significant distance, are the artificially generated graphics.
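The inter-library distance of equation 19, which underlies the experiments above, might be computed as in the following sketch. Diagonal covariances are assumed for brevity, and all names are illustrative rather than the paper's.

```python
import numpy as np

def library_distance(lib_q, lib_k):
    """Each library is (mu, var, p): (C, d) means, (C, d) diagonal
    variances, (C,) component probabilities. Smaller = better match."""
    mu_q, _, _ = lib_q
    mu_k, var_k, p_k = lib_k
    total = 0.0
    for m in mu_q:
        # Mahalanobis-style cost of explaining this query mean with each
        # component of library k, penalized by that component's -log prior.
        cost = 0.5 * ((m - mu_k) ** 2 / var_k).sum(-1) - np.log(p_k)
        total += cost.min()   # keep the best-matching component only
    return total
```

Ranking the database by this distance and returning the smallest values implements the arg-min over k in equation 19.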
This indicates that, as a metric, the inter-library distance is capable of fine discrimination between different types of content. Figure 3 shows examples of content-based retrieval on the same image database. It contains four images, each displaying the results of five queries. In all cases, each row corresponds to a query, with the query image shown on the left, followed by the four best matches (excluding itself) in the database, which are presented from left to right according to their similarity rank - most similar on the left, less similar on the right. The query images were selected randomly from the 700 frames in the sequence. A total of 130 queries were performed. The retrieval results were manually classified as good or bad matches. This is a relatively easy task in this setting, because it is easy to determine if the retrieved
image belongs to the same video shot as the query image. Table 1 presents the percentage of good matches, as a function of the rank of the retrieved image.

Figure 2: Example of the retrieval results achieved with the library-based coder. The graph shows the distance to the query frame (frame 200) versus frame number for the 700 images in the sequence. Key-frames are shown for most of the video shots. The query frame is presented with a white border.

References

[1] M. Antonini, M. Barlaud, P. Mathieu, and I. Daubechies. Image Coding Using Wavelet Transform. IEEE Trans. on Image Processing, Vol. 1, April.
[2] T. Cover and J. Thomas. Elements of Information Theory. John Wiley.
[3] A. Dempster, N. Laird, and D. Rubin. Maximum-likelihood from Incomplete Data via the EM Algorithm. J. of the Royal Statistical Society, B-39.
[4] A. Gersho and R. Gray. Vector Quantization and Signal Compression. Kluwer Academic Press.
[5] Y. Gong, H. Zhang, H. Chuan, and M. Sakauchi. An Image Database System with Content Capturing and Fast Image Indexing Abilities. In Proc. Int. Conf. on Multimedia Computing and Systems, May 1994, Boston, USA.
[6] ISO-IEC/JTC1/SC29/WG11. MPEG Test Model, MPEG93/457.
[7] F. Liu and R. Picard. Periodicity, directionality, and randomness: Wold features for image modeling and retrieval. Technical Report 320, MIT Media Laboratory Perceptual Computing Section, 1995.
[8] N. Negroponte. Being Digital. Alfred A. Knopf, Inc.
[9] D. Titterington, A. Smith, and U. Makov. Statistical Analysis of Finite Mixture Distributions. John Wiley.
[10] N. Vasconcelos. Library-based Image Coding using Vector Quantization of the Prediction Space. Master's thesis, Massachusetts Institute of Technology.
[11] N. Vasconcelos and A. Lippman. Library-based Image Coding. In Proc. Int. Conf. Acoustics, Speech, and Signal Processing, Adelaide, Australia.

Figure 3: 20 examples of retrieval from a database of 700 images. Each row contains the results of a retrieval, query image shown on the left followed by the closest match, second best match, etc.

Rank | Good matches (%)

Table 1: Percentage of good matches versus similarity rank for 130 random retrievals from a sequence of 700 frames.
More information3.1 ML and Empirical Distribution
67577 Intro. to Machne Learnng Fall semester, 2008/9 Lecture 3: Maxmum Lkelhood/ Maxmum Entropy Dualty Lecturer: Amnon Shashua Scrbe: Amnon Shashua 1 In the prevous lecture we defned the prncple of Maxmum
More informationCOMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS
Avalable onlne at http://sck.org J. Math. Comput. Sc. 3 (3), No., 6-3 ISSN: 97-537 COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS
More informationPulse Coded Modulation
Pulse Coded Modulaton PCM (Pulse Coded Modulaton) s a voce codng technque defned by the ITU-T G.711 standard and t s used n dgtal telephony to encode the voce sgnal. The frst step n the analog to dgtal
More informationCS 2750 Machine Learning. Lecture 5. Density estimation. CS 2750 Machine Learning. Announcements
CS 750 Machne Learnng Lecture 5 Densty estmaton Mlos Hauskrecht mlos@cs.ptt.edu 539 Sennott Square CS 750 Machne Learnng Announcements Homework Due on Wednesday before the class Reports: hand n before
More informationReport on Image warping
Report on Image warpng Xuan Ne, Dec. 20, 2004 Ths document summarzed the algorthms of our mage warpng soluton for further study, and there s a detaled descrpton about the mplementaton of these algorthms.
More informationarxiv:cs.cv/ Jun 2000
Correlaton over Decomposed Sgnals: A Non-Lnear Approach to Fast and Effectve Sequences Comparson Lucano da Fontoura Costa arxv:cs.cv/0006040 28 Jun 2000 Cybernetc Vson Research Group IFSC Unversty of São
More informationStructure and Drive Paul A. Jensen Copyright July 20, 2003
Structure and Drve Paul A. Jensen Copyrght July 20, 2003 A system s made up of several operatons wth flow passng between them. The structure of the system descrbes the flow paths from nputs to outputs.
More informationj) = 1 (note sigma notation) ii. Continuous random variable (e.g. Normal distribution) 1. density function: f ( x) 0 and f ( x) dx = 1
Random varables Measure of central tendences and varablty (means and varances) Jont densty functons and ndependence Measures of assocaton (covarance and correlaton) Interestng result Condtonal dstrbutons
More informationPsychology 282 Lecture #24 Outline Regression Diagnostics: Outliers
Psychology 282 Lecture #24 Outlne Regresson Dagnostcs: Outlers In an earler lecture we studed the statstcal assumptons underlyng the regresson model, ncludng the followng ponts: Formal statement of assumptons.
More informationIntroduction to information theory and data compression
Introducton to nformaton theory and data compresson Adel Magra, Emma Gouné, Irène Woo March 8, 207 Ths s the augmented transcrpt of a lecture gven by Luc Devroye on March 9th 207 for a Data Structures
More informationSTATISTICAL DISTRIBUTIONS OF DISCRETE WALSH HADAMARD TRANSFORM COEFFICIENTS OF NATURAL IMAGES
STATISTICAL DISTRIBUTIOS OF DISCRETE WALSH HADAMARD TRASFORM COEFFICIETS OF ATURAL IMAGES Vjay Kumar ath and Deepka Hazarka Department o Electroncs and Communcaton Engneerng, School o Engneerng, Tezpur
More informationSpace of ML Problems. CSE 473: Artificial Intelligence. Parameter Estimation and Bayesian Networks. Learning Topics
/7/7 CSE 73: Artfcal Intellgence Bayesan - Learnng Deter Fox Sldes adapted from Dan Weld, Jack Breese, Dan Klen, Daphne Koller, Stuart Russell, Andrew Moore & Luke Zettlemoyer What s Beng Learned? Space
More informationSupport Vector Machines. Vibhav Gogate The University of Texas at dallas
Support Vector Machnes Vbhav Gogate he Unversty of exas at dallas What We have Learned So Far? 1. Decson rees. Naïve Bayes 3. Lnear Regresson 4. Logstc Regresson 5. Perceptron 6. Neural networks 7. K-Nearest
More informationCommunication with AWGN Interference
Communcaton wth AWG Interference m {m } {p(m } Modulator s {s } r=s+n Recever ˆm AWG n m s a dscrete random varable(rv whch takes m wth probablty p(m. Modulator maps each m nto a waveform sgnal s m=m
More informationCompression in the Real World :Algorithms in the Real World. Compression in the Real World. Compression Outline
Compresson n the Real World 5-853:Algorthms n the Real World Data Compresson: Lectures and 2 Generc Fle Compresson Fles: gzp (LZ77), bzp (Burrows-Wheeler), BOA (PPM) Archvers: ARC (LZW), PKZp (LZW+) Fle
More informationLecture 7: Boltzmann distribution & Thermodynamics of mixing
Prof. Tbbtt Lecture 7 etworks & Gels Lecture 7: Boltzmann dstrbuton & Thermodynamcs of mxng 1 Suggested readng Prof. Mark W. Tbbtt ETH Zürch 13 März 018 Molecular Drvng Forces Dll and Bromberg: Chapters
More informationNumerical Heat and Mass Transfer
Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and
More informationWinter 2008 CS567 Stochastic Linear/Integer Programming Guest Lecturer: Xu, Huan
Wnter 2008 CS567 Stochastc Lnear/Integer Programmng Guest Lecturer: Xu, Huan Class 2: More Modelng Examples 1 Capacty Expanson Capacty expanson models optmal choces of the tmng and levels of nvestments
More informationExpectation Maximization Mixture Models HMMs
-755 Machne Learnng for Sgnal Processng Mture Models HMMs Class 9. 2 Sep 200 Learnng Dstrbutons for Data Problem: Gven a collecton of eamples from some data, estmate ts dstrbuton Basc deas of Mamum Lelhood
More informationA linear imaging system with white additive Gaussian noise on the observed data is modeled as follows:
Supplementary Note Mathematcal bacground A lnear magng system wth whte addtve Gaussan nose on the observed data s modeled as follows: X = R ϕ V + G, () where X R are the expermental, two-dmensonal proecton
More informationThe Geometry of Logit and Probit
The Geometry of Logt and Probt Ths short note s meant as a supplement to Chapters and 3 of Spatal Models of Parlamentary Votng and the notaton and reference to fgures n the text below s to those two chapters.
More informationLecture 10 Support Vector Machines II
Lecture 10 Support Vector Machnes II 22 February 2016 Taylor B. Arnold Yale Statstcs STAT 365/665 1/28 Notes: Problem 3 s posted and due ths upcomng Frday There was an early bug n the fake-test data; fxed
More informationNotes on Frequency Estimation in Data Streams
Notes on Frequency Estmaton n Data Streams In (one of) the data streamng model(s), the data s a sequence of arrvals a 1, a 2,..., a m of the form a j = (, v) where s the dentty of the tem and belongs to
More informationErrors for Linear Systems
Errors for Lnear Systems When we solve a lnear system Ax b we often do not know A and b exactly, but have only approxmatons  and ˆb avalable. Then the best thng we can do s to solve ˆx ˆb exactly whch
More information2016 Wiley. Study Session 2: Ethical and Professional Standards Application
6 Wley Study Sesson : Ethcal and Professonal Standards Applcaton LESSON : CORRECTION ANALYSIS Readng 9: Correlaton and Regresson LOS 9a: Calculate and nterpret a sample covarance and a sample correlaton
More informationMaximum Likelihood Estimation (MLE)
Maxmum Lkelhood Estmaton (MLE) Ken Kreutz-Delgado (Nuno Vasconcelos) ECE 175A Wnter 01 UCSD Statstcal Learnng Goal: Gven a relatonshp between a feature vector x and a vector y, and d data samples (x,y
More informationCollege of Computer & Information Science Fall 2009 Northeastern University 20 October 2009
College of Computer & Informaton Scence Fall 2009 Northeastern Unversty 20 October 2009 CS7880: Algorthmc Power Tools Scrbe: Jan Wen and Laura Poplawsk Lecture Outlne: Prmal-dual schema Network Desgn:
More informationSTATS 306B: Unsupervised Learning Spring Lecture 10 April 30
STATS 306B: Unsupervsed Learnng Sprng 2014 Lecture 10 Aprl 30 Lecturer: Lester Mackey Scrbe: Joey Arthur, Rakesh Achanta 10.1 Factor Analyss 10.1.1 Recap Recall the factor analyss (FA) model for lnear
More informationPredictive Analytics : QM901.1x Prof U Dinesh Kumar, IIMB. All Rights Reserved, Indian Institute of Management Bangalore
Sesson Outlne Introducton to classfcaton problems and dscrete choce models. Introducton to Logstcs Regresson. Logstc functon and Logt functon. Maxmum Lkelhood Estmator (MLE) for estmaton of LR parameters.
More informationBOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS. M. Krishna Reddy, B. Naveen Kumar and Y. Ramu
BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS M. Krshna Reddy, B. Naveen Kumar and Y. Ramu Department of Statstcs, Osmana Unversty, Hyderabad -500 007, Inda. nanbyrozu@gmal.com, ramu0@gmal.com
More information= z 20 z n. (k 20) + 4 z k = 4
Problem Set #7 solutons 7.2.. (a Fnd the coeffcent of z k n (z + z 5 + z 6 + z 7 + 5, k 20. We use the known seres expanson ( n+l ( z l l z n below: (z + z 5 + z 6 + z 7 + 5 (z 5 ( + z + z 2 + z + 5 5
More informationTracking with Kalman Filter
Trackng wth Kalman Flter Scott T. Acton Vrgna Image and Vdeo Analyss (VIVA), Charles L. Brown Department of Electrcal and Computer Engneerng Department of Bomedcal Engneerng Unversty of Vrgna, Charlottesvlle,
More informationComparison of Regression Lines
STATGRAPHICS Rev. 9/13/2013 Comparson of Regresson Lnes Summary... 1 Data Input... 3 Analyss Summary... 4 Plot of Ftted Model... 6 Condtonal Sums of Squares... 6 Analyss Optons... 7 Forecasts... 8 Confdence
More informationExplaining the Stein Paradox
Explanng the Sten Paradox Kwong Hu Yung 1999/06/10 Abstract Ths report offers several ratonale for the Sten paradox. Sectons 1 and defnes the multvarate normal mean estmaton problem and ntroduces Sten
More informationComputation of Higher Order Moments from Two Multinomial Overdispersion Likelihood Models
Computaton of Hgher Order Moments from Two Multnomal Overdsperson Lkelhood Models BY J. T. NEWCOMER, N. K. NEERCHAL Department of Mathematcs and Statstcs, Unversty of Maryland, Baltmore County, Baltmore,
More informationChapter 7 Channel Capacity and Coding
Chapter 7 Channel Capacty and Codng Contents 7. Channel models and channel capacty 7.. Channel models Bnary symmetrc channel Dscrete memoryless channels Dscrete-nput, contnuous-output channel Waveform
More informationCopyright 2017 by Taylor Enterprises, Inc., All Rights Reserved. Adjusted Control Limits for P Charts. Dr. Wayne A. Taylor
Taylor Enterprses, Inc. Control Lmts for P Charts Copyrght 2017 by Taylor Enterprses, Inc., All Rghts Reserved. Control Lmts for P Charts Dr. Wayne A. Taylor Abstract: P charts are used for count data
More informationBayesian predictive Configural Frequency Analysis
Psychologcal Test and Assessment Modelng, Volume 54, 2012 (3), 285-292 Bayesan predctve Confgural Frequency Analyss Eduardo Gutérrez-Peña 1 Abstract Confgural Frequency Analyss s a method for cell-wse
More informationCS 3710: Visual Recognition Classification and Detection. Adriana Kovashka Department of Computer Science January 13, 2015
CS 3710: Vsual Recognton Classfcaton and Detecton Adrana Kovashka Department of Computer Scence January 13, 2015 Plan for Today Vsual recognton bascs part 2: Classfcaton and detecton Adrana s research
More informationFinding Dense Subgraphs in G(n, 1/2)
Fndng Dense Subgraphs n Gn, 1/ Atsh Das Sarma 1, Amt Deshpande, and Rav Kannan 1 Georga Insttute of Technology,atsh@cc.gatech.edu Mcrosoft Research-Bangalore,amtdesh,annan@mcrosoft.com Abstract. Fndng
More informationNegative Binomial Regression
STATGRAPHICS Rev. 9/16/2013 Negatve Bnomal Regresson Summary... 1 Data Input... 3 Statstcal Model... 3 Analyss Summary... 4 Analyss Optons... 7 Plot of Ftted Model... 8 Observed Versus Predcted... 10 Predctons...
More informationMore metrics on cartesian products
More metrcs on cartesan products If (X, d ) are metrc spaces for 1 n, then n Secton II4 of the lecture notes we defned three metrcs on X whose underlyng topologes are the product topology The purpose of
More informationLow Complexity Soft-Input Soft-Output Hamming Decoder
Low Complexty Soft-Input Soft-Output Hammng Der Benjamn Müller, Martn Holters, Udo Zölzer Helmut Schmdt Unversty Unversty of the Federal Armed Forces Department of Sgnal Processng and Communcatons Holstenhofweg
More informationConsider the following passband digital communication system model. c t. modulator. t r a n s m i t t e r. signal decoder.
PASSBAND DIGITAL MODULATION TECHNIQUES Consder the followng passband dgtal communcaton system model. cos( ω + φ ) c t message source m sgnal encoder s modulator s () t communcaton xt () channel t r a n
More informationSDMML HT MSc Problem Sheet 4
SDMML HT 06 - MSc Problem Sheet 4. The recever operatng characterstc ROC curve plots the senstvty aganst the specfcty of a bnary classfer as the threshold for dscrmnaton s vared. Let the data space be
More informationChapter 11: Simple Linear Regression and Correlation
Chapter 11: Smple Lnear Regresson and Correlaton 11-1 Emprcal Models 11-2 Smple Lnear Regresson 11-3 Propertes of the Least Squares Estmators 11-4 Hypothess Test n Smple Lnear Regresson 11-4.1 Use of t-tests
More informationInstance-Based Learning (a.k.a. memory-based learning) Part I: Nearest Neighbor Classification
Instance-Based earnng (a.k.a. memory-based learnng) Part I: Nearest Neghbor Classfcaton Note to other teachers and users of these sldes. Andrew would be delghted f you found ths source materal useful n
More informationCHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE
CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE Analytcal soluton s usually not possble when exctaton vares arbtrarly wth tme or f the system s nonlnear. Such problems can be solved by numercal tmesteppng
More informationKernel Methods and SVMs Extension
Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general
More informationMATH 829: Introduction to Data Mining and Analysis The EM algorithm (part 2)
1/16 MATH 829: Introducton to Data Mnng and Analyss The EM algorthm (part 2) Domnque Gullot Departments of Mathematcal Scences Unversty of Delaware Aprl 20, 2016 Recall 2/16 We are gven ndependent observatons
More informationLimited Dependent Variables
Lmted Dependent Varables. What f the left-hand sde varable s not a contnuous thng spread from mnus nfnty to plus nfnty? That s, gven a model = f (, β, ε, where a. s bounded below at zero, such as wages
More informationCIS526: Machine Learning Lecture 3 (Sept 16, 2003) Linear Regression. Preparation help: Xiaoying Huang. x 1 θ 1 output... θ M x M
CIS56: achne Learnng Lecture 3 (Sept 6, 003) Preparaton help: Xaoyng Huang Lnear Regresson Lnear regresson can be represented by a functonal form: f(; θ) = θ 0 0 +θ + + θ = θ = 0 ote: 0 s a dummy attrbute
More informationPower Allocation for Distributed BLUE Estimation with Full and Limited Feedback of CSI
Power Allocaton for Dstrbuted BLUE Estmaton wth Full and Lmted Feedback of CSI Mohammad Fanae, Matthew C. Valent, and Natala A. Schmd Lane Department of Computer Scence and Electrcal Engneerng West Vrgna
More informationClustering gene expression data & the EM algorithm
CG, Fall 2011-12 Clusterng gene expresson data & the EM algorthm CG 08 Ron Shamr 1 How Gene Expresson Data Looks Entres of the Raw Data matrx: Rato values Absolute values Row = gene s expresson pattern
More informationStatistics for Economics & Business
Statstcs for Economcs & Busness Smple Lnear Regresson Learnng Objectves In ths chapter, you learn: How to use regresson analyss to predct the value of a dependent varable based on an ndependent varable
More information