A Generalized Information Formula as the Bridge between Shannon and Popper
Chenguang Lu
Independent Researcher, survival99@hotmail.com

Abstract. A generalized information formula related to logical probability and fuzzy sets is deduced from the classical information formula. The new information measure accords closely with Popper's criterion for knowledge evolution. In comparison with the squared-error criterion, the information criterion reflects not only the error of a proposition, but also the particularity of the events described by the proposition: it gives a proposition with smaller logical probability a higher evaluation. The paper introduces how to select one prediction or sentence from many, for forecasting and language translation, according to the generalized information criterion. It also introduces the rate-fidelity theory, which improves the rate-distortion theory of classical information theory by replacing distortion (i.e. the average-error criterion) with the generalized mutual information criterion, for data compression and communication efficiency. Some interesting conclusions about image communication are obtained from the rate-fidelity function. The paper also discusses how to improve Popper's theory.

1 Introduction

Although Shannon's information theory is successful for electrical communication, it does not deal with semantic information [1]. Semantic information measures have been discussed for a long time [2], [3], [4], [5]. However, no formula can be properly used to measure the information of a prediction like "Tomorrow will be rainy" or "The temperature is about 10 °C". Twenty years ago, I set up a symmetrical model of color vision with four pairs of opponent colors, instead of the three pairs in the popular zone model [6]. To prove that the more unique colors we perceive, the higher the discrimination of lights our eyes have, and hence the more information we can obtain, I studied information theory. Information conveyed by color vision is likewise related to the semantics, or meaning, of symbols.
From 1989 to 1993, I developed the generalized information formula [7], [8] and published a monograph [9] on generalized information. Later, I wrote some papers for further discussion [10], [11] and published a monograph on portfolios and information value [12]. This paper focuses on the generalized information criterion for selecting one sentence, or prediction, from several, and on the rate-fidelity theory, an improved version of the classical rate-distortion theory, for data compression and communication efficiency.

2 Deducing the Generalized Information Formula

First we consider the information provided by predictions such as "The growth rate of GDP this year will be 8%". Let X denote the random variable taking values from the set A = {x_1, x_2, ...} of events, such as growth rates or temperatures, and let Y denote the random variable taking values from the set B = {y_1, y_2, ...} of sentences or predictions. For each y_j there is a subset, or fuzzy subset, A_j of A, and y_j = "x ∈ A_j". In classical information theory, the information provided by y_j about x_i is

    I(x_i; y_j) = log [P(x_i|y_j) / P(x_i)].    (1)

¹ See the author's home page for more articles.
Yet in linguistic communication we only know the meaning of a sentence or prediction, instead of the conditional probability P(x_i|y_j). Fortunately, we can deduce the conditional probability P(x_i|A_j) of x_i under the condition x ∈ A_j, which means that y_j = "x ∈ A_j" is true, by the Bayesian formula

    P(x_i|A_j) = Q(A_j|x_i) P(x_i) / Q(A_j),    (2)

where

    Q(A_j) = Σ_i P(x_i) Q(A_j|x_i).    (3)

Replacing P(x_i|y_j) with P(x_i|A_j) in (1), we have the generalized information formula

    I(x_i; y_j) = log [P(x_i|A_j) / P(x_i)] = log [Q(A_j|x_i) / Q(A_j)],    (4)

which is illustrated by Fig. 1. Note that, most importantly, in general P(x_i|A_j) ≠ P(x_i|y_j), because P(x_i|y_j) = P(x_i | "x ∈ A_j" is reported), whereas P(x_i|A_j) = P(x_i | "x ∈ A_j" is true). The sentence y_j may be an incorrect prediction, or a lie; yet the condition x ∈ A_j means that y_j must be correct. If the two probabilities were always equal, the generalized information formula (4) would become the classical information formula (1).

The generalized information formula can measure not only semantic information but also sensory information. Let X denote one of several monochromatic lights, let Y denote the corresponding color perception, let A_j denote the fuzzy subset of A that includes all x confused with x_j, and let Q(A_j|x_i) denote the confusion probability of x_i with x_j. Then a color perception can be regarded as a sentence y_j = "The color x is about x_j". Hence the generalized information formula can also be used to measure the information of a color perception.

Fig. 1. Illustration of the generalized information formula. The more precise the prediction is, the more information it provides; the information may be negative if the prediction is obviously wrong.

We use an example to show the properties of the formula. Assume we need to predict a stock index for the next weekend, and let the current index be x_0 = 100. There are predictions y_j = "The index will be about x_j" and y_k = "The index will be about x_k". Assume the prior knowledge

    P(X) = C exp[-(X - x_0)² / (2 d_0²)], where C is a normalizing constant;
    Q(A_j|X) = exp[-(X - x_j)² / (2 d²)],  Q(A_k|X) = exp[-(X - x_k)² / (2 d²)].
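The stock-index setup above can be sketched numerically. The following Python fragment is a minimal illustration, not code from the paper: the discretized index range, the prior parameters (x_0 = 100, d_0 = 15), and the prediction centers are all assumed values, and natural logarithms are used.

```python
import numpy as np

# Discretized index values and prior P(X): normal around the current index x0 = 100.
X = np.arange(50, 151)
x0, d0 = 100.0, 15.0
P = np.exp(-(X - x0) ** 2 / (2 * d0 ** 2))
P /= P.sum()  # normalization (the constant C in the text)

def generalized_information(x, xj, d):
    """I(x; y_j) = log[Q(A_j|x) / Q(A_j)] for y_j = 'the index will be about xj' (eq. 4)."""
    Q_Aj_given_X = np.exp(-(X - xj) ** 2 / (2 * d ** 2))   # membership of each x in A_j
    Q_Aj = np.sum(P * Q_Aj_given_X)                        # logical probability, eq. (3)
    Q_at_x = np.exp(-(x - xj) ** 2 / (2 * d ** 2))
    return np.log(Q_at_x / Q_Aj)                           # information in nats

print(generalized_information(x=110, xj=110, d=5))   # precise and correct: large positive
print(generalized_information(x=110, xj=110, d=30))  # correct but fuzzy: smaller
print(generalized_information(x=110, xj=70, d=5))    # precise but wrong: negative
```

Consistent with Fig. 1 and Fig. 2, a precise correct prediction conveys the most information, a fuzzy correct one conveys less, and a precise but wrong one is punished with negative information.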
Fig. 2. Information I about the stock index X conveyed by the different predictions y_j and y_k.

Figure 2 shows how the information conveyed by y_j and by y_k changes as X changes. It tells us that the more occasional (unexpected) the event that is correctly predicted, the more information is conveyed. The dashed lines show the case in which d is reduced; the corresponding prediction may be expressed as "The index will be very close to x_j". It can be said that when predictions are correct, the more precise the prediction is, the more information it provides. If a prediction is extremely fuzzy, such as "The index will probably go up or not go up", Q(A_j|X) can be represented by a horizontal line and the information will always be zero.

3 Comparing the Information Criterion with the Squared-Error Criterion

Assume Q(A_j|x_i) is a normal function with maximum 1, i.e.

    Q(A_j|x_i) = exp[-(x_i - x_j)² / (2d²)],    (5)

where d indicates the precision of a prediction or the discrimination of a sense organ: the smaller d is, the higher the precision or discrimination. From (4) and (5), we have

    I(x_i; y_j) = log [1/Q(A_j)] - [(x_i - x_j)² / (2d²)] log e.    (6)

If d and Q(A_j) are 1, or constants, the information criterion becomes the squared-error criterion. In comparison with the squared-error criterion, the information criterion gives a higher evaluation to predictions that are more precise, or that predict more occasional events. If we used the two criteria to evaluate people, the squared-error criterion would say that having no error is good, whereas the information criterion would say that contribution beyond error is good.

Actually, the philosopher K. R. Popper suggested long ago using information as the criterion to evaluate a scientific theory or proposition (see page 50 of [13]), but he did not provide a suitable information formula. The above information measure accords with Popper's theory very well [8]. If Q(A_j|x_i) ≡ 1, then necessarily I(x_i; y_j) = 0. This is just the mathematical description of Popper's affirmation that a proposition which cannot be falsified provides no information and hence is meaningless. The smaller the fuzzy set A_j is, or the more unexpected the events in A_j are, the smaller Q(A_j) is, and hence the bigger I(x_i; y_j) is while Q(A_j|x_i) = 1.
This is just the mathematical description of Popper's affirmation that a proposition with smaller prior logical probability has more important scientific significance if it can pass the tests of facts. For sentences such as "The temperature tomorrow morning will be lower than 10 °C" and "There will be light to moderate rain tomorrow", the error is hard to express because there is no center x_j in the fuzzy set A_j. However, measuring the information is equally easy, given the logical probability function Q(A_j|x) and the probability distribution P(x).
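The contrast between the two criteria can be checked numerically. The sketch below assumes a uniform prior over a hypothetical event range and uses the natural-logarithm form of (6); all parameter values are made up for illustration.

```python
import numpy as np

# Hypothetical setting: events are values 0..100 with a uniform prior P(x).
X = np.arange(0, 101)
P = np.ones_like(X, dtype=float) / len(X)

def info_criterion(x, xj, d):
    """Eq. (6) with natural logarithm: I(x; y_j) = log[1/Q(A_j)] - (x - xj)^2 / (2 d^2)."""
    Q_Aj = np.sum(P * np.exp(-(X - xj) ** 2 / (2 * d ** 2)))  # logical probability, eq. (3)
    return -np.log(Q_Aj) - (x - xj) ** 2 / (2 * d ** 2)

# Two predictions of the same event x = 55 with the same center x_j = 50,
# hence the same squared error 25, but different precision d:
I_fuzzy = info_criterion(x=55, xj=50, d=20)   # large d: big Q(A_j), lower evaluation
I_sharp = info_criterion(x=55, xj=50, d=5)    # small d: small Q(A_j), higher evaluation
print(I_fuzzy, I_sharp)
```

The squared-error criterion cannot distinguish the two predictions; the information criterion rewards the more precise one because its logical probability Q(A_j) is smaller.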
4 The Generalized Kullback Formula for Sentence Selection

For a given event X = x_i, it is easy to select a descriptive sentence y* from many sentences y_1, y_2, ... according to the generalized information criterion: we calculate I(x_i; y_j) for each y_j, and the y_j that maximizes I(x_i; y_j) is the y* we want. However, in general artificial-intelligence systems, for given data or evidence denoted by z, we can only know the probability distribution P(x|z) instead of the exact event x_i. For example, to forecast rainfall, we first obtain P(x|z) from the observed data, and then select a sentence, such as "There will be heavy rain tomorrow", from many candidates as the prediction according to P(x|z). In these cases, we need the generalized Kullback formula (see Fig. 3):

    I(X; y_j) = Σ_i P(x_i|z) log [P(x_i|A_j)/P(x_i)] = Σ_i P(x_i|z) log [Q(A_j|x_i)/Q(A_j)],    (7)

which is the average of I(x_i; y_j) over the different x_i. This formula is called the generalized Kullback formula because it takes the form of the Kullback formula when P(x_i|A_j) = P(x_i|z) for each i. We can prove that I(X; y_j) reaches its maximum when P(x_i|A_j) = P(x_i|z). Now we calculate I(X; y_j) for each y_j; the y_j that maximizes I(X; y_j) is the y* we want.

We can also use the generalized conditional entropy

    H*(X|y_j) = -Σ_i P(x_i|z) log P(x_i|A_j)    (8)

as the criterion to select y*. In practice, however, this calculation is no simpler than the right-hand side of (7), because we need Q(A_j|x_i) and Q(A_j) to calculate P(x_i|A_j).

For language translation, we need to translate a sentence y_j' in one language into a sentence y* in another language. In this case, we replace P(x|z) with P(x|A_j'), where A_j' is a fuzzy subset of A, so that (7) becomes

    I(X; y_j) = Σ_i P(x_i|A_j') log [Q(A_j|x_i)/Q(A_j)].    (9)

Fig. 3. The property of the generalized Kullback formula: the closer the posterior probability P(x|A_j) is to the fact P(x|z), in comparison with the prior probability P(x), the more information about X is conveyed by y_j; otherwise the information is negative.

5 The Generalized Mutual Information Formula

Actually, the probability P(x_i) in (7) may be replaced with the subjectively forecast probability Q(x_i), so that we have

    I(X; y_j) = Σ_i P(x_i|y_j) log [P(x_i|A_j)/Q(x_i)].    (10)
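Before averaging over the different y_j, it may help to see the sentence-selection rule of Section 4 in code. The sketch below applies the generalized Kullback formula (7) to a toy rainfall forecast; the candidate sentences, their membership functions, and the posterior distribution P(x|z) are all hypothetical.

```python
import numpy as np

# Hypothetical rainfall forecast: events are rainfall amounts 0..40 mm,
# with a uniform prior P(x) for simplicity.
X = np.arange(0, 41, dtype=float)
P = np.ones_like(X) / len(X)

# Hypothetical membership functions Q(A_j|x) of three candidate sentences.
candidates = {
    "no rain":    np.exp(-X ** 2 / (2 * 1.0 ** 2)),
    "light rain": np.exp(-(X - 5.0) ** 2 / (2 * 4.0 ** 2)),
    "heavy rain": np.exp(-(X - 30.0) ** 2 / (2 * 6.0 ** 2)),
}

def kullback_information(P_x_given_z, membership):
    """Eq. (7): I(X; y_j) = sum_i P(x_i|z) log[Q(A_j|x_i)/Q(A_j)]."""
    Q_Aj = np.sum(P * membership)            # logical probability of y_j, eq. (3)
    m = np.maximum(membership, 1e-300)       # floor to avoid log(0) for strictly false sentences
    return np.sum(P_x_given_z * np.log(m / Q_Aj))

# Evidence z suggests about 28 mm of rain: P(x|z) concentrated near 28.
P_x_given_z = np.exp(-(X - 28.0) ** 2 / (2 * 3.0 ** 2))
P_x_given_z /= P_x_given_z.sum()

scores = {y: kullback_information(P_x_given_z, m) for y, m in candidates.items()}
y_star = max(scores, key=scores.get)
print(y_star)   # the sentence y* maximizing I(X; y_j)
```

Here "heavy rain" wins because P(x|z) concentrates where its membership is high while its logical probability Q(A_j) remains small; a tautological sentence with membership identically 1 would score exactly zero, as the text requires.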
Averaging I(X; y_j) over the different y_j, we obtain the generalized mutual information formula:

    I(X; Y) = Σ_j P(y_j) I(X; y_j)    (11)
            = Σ_i Σ_j P(x_i, y_j) log [P(x_i|A_j)/Q(x_i)]
            = H(X) - H(X|Y) = H(Y) - H(Y|X),

where

    H(X) = -Σ_i P(x_i) log Q(x_i),    (12)
    H(X|Y) = -Σ_i Σ_j P(x_i, y_j) log P(x_i|A_j),    (13)
    H(Y) = -Σ_j P(y_j) log Q(A_j),    (14)
    H(Y|X) = -Σ_i Σ_j P(x_i, y_j) log Q(A_j|x_i).    (15)

I call H(X) the forecasting entropy: it reflects the average coding length when we economically encode X according to the forecast distribution Q(X) while the real source is P(X), and it reaches its minimum when Q(X) = P(X). I call H(X|Y) the posterior forecasting entropy, H(Y) the generalized entropy, and H(Y|X) the generalized conditional entropy, or fuzzy entropy.

I regard the generalized information as subjective information and Shannon information as objective information. Suppose two weather forecasters always provide opposite forecasts, one always correct and the other always incorrect: they convey the same objective information but different subjective information. If Q(X) = P(X) and P(X|A_j) = P(X|y_j) for each j, which means the subjective forecasts conform to the objective facts, then the subjective mutual information equals the objective, i.e. Shannon's, mutual information.

6 Improving the Rate-Distortion Theory into the Rate-Fidelity Theory

Shannon proposed the rate-distortion function R(D) for data compression in his seminal paper [1]. For a given source X and an upper limit D on the distortion

    d(X, Y) = Σ_i Σ_j P(x_i, y_j) d(x_i, y_j),    (16)

where d(x_i, y_j) is an error measure such as the squared error, we vary the channel P(Y|X) to search for the minimum of Shannon's mutual information I_S(X; Y). This minimum, denoted R = R(D), is the rate-distortion function, which gives the necessary communication rate for the given source X and distortion limit D. Shannon did in fact mention a fidelity criterion for lossy coding; he used the distortion, i.e. the average error, as the criterion for optimizing lossy coding because a fidelity criterion is hard to formulate. However, distortion is not a good criterion in most cases.
For this reason, I replace the error function d_ij = d(x_i, y_j) with the generalized information I_ij = I(x_i; y_j), and the distortion d(X, Y) with the generalized mutual information I(X; Y), as the criterion: for a given source P(X) and a lower limit G on I(X; Y), we search for the minimum of Shannon's mutual information I_S(X; Y). I call this criterion I(X; Y) the fidelity criterion, call the minimum the rate-fidelity function R(G), and call the improved theory the rate-fidelity theory. In a way similar to that of classical information theory [14], we can obtain the expression of the function R(G) with a parameter s:
    G(s) = Σ_i Σ_j λ_i P(x_i) P(y_j) exp(s I_ij) I_ij,    (17)
    R(s) = s G(s) + Σ_i P(x_i) log λ_i,

where s = dR/dG is the slope of the function R(G) (see Fig. 4), and

    λ_i = 1 / Σ_j P(y_j) exp(s I_ij).    (18)

In [12], I defined the information value V as the increment of the growth rate of a portfolio due to information, and suggested using the information value as the criterion to optimize communication, obtaining the rate-value function R(V), which is more meaningful in some cases.

7 Properties of the Rate-Fidelity Function and Image Compression

Now we use the information provided by the different gray levels of image pixels (see [9] for details) as an example to discuss the properties of the rate-fidelity function. The conclusions are also meaningful for linguistic communication.

Fig. 4. Relationship between d and R(G) for b = 63.

Let the gray level of a digitized pixel be the source, with gray levels x_i = i, i = 0, 1, ..., b = 2^k - 1, following a normal probability distribution with expectation b/2 and standard deviation b/8. Assume that after decoding the pixel also has gray levels y_j = j, j = 0, 1, ..., b; that the perception caused by y_j is also denoted by y_j; and that the discrimination function, or confusion probability function, of x_j is

    Q(A_j|X) = exp[-(X - j)² / (2d²)],    (19)

where d is the discrimination parameter: the smaller d is, the higher the discrimination.

Fig. 4 tells us that:

1. Higher discrimination gives us more information when the objective information R is big enough; yet lower discrimination is better when the objective information R is small. This conclusion is supported by the fact that it is better to watch a TV set with fewer pixels, or with much snowflake-like disturbance, from a greater distance.

2. When R = 0, G < 0, which means that if a coded image has nothing to do with the original image and yet we still believe it reflects the original, the information is negative. For linguistic communication, this means that one who believes a fortuneteller's talk becomes more ignorant of the facts, and the information he has is reduced.

3. When G reaches its negative minimum, R > 0, which means that a certain amount of objective information is necessary if one wants to use lies to deceive an enemy to some extent; in other words, lies fabricated against the facts are more terrible than lies based on nothing.
4. Each curve of the function R(G) is tangent to the line R = G, which means there is a matching point at which the objective information equals the subjective information; the higher the discrimination (the smaller d), the bigger the matching information amount. For linguistic communication, this means that to improve communication efficiency, it is necessary to make the objective information accord with the subjective understanding.
5. The slope of R(G) becomes bigger and bigger as G increases, which tells us that, for a given discrimination, the subjective information that can be increased is limited, and too much objective information is wasteful.

Fig. 5. Relationship between the matching value of R and G, the discrimination parameter d, and the number of digitization bits k.

Fig. 5 tells us that, for a given discrimination, there exists an optimal number of digitization bits k' such that the matching value of G and R reaches its maximum. If k < k', the matching information increases with k; if k > k', the matching information no longer increases with k. This means that too high a resolution of images is unnecessary, or uneconomical, for a given visual discrimination.

8 Improvement of Popper's Theory

Popper and his successors tell us that the reliability of a scientific proposition comes from repeated tests by facts. What, then, is the difference between these repeated tests and the verification emphasized by logical positivism? We now distinguish the prior logical probability and the posterior logical probability of a proposition. For the prior logical probability Q(A_j), the smaller the better; yet for the posterior logical probability Q(A_j|x_i), the bigger the better. So both falsification and verification are necessary.

There are many probabilistic and fuzzy propositions, such as "High humidity will bring rain" and "Thirty years old is young". How do we falsify or evaluate these propositions? Can we use a counterexample to falsify such a proposition? In these cases, the above information formula can give these propositions appropriate evaluations.

9 Conclusions

This paper provides the generalized information criterion, which accords with Popper's criterion of scientific advance, for sentence selection and data compression. Its rationality is supported by the evaluation of predictions and by many properties of the rate-fidelity function.

References

1. Shannon, C. E.: A Mathematical Theory of Communication. Bell System Technical Journal, 27 (1948) 379-423, 623-656
2. Weaver, W.: Recent Contributions to the Mathematical Theory of Communication. In: Shannon, C. E., Weaver, W.: The Mathematical Theory of Communication. University of Illinois Press, Urbana (1949)
3. Bar-Hillel, Y., Carnap, R.: An Outline of a Theory of Semantic Information. Tech. Rep. No. 247, Research Lab. of Electronics, MIT (1952)
4. Klir, G. J., Wierman, M. J.: Uncertainty-Based Information: Elements of Generalized Information Theory, 2nd edn. Physica-Verlag/Springer-Verlag, Heidelberg and New York
5. Zhong, Y.: Principle of Information Science (in Chinese). Beijing: Beijing University of Posts and Telecommunications Press
6. Lu, C.: Decoding Model of Color Vision and Verifications. Acta Optica Sinica (in Chinese), 9(2)
7. Lu, C.: Reform of Shannon's Formulas (in Chinese). J. of China Institute of Communication, 12
8. Lu, C.: Coherence between the Generalized Mutual Information Formula and Popper's Theory of Scientific Evolution (in Chinese). J. of Changsha University
9. Lu, C.: A Generalized Information Theory. China Science and Technology University Press (1993)
10. Lu, C.: Meanings of Generalized Entropy and Generalized Mutual Information for Coding (in Chinese). J. of China Institute of Communication, 15(6)
11. Lu, C.: A Generalization of Shannon's Information Theory. Int. J. of General Systems, 28(6) (1999)
12. Lu, C.: Entropy Theory of Portfolio and Information Value. China Science and Technology University Press
13. Popper, K. R.: Conjectures and Refutations: The Growth of Scientific Knowledge. Routledge, London and New York (1963)
14. Kullback, S.: Information Theory and Statistics. John Wiley & Sons, New York (1959)
15. Berger, T.: Rate Distortion Theory. Prentice-Hall, Englewood Cliffs, NJ (1971)
Australan Journal of Basc and Appled Scences 4(10): 4601-4608 010 ISSN 1991-8178 Color Renderng Uncertanty 1 A.el Bally M.M. El-Ganany 3 A. Al-amel 1 Physcs Department Photometry department- NIS Abstract:
More informationTurbulence classification of load data by the frequency and severity of wind gusts. Oscar Moñux, DEWI GmbH Kevin Bleibler, DEWI GmbH
Turbulence classfcaton of load data by the frequency and severty of wnd gusts Introducton Oscar Moñux, DEWI GmbH Kevn Blebler, DEWI GmbH Durng the wnd turbne developng process, one of the most mportant
More information4DVAR, according to the name, is a four-dimensional variational method.
4D-Varatonal Data Assmlaton (4D-Var) 4DVAR, accordng to the name, s a four-dmensonal varatonal method. 4D-Var s actually a drect generalzaton of 3D-Var to handle observatons that are dstrbuted n tme. The
More informationSTAT 3008 Applied Regression Analysis
STAT 3008 Appled Regresson Analyss Tutoral : Smple Lnear Regresson LAI Chun He Department of Statstcs, The Chnese Unversty of Hong Kong 1 Model Assumpton To quantfy the relatonshp between two factors,
More information1. Inference on Regression Parameters a. Finding Mean, s.d and covariance amongst estimates. 2. Confidence Intervals and Working Hotelling Bands
Content. Inference on Regresson Parameters a. Fndng Mean, s.d and covarance amongst estmates.. Confdence Intervals and Workng Hotellng Bands 3. Cochran s Theorem 4. General Lnear Testng 5. Measures of
More informationA Network Intrusion Detection Method Based on Improved K-means Algorithm
Advanced Scence and Technology Letters, pp.429-433 http://dx.do.org/10.14257/astl.2014.53.89 A Network Intruson Detecton Method Based on Improved K-means Algorthm Meng Gao 1,1, Nhong Wang 1, 1 Informaton
More informationThe lower and upper bounds on Perron root of nonnegative irreducible matrices
Journal of Computatonal Appled Mathematcs 217 (2008) 259 267 wwwelsevercom/locate/cam The lower upper bounds on Perron root of nonnegatve rreducble matrces Guang-Xn Huang a,, Feng Yn b,keguo a a College
More informationLecture 3: Shannon s Theorem
CSE 533: Error-Correctng Codes (Autumn 006 Lecture 3: Shannon s Theorem October 9, 006 Lecturer: Venkatesan Guruswam Scrbe: Wdad Machmouch 1 Communcaton Model The communcaton model we are usng conssts
More informationThe Exact Formulation of the Inverse of the Tridiagonal Matrix for Solving the 1D Poisson Equation with the Finite Difference Method
Journal of Electromagnetc Analyss and Applcatons, 04, 6, 0-08 Publshed Onlne September 04 n ScRes. http://www.scrp.org/journal/jemaa http://dx.do.org/0.46/jemaa.04.6000 The Exact Formulaton of the Inverse
More informationLINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity
LINEAR REGRESSION ANALYSIS MODULE IX Lecture - 30 Multcollnearty Dr. Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur 2 Remedes for multcollnearty Varous technques have
More information18. SIMPLE LINEAR REGRESSION III
8. SIMPLE LINEAR REGRESSION III US Domestc Beers: Calores vs. % Alcohol Ftted Values and Resduals To each observed x, there corresponds a y-value on the ftted lne, y ˆ ˆ = α + x. The are called ftted values.
More informationChapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems
Numercal Analyss by Dr. Anta Pal Assstant Professor Department of Mathematcs Natonal Insttute of Technology Durgapur Durgapur-713209 emal: anta.bue@gmal.com 1 . Chapter 5 Soluton of System of Lnear Equatons
More informationCollege of Computer & Information Science Fall 2009 Northeastern University 20 October 2009
College of Computer & Informaton Scence Fall 2009 Northeastern Unversty 20 October 2009 CS7880: Algorthmc Power Tools Scrbe: Jan Wen and Laura Poplawsk Lecture Outlne: Prmal-dual schema Network Desgn:
More informationLecture 3: Probability Distributions
Lecture 3: Probablty Dstrbutons Random Varables Let us begn by defnng a sample space as a set of outcomes from an experment. We denote ths by S. A random varable s a functon whch maps outcomes nto the
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 12 10/21/2013. Martingale Concentration Inequalities and Applications
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.65/15.070J Fall 013 Lecture 1 10/1/013 Martngale Concentraton Inequaltes and Applcatons Content. 1. Exponental concentraton for martngales wth bounded ncrements.
More informationCorrelation and Regression. Correlation 9.1. Correlation. Chapter 9
Chapter 9 Correlaton and Regresson 9. Correlaton Correlaton A correlaton s a relatonshp between two varables. The data can be represented b the ordered pars (, ) where s the ndependent (or eplanator) varable,
More informationSingular Value Decomposition: Theory and Applications
Sngular Value Decomposton: Theory and Applcatons Danel Khashab Sprng 2015 Last Update: March 2, 2015 1 Introducton A = UDV where columns of U and V are orthonormal and matrx D s dagonal wth postve real
More informationIntroduction to information theory and data compression
Introducton to nformaton theory and data compresson Adel Magra, Emma Gouné, Irène Woo March 8, 207 Ths s the augmented transcrpt of a lecture gven by Luc Devroye on March 9th 207 for a Data Structures
More informationFinding Dense Subgraphs in G(n, 1/2)
Fndng Dense Subgraphs n Gn, 1/ Atsh Das Sarma 1, Amt Deshpande, and Rav Kannan 1 Georga Insttute of Technology,atsh@cc.gatech.edu Mcrosoft Research-Bangalore,amtdesh,annan@mcrosoft.com Abstract. Fndng
More informationDPCM Compression for Real-Time Logging While Drilling Data
28 JOURAL OF SOFTWARE, VOL. 5, O. 3, MARCH 21 DPCM Compresson for Real-Tme Loggng Whle Drllng Data Yu Zhang Modern Sgnal Processng & Communcaton Group, Insttute of Informaton Scence, Bejng Jaotong Unversty,
More informationAPPENDIX 2 FITTING A STRAIGHT LINE TO OBSERVATIONS
Unversty of Oulu Student Laboratory n Physcs Laboratory Exercses n Physcs 1 1 APPEDIX FITTIG A STRAIGHT LIE TO OBSERVATIOS In the physcal measurements we often make a seres of measurements of the dependent
More informationChapter 11: Simple Linear Regression and Correlation
Chapter 11: Smple Lnear Regresson and Correlaton 11-1 Emprcal Models 11-2 Smple Lnear Regresson 11-3 Propertes of the Least Squares Estmators 11-4 Hypothess Test n Smple Lnear Regresson 11-4.1 Use of t-tests
More informationWeek 5: Neural Networks
Week 5: Neural Networks Instructor: Sergey Levne Neural Networks Summary In the prevous lecture, we saw how we can construct neural networks by extendng logstc regresson. Neural networks consst of multple
More informationA PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS
HCMC Unversty of Pedagogy Thong Nguyen Huu et al. A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS Thong Nguyen Huu and Hao Tran Van Department of mathematcs-nformaton,
More informationCSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography
CSc 6974 and ECSE 6966 Math. Tech. for Vson, Graphcs and Robotcs Lecture 21, Aprl 17, 2006 Estmatng A Plane Homography Overvew We contnue wth a dscusson of the major ssues, usng estmaton of plane projectve
More informationChapter 3 Describing Data Using Numerical Measures
Chapter 3 Student Lecture Notes 3-1 Chapter 3 Descrbng Data Usng Numercal Measures Fall 2006 Fundamentals of Busness Statstcs 1 Chapter Goals To establsh the usefulness of summary measures of data. The
More informationGeneralized Linear Methods
Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set
More informationUsing Immune Genetic Algorithm to Optimize BP Neural Network and Its Application Peng-fei LIU1,Qun-tai SHEN1 and Jun ZHI2,*
Advances n Computer Scence Research (ACRS), volume 54 Internatonal Conference on Computer Networks and Communcaton Technology (CNCT206) Usng Immune Genetc Algorthm to Optmze BP Neural Network and Its Applcaton
More informationEconomics 130. Lecture 4 Simple Linear Regression Continued
Economcs 130 Lecture 4 Contnued Readngs for Week 4 Text, Chapter and 3. We contnue wth addressng our second ssue + add n how we evaluate these relatonshps: Where do we get data to do ths analyss? How do
More informationThe Minimum Universal Cost Flow in an Infeasible Flow Network
Journal of Scences, Islamc Republc of Iran 17(2): 175-180 (2006) Unversty of Tehran, ISSN 1016-1104 http://jscencesutacr The Mnmum Unversal Cost Flow n an Infeasble Flow Network H Saleh Fathabad * M Bagheran
More informationEntropy Coding. A complete entropy codec, which is an encoder/decoder. pair, consists of the process of encoding or
Sgnal Compresson Sgnal Compresson Entropy Codng Entropy codng s also known as zero-error codng, data compresson or lossless compresson. Entropy codng s wdely used n vrtually all popular nternatonal multmeda
More informationGravitational Acceleration: A case of constant acceleration (approx. 2 hr.) (6/7/11)
Gravtatonal Acceleraton: A case of constant acceleraton (approx. hr.) (6/7/11) Introducton The gravtatonal force s one of the fundamental forces of nature. Under the nfluence of ths force all objects havng
More informationStatistics for Managers Using Microsoft Excel/SPSS Chapter 14 Multiple Regression Models
Statstcs for Managers Usng Mcrosoft Excel/SPSS Chapter 14 Multple Regresson Models 1999 Prentce-Hall, Inc. Chap. 14-1 Chapter Topcs The Multple Regresson Model Contrbuton of Indvdual Independent Varables
More informationFREQUENCY DISTRIBUTIONS Page 1 of The idea of a frequency distribution for sets of observations will be introduced,
FREQUENCY DISTRIBUTIONS Page 1 of 6 I. Introducton 1. The dea of a frequency dstrbuton for sets of observatons wll be ntroduced, together wth some of the mechancs for constructng dstrbutons of data. Then
More informationErrors in Nobel Prize for Physics (7) Improper Schrodinger Equation and Dirac Equation
Errors n Nobel Prze for Physcs (7) Improper Schrodnger Equaton and Drac Equaton u Yuhua (CNOOC Research Insttute, E-mal:fuyh945@sna.com) Abstract: One of the reasons for 933 Nobel Prze for physcs s for
More informationChapter - 2. Distribution System Power Flow Analysis
Chapter - 2 Dstrbuton System Power Flow Analyss CHAPTER - 2 Radal Dstrbuton System Load Flow 2.1 Introducton Load flow s an mportant tool [66] for analyzng electrcal power system network performance. Load
More informationPredictive Analytics : QM901.1x Prof U Dinesh Kumar, IIMB. All Rights Reserved, Indian Institute of Management Bangalore
Sesson Outlne Introducton to classfcaton problems and dscrete choce models. Introducton to Logstcs Regresson. Logstc functon and Logt functon. Maxmum Lkelhood Estmator (MLE) for estmaton of LR parameters.
More informationMAXIMUM A POSTERIORI TRANSDUCTION
MAXIMUM A POSTERIORI TRANSDUCTION LI-WEI WANG, JU-FU FENG School of Mathematcal Scences, Peng Unversty, Bejng, 0087, Chna Center for Informaton Scences, Peng Unversty, Bejng, 0087, Chna E-MIAL: {wanglw,
More informationNUMERICAL RESULTS QUALITY IN DEPENDENCE ON ABAQUS PLANE STRESS ELEMENTS TYPE IN BIG DISPLACEMENTS COMPRESSION TEST
Appled Computer Scence, vol. 13, no. 4, pp. 56 64 do: 10.23743/acs-2017-29 Submtted: 2017-10-30 Revsed: 2017-11-15 Accepted: 2017-12-06 Abaqus Fnte Elements, Plane Stress, Orthotropc Materal Bartosz KAWECKI
More informationGlobal Sensitivity. Tuesday 20 th February, 2018
Global Senstvty Tuesday 2 th February, 28 ) Local Senstvty Most senstvty analyses [] are based on local estmates of senstvty, typcally by expandng the response n a Taylor seres about some specfc values
More informationA Particle Filter Algorithm based on Mixing of Prior probability density and UKF as Generate Importance Function
Advanced Scence and Technology Letters, pp.83-87 http://dx.do.org/10.14257/astl.2014.53.20 A Partcle Flter Algorthm based on Mxng of Pror probablty densty and UKF as Generate Importance Functon Lu Lu 1,1,
More information