SIMULTANEOUS DEMPSTER-SHAFER CLUSTERING AND GRADUAL DETERMINATION OF NUMBER OF CLUSTERS USING A NEURAL NETWORK STRUCTURE.

Johan Schubert
Department of Information System Technology
Division of Command and Control Warfare Technology
Defence Research Establishment
SE-172 90 Stockholm, Sweden
schubert@sto.foa.se

[In Proc. 1999 Information, Decision and Control (IDC'99), Adelaide, February 1999, IEEE, Piscataway, NJ, 1999.]

ABSTRACT

In this paper we extend an earlier result within Dempster-Shafer theory [Fast Dempster-Shafer Clustering Using a Neural Network Structure, in Proc. Seventh Int. Conf. Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU'98)] where several pieces of evidence were clustered into a fixed number of clusters using a neural structure. This was done by minimizing a metaconflict function. We now develop a method for simultaneous clustering and determination of the number of clusters during iteration in the neural structure. We let the output signals of neurons represent the degree to which the pieces of evidence belong to a corresponding cluster. From these we derive a probability distribution regarding the number of clusters, which gradually, during the iteration, is transformed into a determination of the number of clusters. This gradual determination is fed back into the neural structure at each iteration to influence the clustering process.

1. INTRODUCTION

In this paper we develop a neural network structure for simultaneous clustering of evidence within Dempster-Shafer theory [1] and gradual determination of the number of clusters. The clustering is done by minimizing a metaconflict function. The studied problem concerns the situation when we are reasoning with multiple events which should be handled independently. We use the clustering process to separate the evidence into clusters that will be handled separately.

In an earlier paper [2] we developed a method based on clustering with a neural network structure into a fixed number of clusters. We used the structure of the neural network, but no learning was done to set the weights of the network. Instead, all the weights were set directly by a method using the conflict in Dempster's rule as input. This clustering approach was a great improvement in computational complexity compared to a previous method based on iterative optimization [3-7], although its clustering performance was not equally good. In order to improve on clustering performance a hybrid of the two methods has also been developed [8]. Here, the idea of gradual determination of the number of clusters is developed and integrated with the neural structure to run simultaneously with the clustering process.

In [5] a methodology was developed for finding a posterior probability distribution concerned with the number of clusters. This was based on the final clustering result of an iterative optimization of the metaconflict function and partial specifications of nonspecific pieces of evidence, uncertain with respect to which events they were referring to [4]. Here, this approach is expanded to include a gradual determination of the number of clusters. Instead of using the final clustering result and partial specifications, we use incremental clustering states during the iteration in the neural network, where we let an output signal of a neuron represent the degree to which a piece of evidence belongs to the corresponding cluster. This yields a probability distribution for the number of clusters. By using a change in entropy of the output signals in the neural structure during iteration, we can gradually transform the probability distribution into a determination of the number of clusters. The result of this gradual determination is fed back into the neural network during the iteration to influence the clustering process.
In the early stages of the iteration the number of clusters varies slowly with the change in the gradual determination of the number of clusters and any changes in conflict in the clusters. In the latter stages, the iteration converges on a fixed number of clusters. The idea to use a neural network for optimization was inspired by an approximate solution to the traveling salesman problem by Hopfield and Tank [9]. The clustering methodology developed over several papers was initially intended for preprocessing of intelligence information for situation analysis in antisubmarine warfare [10-12].

In section 2 we describe the problem at hand, and in section 3 we continue with the neural structure for simultaneous clustering and gradual determination of the number of clusters. First, we describe the approach to clustering with a neural structure. Secondly, we focus on the gradual determination of the number of clusters and how this is integrated into the neural structure to run simultaneously with the clustering process. The computational and clustering performance is investigated in section 4. Finally, in section 5, we draw conclusions.

2. THE PROBLEM

If we receive several pieces of evidence about different and separate events and the pieces of evidence are mixed up, we want to arrange them according to which event they are referring to. Thus, we partition the set of all pieces of evidence χ into subsets where each subset refers to a particular event.

Fig. 1. Thirteen pieces of evidence e_1, ..., e_13 are partitioned into four subsets χ_1, ..., χ_4 ("Four subsets OK?"); the conflict in each subset of the partition becomes a piece of evidence at the metalevel.

In figure 1 these subsets are denoted by χ_i, and the conflict when all pieces of evidence in χ_i are combined by Dempster's rule is denoted by c_i. Here, thirteen pieces of evidence are partitioned into four subsets. When the number of subsets is uncertain there will also be a domain conflict c_0, which is a conflict between the current hypothesis about the number of subsets and our prior belief. The partition is then simply an allocation of all the pieces of evidence to the different events. Since these events do not have anything to do with each other, we will analyze them separately.

Now, if it is uncertain to which event some piece of evidence is referring, we have a problem. It could then be impossible to know directly if two different pieces of evidence are referring to the same event. We do not know if we should put them into the same subset or not. This problem is then a problem of organization. Evidence from different events that we want to analyze is unfortunately mixed up and we are facing a problem in separating it.

To solve this problem, we can use the conflict in Dempster's rule, when all pieces of evidence within a subset are combined, as an indication of whether these pieces of evidence belong together. The higher this conflict is, the less credible that they do. Let us create an additional piece of evidence for each subset with the proposition that this is not an "adequate partition". We have a simple frame of discernment on the metalevel Θ = {AdP, ¬AdP}, where AdP is short for "adequate partition". Let the proposition take a value equal to the conflict of the combination within the subset,

    m_χi(¬AdP) = Conf({e_j | e_j ∈ χ_i}).

These new pieces of evidence, one regarding each subset, reason about the partition of the original evidence. Just so we do not confuse them with the original evidence, let us call this evidence "metalevel evidence" and let us say that its combination, and the analysis of that combination, take place on the "metalevel", figure 1.

We establish [3] a criterion function of overall conflict called the metaconflict function for reasoning with multiple events. The metaconflict is derived as the plausibility that the partitioning is correct when the conflict in each subset is viewed as a piece of metalevel evidence against the partitioning of the set of evidence, χ, into the subsets, χ_i.

DEFINITION. Let the metaconflict function,

    Mcf(r, e_1, e_2, ..., e_n) = 1 - (1 - c_0) ∏_{i=1}^{r} (1 - c_i),

be the conflict against a partitioning of n pieces of evidence of the set χ into r disjoint subsets χ_i. Here, c_i is the conflict in subset i and c_0 is the conflict between r subsets and propositions about possible different numbers of subsets.

We will use the minimizing of the metaconflict function as the method of partitioning the evidence into subsets representing the events. This method will also handle the situation when the number of events is uncertain.

3. NEURAL STRUCTURE

We will study a test problem where a collection of pieces of evidence, all simple support functions with focal elements from 2^Θ, is clustered into an unknown number of clusters, where Θ = {1, 2, 3, 4, 5}. Since the focal elements are all the different subsets of the frame, there is always a partition into five clusters with a global minimum of the metaconflict function equal to zero: if you put all evidence with the element 1 into χ_1, of the remaining elements all those with the element 2 into χ_2, etc., you get zero conflict in every cluster. The reason we choose a problem where the minimum metaconflict is zero is that it makes a good test example for evaluating performance.
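To make the metaconflict function concrete, the following is a minimal Python sketch (an illustration, not part of the paper; the representation of evidence as (focal set, mass) pairs and all function names are my assumptions). It combines the simple support functions of one subset with Dempster's rule, reads off the conflict, and evaluates Mcf:

```python
THETA = frozenset({1, 2, 3, 4, 5})   # the frame of the test problem in section 3

def combine(bba1, bba2):
    """Dempster's rule for bbas given as {frozenset: mass}; returns (bba, conflict k).
    Assumes the combination is not totally conflicting (k < 1)."""
    raw = {}
    for A, mA in bba1.items():
        for B, mB in bba2.items():
            raw[A & B] = raw.get(A & B, 0.0) + mA * mB
    k = raw.pop(frozenset(), 0.0)                     # mass on the empty set = conflict
    return {A: m / (1.0 - k) for A, m in raw.items()}, k

def cluster_conflict(pieces):
    """Conf({e_j | e_j in chi_i}): conflict of combining all simple support functions
    (focal, mass) in one cluster; focal is assumed a nonempty proper subset of THETA.
    The overall discarded mass equals 1 - prod(1 - k_step) over stepwise combination."""
    bba, one_minus_k = {THETA: 1.0}, 1.0
    for focal, s in pieces:
        bba, k = combine(bba, {focal: s, THETA: 1.0 - s})
        one_minus_k *= 1.0 - k
    return 1.0 - one_minus_k

def metaconflict(partition, c0=0.0):
    """Mcf(r, e_1, ..., e_n) = 1 - (1 - c0) * prod_i (1 - c_i)."""
    prod = 1.0 - c0
    for cluster in partition:
        prod *= 1.0 - cluster_conflict(cluster)
    return 1.0 - prod

# Two pieces with disjoint focal sets conflict by the product of their masses:
# cluster_conflict([(frozenset({1}), 0.8), (frozenset({2}), 0.5)]) == 0.4
```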
When minimizing the metaconflict function using a neural structure, we will choose an architecture that minimizes a sum. Thus, we have to make some change to the function that we want to minimize. If we take the logarithm of one minus the metaconflict function, we can change from minimizing Mcf to minimizing a sum. Let us change the minimization as follows:

    min Mcf = min [1 - ∏_i (1 - c_i)],
    max (1 - Mcf) = max ∏_i (1 - c_i),
    max log(1 - Mcf) = max log ∏_i (1 - c_i) = max Σ_i log(1 - c_i) = min Σ_i [-log(1 - c_i)],

where -log(1 - c_i) ∈ [0, ∞) is a weight. Since the minimum of Mcf is obtained when the final sum is minimal, the minimization of the final sum yields the same result as a minimization of Mcf would have. Thus, in the neural network we will not let the weights be directly dependent on the conflicts between different pieces of evidence, but rather on -log(1 - c_jk), where c_jk is the conflict between the jth and kth piece of evidence:

    c_jk = m_j · m_k,  if e_j and e_k are in conflict,
    c_jk = 0,          if there is no conflict.

This is a slight simplification, since the neural structure will now minimize a sum of -log(1 - c_jk).
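As a hedged sketch of this weighting (illustrative names and the numeric check are my own, not from the paper):

```python
import math

def pairwise_conflict(e_j, e_k):
    """c_jk for two simple support functions (focal set, mass): the product of the
    masses if the focal sets are disjoint (full conflict), otherwise 0."""
    (A, m_j), (B, m_k) = e_j, e_k
    return m_j * m_k if not (A & B) else 0.0

def conflict_weight(e_j, e_k):
    """The additive weight -log(1 - c_jk) used in place of the conflict itself."""
    return -math.log(1.0 - pairwise_conflict(e_j, e_k))

# Sanity check of the transformation: recovering 1 - prod(1 - c_i) from the sum of
# weights, so minimizing the weighted sum is equivalent to minimizing Mcf.
cs = [0.1, 0.3, 0.6]
assert abs((1.0 - math.exp(-sum(-math.log(1.0 - c) for c in cs)))
           - (1.0 - math.prod(1.0 - c for c in cs))) < 1e-12
```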

Fig. 2. Neural network. Each column (1st to 6th) corresponds to a cluster and each row (1st to nth) corresponds to a piece of evidence.

Fig. 3. Sixteen different states (iterations) of the neural network, one snapshot at every fourth iteration from the 1st to the 61st. From left to right: the convergence of clustering the pieces of evidence into an unknown number of clusters. In each snapshot, each of the six columns represents one possible cluster and each of the rows represents one piece of evidence. The linear dimension of each square is proportional to the output voltage of the neuron and represents the degree to which a piece of evidence belongs to a cluster. In the final state each row has one output voltage of 1 and five output voltages of 0. A piece of evidence, represented by a row, is then clustered into the cluster where the output voltage is 1.

Let us study the calculations taking place in the neural network during an iteration. We use the same terminology as Hopfield and Tank [9], with input voltages as the weighted sum of input signals to a neuron, output voltages as the output signal from a neuron, and inhibition terms as negative weights. For each neuron n_mn we calculate an input voltage u as the weighted sum of all signals from row m and column n and from a domain term, figure 2. This sum is the previous input voltage of n_mn plus a gain factor η times a sum of five terms. The first term is the sum of output voltages V_in of all neurons of the same column as n_mn, weighted by a data-term inhibition dt times the weight of conflict -log(1 - c_im), plus a global inhibition g. The second term is the sum of output voltages V_mj of all neurons of the same row as n_mn, weighted by a row inhibition r and the global inhibition. The third term is the gradual determination of the number of clusters: for a neuron n_mn in column n we have the gradual determination factor gd_t(|χ| = n), weighted by a domain inhibition Dt plus the global inhibition. The two last terms are an excitation bias eb minus the previous input voltage of n_mn. Thus, the new input voltage to n_mn at iteration t + 1 is

    u_mn^{t+1} = u_mn^t + η { Σ_{i≠m} [dt · (-log(1 - c_im)) + g] V_in + Σ_{j≠n} [r + g] V_mj + [Dt + g] gd_t(|χ| = n) + eb - u_mn^t },

where gd_t(|χ| = n) ∈ [0, 1], see section 3.1. We have used fixed settings of the parameters η, dt, r, Dt and g, and the excitation bias eb = 8. From the new input voltage to n_mn we can calculate a new output voltage of n_mn,

    V_mn^{t+1} = (1/2) [1 + tanh(u_mn^{t+1} / u_0)],

where tanh is the hyperbolic tangent, u_0 is a constant, and V_mn^{t+1} ∈ [0, 1]. Initially, before the iteration begins, each neuron is initiated with an input voltage of u_00 + noise, where

    u_00 = u_0 · atanh(2/n - 1)

and atanh is the hyperbolic arc tangent. The initial input voltage is set at u_00 + δu, where δu, the noise, is a random number chosen uniformly in the interval -0.1 u_00 ≤ δu ≤ 0.1 u_00.

In each iteration all new voltages are calculated from the results of the previous iteration. This continues until convergence is reached. As long as the weights of the neural network are symmetric, convergence is always guaranteed. In figure 3 the convergence of a neural network with six columns, and one row per piece of evidence, clustering the pieces of evidence into an unknown number of clusters is shown. After convergence is achieved, the conflict within each cluster is calculated by combining those pieces of evidence for which the output voltage for the column is 1.
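A sketch of one synchronous update, under stated assumptions: the parameter values below (η, dt, r, Dt, g, u_0) are placeholders of my own, not the paper's settings (only eb = 8 is from the text), and the function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def iterate(u, V, W, gd, eta=0.1, dt=-1.0, row_inh=-2.0, Dt=1.0, g=-0.5,
            eb=8.0, u0=0.02):
    """One synchronous update of input voltages u and output voltages V (both M x N).
    W[i, m] = -log(1 - c_im) is the conflict weight between evidence i and m
    (zero diagonal assumed); gd[n] approximates gd_t(|chi| = n + 1)."""
    col = dt * (W @ V) + g * (V.sum(axis=0, keepdims=True) - V)   # same-column term
    row = (row_inh + g) * (V.sum(axis=1, keepdims=True) - V)      # same-row term
    dom = (Dt + g) * gd[np.newaxis, :]                            # domain feedback
    u_new = u + eta * (col + row + dom + eb - u)
    V_new = 0.5 * (1.0 + np.tanh(u_new / u0))
    return u_new, V_new

def initial_voltages(M, N, u0=0.02):
    """u_00 = u_0 * atanh(2/N - 1) plus uniform noise of magnitude 0.1 |u_00|,
    following the initialization described above."""
    u00 = u0 * np.arctanh(2.0 / N - 1.0)
    u = u00 + rng.uniform(-0.1, 0.1, size=(M, N)) * abs(u00)
    return u, 0.5 * (1.0 + np.tanh(u / u0))
```

With W built from the pairwise conflict weights of this section and gd from section 3.1, repeated calls of iterate drive each row toward one output voltage of 1 and five of 0.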
We now have a conflict for each subset and can calculate the overall metaconflict, Mcf, by the previous formula.

Fig. 4. m_χn(∃χn) for all χn over the iterations.

3.1. Gradual determination of number of clusters

Making a gradual determination of the number of clusters is a process with several steps. First, we use the idea that each piece of evidence in a subset supports the existence of that subset to the degree that that piece of evidence supports anything at all [5]. During an iteration it is uncertain to which cluster a piece of evidence belongs. Therefore, we will use every piece of evidence in each cluster, but weighted by its output voltage for the cluster. All these weighted pieces of evidence in each cluster χ_n are then combined. The degree to which the result from this combination supports anything at all other than the entire frame is the degree to which these pieces of evidence taken together support the existence of χ_n. Thus, we have

    m_χn^{t+1}(∃χn) = 1 - (1 - k)^{-1} ∏_m (1 - V_mn^{t+1} + V_mn^{t+1} m_m(Θ)),
    m_χn^{t+1}(Θ) = (1 - k)^{-1} ∏_m (1 - V_mn^{t+1} + V_mn^{t+1} m_m(Θ)),

where k is the conflict in Dempster's rule of the combination. Should nothing be supported by the evidence in some cluster, that evidence is meaningless and can be thrown away, and this particular cluster is not needed.

In figure 4, m_χn(∃χn) is calculated for all six possible clusters over the iterations. This is the same problem as in figure 3. We notice how the basic probability of the 6th cluster drops off very fast during the first few iterations and remains low for the remainder of the iterations. Also the basic probability of the 5th cluster drops off initially, but comes back again during the iteration process to finish high.

These six pieces of evidence m_χn, one regarding each subset, are then combined. We have

    m_χ(∧_{χi ∈ X} ∃χi) = ∏_{χi ∈ X} m_χi(∃χi) · ∏_{χj ∈ χ \ X} m_χj(Θ),
    m_χ(Θ) = ∏_{i=1}^{n} m_χi(Θ),

where X ⊆ χ and χ = {χ_1, χ_2, χ_3, ...}. From this we can create a new type of evidence by exchanging all propositions in the previous ones that are conjunctions of r terms, ∧ ∃χi, for one proposition in the new type of evidence that is of the form |χ| ≥ r. Here, 1 ≤ r ≤ n. The sum of probability of all conjunctions of length r in the previous pieces of evidence is then awarded to the focal element in this new piece of evidence which supports the proposition that |χ| ≥ r. We get

    m_χ(|χ| ≥ r) = Σ_{X ⊆ χ, |X| = r} m_χ(∧_{χi ∈ X} ∃χi),
    m_χ(Θ) = m_χ(Θ).

Taken as a whole this gives us an opinion about the probability of different numbers of subsets. In figure 5 we see the basic probability m_χ(|χ| ≥ r) for different minimal numbers of clusters over the entire iteration. Note that we are no longer talking about individual clusters but of the derived support regarding the actual number of clusters.

Fig. 5. m_χ(|χ| ≥ r) for all r over the iterations.

These newly created pieces of evidence can now be combined with a prior probability distribution, m(·), from the problem specification. This is a domain dependent distribution.
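To make these two steps concrete, a hedged Python sketch follows (the array layout, the use of numpy, and the function names are assumptions for illustration). Note that combining simple support functions that all support the same existence proposition produces no conflict, so k = 0 here:

```python
import numpy as np
from itertools import combinations

def existence_support(V, m_theta):
    """Support for the existence of each cluster from the output voltages.
    V: (M, N) output voltages; m_theta: (M,) mass each piece of evidence puts on
    the whole frame. Evidence m, discounted by V[m, n], supports 'exists chi_n'
    with mass V[m, n] * (1 - m_theta[m]); the combination is conflict-free."""
    theta = np.prod(1.0 - V * (1.0 - m_theta[:, None]), axis=0)  # m_chi_n(Theta)
    return 1.0 - theta, theta                                    # m(exists), m(Theta)

def at_least_r(exists, theta):
    """Combine the N cluster-existence bbas and award each conjunction of length r
    to the proposition |chi| >= r. Returns (m_ge, m_theta_total), where
    m_ge[r] = m_chi(|chi| >= r) for r = 1..N."""
    N = len(exists)
    m_ge = np.zeros(N + 1)
    for r in range(1, N + 1):
        for idx in combinations(range(N), r):
            rest = [j for j in range(N) if j not in idx]
            m_ge[r] += np.prod(exists[list(idx)]) * np.prod(theta[rest])
    return m_ge, float(np.prod(theta))

# The result is a bba over the number of clusters: sum(m_ge) + prod(theta) == 1.
```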
Without a prior probability distribution there is nothing to hold the clustering together, and we could find as many clusters as pieces of evidence if that was allowed by the neural structure. In these trials we have chosen a distribution where m(|χ| = r) = p(r), where p is a constant. We get

    m'(|χ| = r) = (1 - k)^{-1} m(|χ| = r) [ m_χ(Θ) + Σ_{j=1}^{r} m_χ(|χ| ≥ j) ],

where

    k = 1 - Σ_{r=1}^{n} m(|χ| = r) [ m_χ(Θ) + Σ_{j=1}^{r} m_χ(|χ| ≥ j) ]

is the conflict in that final combination. Thus, by viewing each piece of evidence in a subset as support for the existence of that subset, we are able to derive a posterior probability distribution concerned with the question of how many subsets there are. This distribution is plotted in figure 6. It is interesting to observe that the basic probabilities for propositions such as m_χ(|χ| ≥ r) have now been moved upwards when we derive the probabilities m'(|χ| = r).

Fig. 6. m'(|χ| = r) for all r over the iterations.

In figure 7 we see the same plot projected on the iteration-probability axes. We notice the initial drop off of the alternatives with five and especially with six clusters, as well as the early rise of the three and four cluster alternatives. In the first few iterations the six cluster alternative is still the preferable choice, but very quickly the preferable choice becomes five and then four clusters.
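A sketch of this prior-weighted combination (illustrative only; the function name and argument layout are assumptions): every piece of metalevel evidence |χ| ≥ j is consistent with |χ| = r exactly when j ≤ r, and the discarded mass is the conflict k.

```python
import numpy as np

def posterior(m_prior, m_ge, m_theta_total):
    """m'(|chi| = r): the prior m_prior[r-1] = m(|chi| = r) times the metalevel
    support consistent with exactly r clusters, renormalized by 1 - k."""
    n = len(m_prior)
    unnorm = np.array([m_prior[r - 1] * (m_theta_total + m_ge[1:r + 1].sum())
                       for r in range(1, n + 1)])
    return unnorm / unnorm.sum()   # division by 1 - k, with k the conflict
```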

Fig. 7. m'(|χ| = r) projected on the iteration-probability axes.

After trying to cluster the evidence into four clusters, the internal conflicts become too large and it is necessary to reclaim the fifth cluster. This is observed here in the dramatic shift midway through the iteration. After that shift, the five cluster alternative remains preferable throughout the remainder of the iteration.

Sooner or later it becomes necessary to determine how many clusters there are. The basis for making such a determination is the probability distribution in figure 7. The obvious choice is that of the highest probability, and that is how we will decide in the final iteration, but we must avoid making an early determination. As seen in figure 7, an early choice based on maximum probability is likely to be false. However, neither can we wait until the final iteration to determine the number of clusters. We must give the neural iteration a good opportunity to converge on the right problem. The idea is to accept the probability distribution in figure 7, but make a gradual determination of the number of clusters throughout the iterative process that is not final until the last iteration.

We do this by measuring how far we have traveled from the initial state until the final state when convergence is reached. Initially all output voltages of the neurons are scattered maximally among the different choices, while in the final state there is no scattering at all. A normalized Shannon entropy is thus a good measure of how far we have traveled towards the final state, the entropy being zero at the final state. The entropy at iteration t is calculated over all pieces of evidence and all clusters as

    Entropy_t = -Σ_m Σ_n V_mn^t log V_mn^t,

where the normalized entropy α_t is Entropy_t divided by Entropy_0. The normalized entropy is plotted in figure 8 as a solid line; it goes from 1 to 0 over the iterations. The gradual determination is made by substituting for the probability distribution in figures 6 and 7

    gd_t(|χ| = r) = (1 - α_t) + α_t · m'(|χ| = r),  if for all s: m'(|χ| = r) ≥ m'(|χ| = s),
    gd_t(|χ| = r) = α_t · m'(|χ| = r),  otherwise.

In figure 9 the gradual determination of the posterior probability distribution of figure 6 is plotted. In the first iteration they are identical; gradually the determination takes place, and in the final iteration an exact determination of five clusters is made. From figure 9 we observe that even though the exact determination of five clusters does not take place until the final iteration, the situation is pretty clear long before: well before the end, gd_t(|χ| = 5) has reached 0.9, giving the clustering process ample time to converge on the right number of clusters. Also, in figure 3 we noticed a good clustering taking place in the fifth cluster well before the final iteration, giving the process some additional iterations to converge.
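The entropy-based blending can be sketched as follows (illustrative; V_init stands for the output voltages at the first iteration, so that α_0 = 1):

```python
import numpy as np

def gradual_determination(V, V_init, m_post):
    """gd_t(|chi| = r) from the normalized Shannon entropy of the output voltages:
    the maximum-probability alternative is pushed toward 1 as the network settles."""
    def entropy(X):
        X = np.clip(X, 1e-12, 1.0)           # avoid log(0) in converged states
        return float(-(X * np.log(X)).sum())
    alpha = entropy(V) / entropy(V_init)     # normalized entropy alpha_t
    gd = alpha * m_post
    gd[int(np.argmax(m_post))] += 1.0 - alpha
    return gd
```

Feeding gd back into the domain term of the update rule in section 3 closes the loop between clustering and the determination of the number of clusters.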
4. PERFORMANCE

In this section we will compare the clustering performance and computational complexity of the simultaneous clustering and gradual determination of the number of clusters presented in this paper with the clustering into a known number of five clusters that was done in [2]. The comparison is done using the previously described pieces of evidence with different random basic probability assignments over ten runs. In figure 8 the metaconflict over the iterations of one of the runs resulting in five clusters is shown as a dashed line. In figure 10 the conflict per cluster is shown. We notice the tendency that the conflict drops off in one cluster at a time. Quickly the conflict drops off in cluster six and then in cluster five. Initially there is a very high conflict in cluster two, but it also drops off within the first ten iterations. The remainder of the iteration concerns mainly clusters one, three and four. Cluster three holds out until the middle of the iteration, but towards the end most of the remaining conflict is situated in cluster one.

In table 1 we find a somewhat higher computation time for the new approach, as would be expected, since we have two different converging processes running simultaneously.

Table 1: Computation time and iterations (mean).

                 # clusters unknown    # clusters known [2]
    time (s)             -                      -
    iterations           -                      -

Fig. 8. Normalized entropy α_t (solid line) and metaconflict (dashed line) over the iterations.

Fig. 9. The gradual determination of the number of clusters.

Fig. 10. Conflict per cluster during the iteration.

If we consider that the clustering into a known number of clusters has to be performed for many possible different numbers of clusters, one after the other, this can be a significant saving of time. In the ten runs of clustering the pieces of evidence into an unknown number of clusters, four of these runs ended up using five clusters, five runs used six clusters and one run used only four clusters. The use of six clusters in some of the runs is the result of a below average clustering performance. If the number of clusters had been fixed to five, we would have seen a higher than average conflict in these runs. Here, this is interpreted as a need for an additional cluster. Since the propositions of the evidence are the different subsets of the frame Θ = {1, 2, 3, 4, 5}, it will never be possible to find a conflict-free partitioning of the evidence into four clusters. The reason why such a partition occurred once is that the basic probability assignments attached to the propositions are random numbers between 0 and 1. If two propositions are in conflict and their assignments are close to zero, the numeric conflict is still a small number and might be found acceptable compared to the use of an additional cluster. If all assignments had been 1, we could never have had four clusters in this test.

The clustering performance is somewhat difficult to measure because of the different numbers of subsets. A poor clustering might yield more clusters than a good one, with a lower conflict per cluster. For this reason we will compare the performance of those runs that resulted in five clusters when clustering into an unknown number of clusters with clustering into a known five clusters. We have four runs resulting in five clusters. They are considered the four best runs out of ten. We compare them to the four best runs out of ten when clustering into a known five clusters. We also compare the four runs with the same set of random assignments when the number of clusters is known to be five, table 2.

Table 2: Metaconflict.

                                      # clusters unknown    # clusters known
    4 best runs of 10    best of 4           -                     -
                         mean of 4           -                     -
    runs with same       best of 4           -                     -
    assignments          mean of 4           -                     -

As expected, the conflict of the new approach is higher, but the actual numbers are quite small. The best criterion of a good clustering performance is the conflict per cluster and per piece of evidence. These are tabulated in table 3. We find only a small mean conflict per cluster and piece of evidence in the case of clustering into an unknown number of clusters. This can be compared with the average numeric conflict of 25% between two conflicting pieces of evidence.

Table 3: Mean metaconflict per cluster and per piece of evidence.

                                      # clusters unknown    # clusters known
    4 best runs of 10    per cluster         -                     -
                         per evidence        -                     -
    runs with same       per cluster         -                     -
    assignments          per evidence        -                     -

5. CONCLUSIONS

We have demonstrated that it is possible to use a neural network structure to perform simultaneous clustering of Dempster-Shafer evidence with a gradual determination of the number of clusters when this is unknown. We found the computational and clustering performance to be almost as good as when clustering into a fixed number of clusters. This approach is advantageous, since now only one clustering process has to be performed.

REFERENCES

[1] G. Shafer, A Mathematical Theory of Evidence, Princeton University Press, Princeton, 1976.

[2] J. Schubert, Fast Dempster-Shafer Clustering Using a Neural Network Structure, in Proc. Seventh Int. Conf. Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU'98), Université de La Sorbonne, Paris, France, July 1998, Editions EDK, Paris, 1998.

[3] J. Schubert, On Nonspecific Evidence, Int. J. Intell. Syst., Vol. 8, pp. 711-725, 1993.

[4] J. Schubert, Specifying Nonspecific Evidence, Int. J. Intell. Syst., Vol. 11, pp. 525-563, 1996.
[5] J. Schubert, Finding a Posterior Domain Probability Distribution by Specifying Nonspecific Evidence, Int. J. Uncertainty, Fuzziness and Knowledge-Based Syst., Vol. 3, pp. 163-185, 1995.

[6] J. Schubert, Cluster-based Specification Techniques in Dempster-Shafer Theory, in Symbolic and Quantitative Approaches to Reasoning and Uncertainty, Proc. European Conf. (ECSQARU'95), University of Fribourg, Switzerland, July 1995, Springer-Verlag (LNAI 946), Berlin, 1995.

[7] J. Schubert, Creating Prototypes for Fast Classification in Dempster-Shafer Clustering, in Qualitative and Quantitative Practical Reasoning, Proc. First Int. Joint Conf. (ECSQARU-FAPR'97), Bad Honnef, Germany, June 1997, Springer-Verlag (LNAI 1244), Berlin, 1997.

[8] J. Schubert, A Neural Network and Iterative Optimization Hybrid for Dempster-Shafer Clustering, in Proc. EuroFusion98 Int. Conf. Data Fusion, Great Malvern, UK, October 1998.

[9] J.J. Hopfield and D.W. Tank, Neural Computation of Decisions in Optimization Problems, Biol. Cybern., Vol. 52, pp. 141-152, 1985.

[10] U. Bergsten, J. Schubert and P. Svensson, Applying Data Mining and Machine Learning Techniques to Submarine Intelligence Analysis, in Proc. Third Int. Conf. on Knowledge Discovery and Data Mining (KDD'97), Newport Beach, USA, August 1997, The AAAI Press, Menlo Park, 1997.

[11] J. Schubert, Cluster-based Specification Techniques in Dempster-Shafer Theory for an Evidential Intelligence Analysis of Multiple Target Tracks, Ph.D. Thesis, TRITA-NA Report, Royal Institute of Technology, Sweden, 1994.

[12] J. Schubert, Cluster-based Specification Techniques in Dempster-Shafer Theory for an Evidential Intelligence Analysis of Multiple Target Tracks (Thesis Abstract), AI Communications, Vol. 8, 1995.
