Modified Recursive Prediction Error Algorithm For Training Layered Neural Network
Mohd Yusoff Mashor
Centre for Electronic Intelligent Systems (CELIS), School of Electrical and Electronic Engineering, Universiti Sains Malaysia, Pulau Pinang, MALAYSIA. Email:

ABSTRACT

Back propagation is a steepest descent type algorithm that normally has a slow learning rate, and its search for the global minimum often becomes trapped at poor local minima. This paper proposes an algorithm called the modified recursive prediction error (MRPE) algorithm for training multilayered perceptron networks. MRPE is a modified version of the recursive prediction error (RPE) algorithm. RPE and MRPE are Gauss-Newton type algorithms, which generally provide better performance than steepest descent type algorithms such as back propagation. The current study investigates the performance of the MRPE algorithm for training MLP networks and compares it to the well-known back propagation algorithm. Three data sets were used for the comparison. It is found that the proposed MRPE performs much better than the back propagation algorithm.

1. INTRODUCTION

Nowadays, artificial neural networks are studied and applied in various disciplines such as neurobiology, psychology, computer science, cognitive science, engineering, economics and medicine. Tan et al. (1992) used neural networks for forecasting the US-Singapore dollar exchange rate. Linkens and Nie (1993) applied a neuro-fuzzy controller to a problem of multivariable blood pressure control. Yu et al. (1993) used neural networks to solve the travelling salesman problem and the map-colouring problem. Arad et al. (1994) used RBF networks to recognise human facial expressions based on 2D and 3D models of the face. Rosenblum and Davis (1994) applied RBF networks to a vehicle visual autonomous road following system. Many applications of artificial neural networks are inspired by the ability of the networks to demonstrate brain-like behaviour. Applications of artificial neural networks in these diverse fields have made it possible to tackle some problems that were previously considered very difficult or unsolved.
The multilayered perceptron (MLP) network trained using the back propagation (BP) algorithm is the most popular choice in neural network applications, and it has been shown that such networks can provide satisfactory results. However, the MLP network and BP algorithm can be considered
basic to neural network studies. For example, RBF and HMLP networks have been shown to provide much better performance than the MLP network (Chen, 1992; Mashor, 1999). Back propagation is a steepest descent type algorithm that normally has a slow convergence rate, and its search for the global minimum often becomes trapped at poor local minima. This paper proposes an algorithm called the modified recursive prediction error (MRPE) algorithm for training multilayered perceptron networks. MRPE is a modified version of the recursive prediction error (RPE) algorithm. RPE and MRPE are Gauss-Newton type algorithms, which generally provide better performance than steepest descent type algorithms such as back propagation. The current study investigates the performance of the MRPE algorithm for training MLP networks and compares its performance to the RPE algorithm and the well-known back propagation algorithm. The comparison was carried out by using MLP networks trained with the three algorithms to perform non-linear system identification.

2. MULTILAYERED PERCEPTRON NETWORKS

The MLP network is a feed forward neural network with one or more hidden layers. Cybenko (1989) and Funahashi (1989) proved that the MLP network is a general function approximator and that a one-hidden-layer network is sufficient to approximate any continuous function up to a certain accuracy. An MLP network with two hidden layers is shown in Figure 1. The input layer acts as an input data holder that distributes the input to the first hidden layer. The outputs from the first hidden layer then become the inputs to the second layer, and so on. The last layer acts as the network output layer. A hidden neuron performs two functions: a combining function and an activation function. The output of the k-th neuron of the j-th hidden layer is given by

v_k^j(t) = F\left( \sum_{i=1}^{n_{j-1}} w_{ik}^j v_i^{j-1}(t) + b_k^j \right); for 1 \le k \le n_j   (1)

and if the m-th layer is the output layer, then the output of the l-th neuron, \hat{y}_l, of the output layer is given by

\hat{y}_l(t) = \sum_{i=1}^{n_{m-1}} w_{il}^m v_i^{m-1}(t); for 1 \le l \le n_o   (2)

where n_j, n_o, the w's, the b's and F(.) are the number of neurons in the j-th layer, the number of neurons in the output layer, the weights, the thresholds and the activation function respectively.

Figure 1: Multilayered perceptron network

International Journal of The Computer, The Internet and Management, Vol., No.2, 2003, pp.
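To make the layered computation of equations (1) and (2) concrete, here is a minimal NumPy sketch. The function names and array shapes are my own choices; it assumes a sigmoid activation for the hidden layers and a linear, bias-free output layer, as described above.

```python
import numpy as np

def sigmoid(v):
    # The usual sigmoid activation: F(v) = 1 / (1 + exp(-v))
    return 1.0 / (1.0 + np.exp(-v))

def mlp_forward(x, weights, biases):
    # Layered forward pass in the spirit of equations (1)-(2):
    # each hidden layer computes F(sum_i w_ij * v_i + b_j);
    # the final (output) layer is linear and has no bias.
    v = x
    for W, b in zip(weights[:-1], biases):
        v = sigmoid(v @ W + b)   # eq. (1), one hidden layer
    return v @ weights[-1]       # eq. (2), linear output layer

# Tiny example: 2 inputs, one hidden layer of 3 sigmoid nodes, 1 output.
x = np.zeros(2)
weights = [np.zeros((2, 3)), np.ones((3, 1))]
biases = [np.zeros(3)]
y_hat = mlp_forward(x, weights, biases)   # hidden outputs are all 0.5 here
```

With zero weights and biases every hidden node outputs 0.5, so the single output node sums three 0.5 values.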
In the current study, a network with a single output node and a single hidden layer was used, i.e. m = 2 and n_o = 1. With these simplifications the network output is:

\hat{y}(t) = \sum_{j=1}^{n_1} w_j^2 F\left( \sum_{i=1}^{n_r} w_{ij}^1 v_i(t) + b_j^1 \right)   (3)

where n_r is the number of nodes in the input layer. The activation function F(.) is selected to be

F(v(t)) = \frac{1}{1 + e^{-v(t)}}   (4)

The weights w and thresholds b are unknown and should be selected to minimise the prediction error defined as

\varepsilon(t) = y(t) - \hat{y}(t)   (5)

where y(t) is the actual output and \hat{y}(t) is the network output.

3. TRAINING ALGORITHMS

This section briefly presents the back propagation and the proposed MRPE algorithms. The back propagation algorithm with momentum has been used in this study; it is well known that this version of back propagation has a better learning rate than the original back propagation.

3.1 Back Propagation Algorithm

The back propagation algorithm was initially introduced by Werbos (1974) and further developed by Rumelhart and McClelland (1986). Back propagation is a steepest descent type algorithm where the weight connection w_{ij}^k between the i-th neuron of the (k-1)-th layer and the j-th neuron of the k-th layer, and the threshold b_j^k, are respectively updated according to

w_{ij}^k(t) = w_{ij}^k(t-1) + \Delta w_{ij}^k(t)
b_j^k(t) = b_j^k(t-1) + \Delta b_j^k(t)   (6)

with the increments \Delta w_{ij}^k(t) and \Delta b_j^k(t) given by

\Delta w_{ij}^k(t) = \eta_w \rho_j^k(t) v_i^{k-1}(t) + \alpha_w \Delta w_{ij}^k(t-1)
\Delta b_j^k(t) = \eta_b \rho_j^k(t) + \alpha_b \Delta b_j^k(t-1)   (7)

where the subscripts w and b represent the weight and threshold respectively; \alpha_w and \alpha_b are momentum constants which determine the influence of the past parameter changes on the current direction of movement in the parameter space; \eta_w and \eta_b represent the learning rates; and \rho_j^k(t) is the error signal of the j-th neuron of the k-th layer, which is back propagated through the network. Since the activation function of the output neuron is linear, the error signal at the output node is

\rho^m(t) = y(t) - \hat{y}(t)   (8)

and for the neurons in the hidden layers

\rho_j^k(t) = F'\left(v_j^k(t)\right) \sum_{l=1}^{n_{k+1}} \rho_l^{k+1}(t) w_{jl}^{k+1}(t); k = m-1, \ldots, 2, 1   (9)

where F'(v_j^k(t)) is the first derivative of F(v_j^k(t)) with respect to v_j^k(t).

Since the back propagation algorithm is a steepest descent type algorithm, it suffers from a slow convergence rate.
The search for the global minimum may become trapped at local minima, and the algorithm can be sensitive to the user-selectable parameters.
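One BP-with-momentum update for the single-hidden-layer network of equation (3) can be sketched as follows, following equations (6)-(9). The function name, the `state` dictionary holding the previous increments, and the default parameter values are my own illustrative choices.

```python
import numpy as np

def bp_step(x, y, w1, b1, w2, state, eta=0.1, alpha=0.8):
    # One steepest-descent step with momentum (equations (6)-(9)).
    # Shapes: x (n_r,), w1 (n_r, n1), b1 (n1,), w2 (n1,);
    # state holds the previous increments for the momentum terms.
    hidden = 1.0 / (1.0 + np.exp(-(x @ w1 + b1)))
    y_hat = float(w2 @ hidden)
    rho_out = y - y_hat                      # eq. (8): linear output node
    # eq. (9): back-propagated error signal; F'(v) = F(v)(1 - F(v)) for the sigmoid
    rho_hid = hidden * (1.0 - hidden) * (rho_out * w2)
    dw2 = eta * rho_out * hidden + alpha * state["dw2"]   # eq. (7)
    dw1 = eta * np.outer(x, rho_hid) + alpha * state["dw1"]
    db1 = eta * rho_hid + alpha * state["db1"]
    state.update(dw1=dw1, db1=db1, dw2=dw2)
    return w1 + dw1, b1 + db1, w2 + dw2, state            # eq. (6)
```

Each call performs one parameter update for one training sample; iterating over the data set for several epochs gives the training scheme discussed above.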
3.2 Modified Recursive Prediction Error Algorithm

The recursive prediction error (RPE) algorithm was originally derived by Ljung and Soderstrom (1983) and modified by Chen et al. (1990) to train MLP networks. The RPE algorithm is a Gauss-Newton type algorithm that will generally give better performance than a steepest descent type algorithm such as the back propagation algorithm. In the present study, the convergence rate of the RPE algorithm is further improved by using an optimised momentum and learning rate: the momentum and learning rate in this research are varied, in contrast to the constant values used by Chen et al. (1990). The RPE algorithm modified by Chen et al. (1990) minimises the following cost function,

J(\Theta) = \frac{1}{2N} \sum_{t=1}^{N} \varepsilon^T(t, \hat{\Theta}) \Lambda^{-1} \varepsilon(t, \hat{\Theta})   (10)

by updating the estimated parameter vector \hat{\Theta} (consisting of the w's and b's) recursively using the Gauss-Newton algorithm:

\hat{\Theta}(t) = \hat{\Theta}(t-1) + P(t) \Delta(t)   (11)

\Delta(t) = \alpha_m(t) \Delta(t-1) + \alpha_g(t) \psi(t) \varepsilon(t)   (12)

where \varepsilon(t) and \Lambda are the prediction error and an m x m symmetric positive definite matrix respectively, m is the number of output nodes, and \alpha_m(t) and \alpha_g(t) are the momentum and learning rate respectively. \alpha_m(t) and \alpha_g(t) can be arbitrarily assigned values between 0 and 1; typical values of \alpha_m(t) and \alpha_g(t) are close to 1 and 0 respectively. In the present study, \alpha_m and \alpha_g are varied to further improve the convergence rate of the RPE algorithm according to:

\alpha_m(t) = \alpha_m(t-1) + a   (13)

\alpha_g(t) = \alpha_g(0) \left( 1 - \alpha_m(t) \right)   (14)

where a is a small constant (typically a = 0.01) and \alpha_m(0) is normally initialised to a small value close to 0. \psi(t) represents the gradient of the one step ahead predicted output with respect to the network parameters:

\psi(t, \Theta) = \frac{d\hat{y}(t, \Theta)}{d\Theta}   (15)

P(t) in equation (11) is updated recursively according to:

P(t) = \frac{1}{\lambda(t)} \left[ P(t-1) - P(t-1) \psi(t) \gamma^{-1}(t) \psi^T(t) P(t-1) \right]
\gamma(t) = \lambda(t) I + \psi^T(t) P(t-1) \psi(t)   (16)

where \lambda(t) is the forgetting factor, 0 < \lambda(t) < 1, normally updated using the following scheme (Ljung and Soderstrom, 1983):

\lambda(t) = \lambda_0 \lambda(t-1) + (1 - \lambda_0)   (17)

where \lambda_0 and the initial forgetting factor \lambda(0) are design values.
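The three scalar schedules, equations (13), (14) and (17), can be sketched in a few lines. This is a hedged illustration: the function name and default values are mine, equation (14) is taken as \alpha_g(t) = \alpha_g(0)(1 - \alpha_m(t)), and the cap b on the momentum anticipates step (v) of the implementation given below.

```python
def update_schedules(alpha_m_prev, lam_prev, *, a=0.01, b=0.85,
                     alpha_g0=0.2, lam0=0.99):
    # eq. (13): momentum ramps up by the small constant a, capped at b
    alpha_m = min(alpha_m_prev + a, b)
    # eq. (14): learning rate shrinks as the momentum grows
    alpha_g = alpha_g0 * (1.0 - alpha_m)
    # eq. (17): forgetting factor driven towards 1 at a rate set by lam0
    lam = lam0 * lam_prev + (1.0 - lam0)
    return alpha_m, alpha_g, lam
```

Called once per training sample, the momentum rises from near 0 to b while the learning rate decays proportionally, and the forgetting factor approaches 1 so that older samples are discounted less and less.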
The initial value of the P(t) matrix, P(0), is normally set to \alpha I, where I is the identity matrix and \alpha is a constant, typically between 100 and 10000. A small value of \alpha will cause slow learning, whereas too large an \alpha may prevent the estimated parameters from converging properly. Hence, \alpha should be selected as a compromise between the two; \alpha = 1000 is adequate for most cases.
The gradient matrix \psi(t) for a one-hidden-layer MLP network can be obtained by differentiating equation (3) with respect to the parameters \theta, to yield:

\psi(t) = \frac{d\hat{y}}{d\theta} =
  v_j^1(t), if \theta = w_j^2, 1 \le j \le n_h
  w_j^2 v_j^1 (1 - v_j^1) v_i^0(t), if \theta = w_{ij}^1, 1 \le i \le n_r, 1 \le j \le n_h
  w_j^2 v_j^1 (1 - v_j^1), if \theta = b_j^1, 1 \le j \le n_h   (18)

The above gradient matrix is derived based on the sigmoid function; if other activation functions are used, the matrix should be changed accordingly. The modified recursive prediction error (MRPE) algorithm for a one-hidden-layer MLP network can be implemented as follows:

i) Initialise the weights, thresholds, P(0), a, b, \alpha_m(0), \lambda_0 and \lambda(0).
ii) Present the inputs to the network and compute the network outputs according to equation (3).
iii) Calculate the prediction error according to equation (5) and compute the matrix \psi(t) according to equation (18). Note that the elements of \psi(t) should be calculated from the output layer down to the hidden layer.
iv) Compute \lambda(t) and P(t) according to equations (17) and (16) respectively.
v) If \alpha_m(t) < b, update \alpha_m(t) according to equation (13).
vi) Update \alpha_g(t) and then \Delta(t) according to equations (14) and (12) respectively.
vii) Update the parameter vector \hat{\Theta}(t) according to equation (11).
viii) Repeat steps (ii) to (vii) for each training data sample.

The design parameter b in step (v) is the upper limit of the momentum, with a typical value between 0.8 and 0.9, so the momentum is increased for each data sample from a small value (normally close to 0) up to this limit.

4. MODELLING NON-LINEAR SYSTEMS USING MLP NETWORKS

Modelling using MLP networks can be considered as fitting a surface in a multidimensional space to represent the training data set, and then using the surface to predict over the testing data set. A wide class of non-linear systems can be represented by the non-linear auto-regressive moving average with exogenous input (NARMAX) model (Leontaritis and Billings, 1985).
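The MRPE recursion of section 3.2 (steps ii to vii, with equations (11), (12), (16) and (18)) can be sketched for a single-output, one-hidden-layer network as one function per training sample. The parameter packing, function name and argument conventions are my own; for a single output node \gamma(t) reduces to a scalar.

```python
import numpy as np

def mrpe_step(x, y, theta, P, Delta, alpha_m, alpha_g, lam, n_hidden):
    # One recursive Gauss-Newton update, steps (ii)-(vii) of the MRPE algorithm.
    # theta packs [w2 (n_h), w1 (n_r*n_h, row-major), b1 (n_h)]; linear output node.
    n_r = x.size
    w2 = theta[:n_hidden]
    w1 = theta[n_hidden:n_hidden + n_r * n_hidden].reshape(n_r, n_hidden)
    b1 = theta[-n_hidden:]
    v1 = 1.0 / (1.0 + np.exp(-(x @ w1 + b1)))       # sigmoid hidden outputs
    y_hat = float(w2 @ v1)                          # eq. (3)
    eps = y - y_hat                                 # eq. (5)
    # eq. (18): gradient of y_hat w.r.t. each parameter (sigmoid hidden layer)
    s = w2 * v1 * (1.0 - v1)
    psi = np.concatenate([v1, np.outer(x, s).ravel(), s])
    # eq. (16): recursive update of P(t); gamma is scalar for one output node
    Ppsi = P @ psi
    gamma = lam + psi @ Ppsi
    P = (P - np.outer(Ppsi, Ppsi) / gamma) / lam
    Delta = alpha_m * Delta + alpha_g * psi * eps   # eq. (12)
    theta = theta + P @ Delta                       # eq. (11)
    return theta, P, Delta, eps
```

In use, \lambda(t), \alpha_m(t) and \alpha_g(t) would be refreshed from their schedules (equations (13), (14) and (17)) before each call, and the loop over the training samples implements step (viii).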
The NARMAX model can be expressed in terms of a non-linear function expansion of lagged input, output and noise terms as follows:

y(t) = f_s\left( y(t-1), \ldots, y(t-n_y), u(t-1), \ldots, u(t-n_u), e(t-1), \ldots, e(t-n_e) \right) + e(t)   (19)

where y(t), u(t) and e(t) are the system output, input and noise vectors respectively, and n_y, n_u and n_e are the maximum lags in the output, input and noise vectors respectively.
The non-linear function f_s(.) is normally very complicated and rarely known a priori for practical systems. If the mechanisms of a system are known, the function f_s(.) can be derived from the functions that govern those mechanisms. In the case of an unknown system, f_s(.) is normally constructed based on observations of the input and output data. In the present study, MLP networks are used to model the input-output relationship. In other words, f_s(.) is approximated using equation (3), where F is selected to be the sigmoid function. The network input vector v(t) is formed from the lagged input, output and noise terms, denoted u(t-1), \ldots, u(t-n_u), y(t-1), \ldots, y(t-n_y) and e(t-1), \ldots, e(t-n_e) respectively in equation (19).

The final stage in system identification is model validation. There are several ways of testing a model, such as one step ahead predictions (OSA), model predicted outputs (MPO), mean squared error (MSE), correlation tests and chi-square tests. In the present study, only the OSA and MSE tests are used, since it is not easy to see performance differences using the other tests. OSA is a common measure of the predictive accuracy of a model that has been considered by many researchers. OSA can be expressed as:

\hat{y}(t) = f_s\left( u(t-1), \ldots, u(t-n_u), y(t-1), \ldots, y(t-n_y), \hat{\varepsilon}(t-1, \hat{\theta}), \ldots, \hat{\varepsilon}(t-n_e, \hat{\theta}) \right)   (20)

and the residual or prediction error is defined as:

\hat{\varepsilon}(t, \hat{\theta}) = y(t) - \hat{y}(t)   (21)

where f_s(.) is a non-linear function, in this case the MLP network. A good model will normally give good predictions; however, a model that has a good one step ahead prediction might not always be unbiased. The model may be significantly biased, and prediction over a different set of data often reveals this problem. Splitting the data into two sets, a training set and a testing set, can normally detect this condition. MSE is an iterative method of model validation where the model is tested by calculating the mean squared errors after each training step. The MSE test indicates how fast the prediction error or residual converges with the number of training data.
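Two small helpers illustrate this validation machinery: building the lagged regressor vector that feeds the network for OSA prediction, and the mean squared error over a validation set. The names and calling conventions are my own.

```python
def narmax_regressors(u, y, e, n_u, n_y, n_e, t):
    # Lagged input, output and noise terms forming the network input vector
    # v(t): [u(t-1)..u(t-n_u), y(t-1)..y(t-n_y), e(t-1)..e(t-n_e)]
    return ([u[t - i] for i in range(1, n_u + 1)]
            + [y[t - i] for i in range(1, n_y + 1)]
            + [e[t - i] for i in range(1, n_e + 1)])

def mse(y, y_hat):
    # Mean squared error over the n_d samples of a validation set
    n_d = len(y)
    return sum((a - b) ** 2 for a, b in zip(y, y_hat)) / n_d
```

Computing `mse` on the testing set after each training step, as described above, traces out how fast the residual converges with the number of training samples.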
The MSE at the t-th training step is given by:

MSE(t, \Theta(t)) = \frac{1}{n_d} \sum_{i=1}^{n_d} \left( y(i) - \hat{y}(i, \Theta(t)) \right)^2   (22)

where MSE(t, \Theta(t)) and \hat{y}(i, \Theta(t)) are the MSE and OSA for a given set of estimated parameters \Theta(t) after t training steps respectively, and n_d is the number of data used to calculate the MSE.

5. SIMULATION RESULTS

The performance of MLP networks trained using the BP, RPE and MRPE algorithms presented in section 3 was compared. One simulated and two real data sets were used for this comparison. The networks were used to perform system identification, and the resulting models were used to produce the OSA and MSE tests.
Example 1

The first data set is a simulated system defined by the following difference equation:

y(t) = 0.3 y(t-1) + 0.6 y(t-2) + u^3(t-1) - 0.3 u^2(t-1) - 0.4 u(t-1) + e(t)

where e(t) is a Gaussian white noise sequence with zero mean, and the input u(t) is a uniformly random sequence between (-1, +1). This system was used to generate 1000 pairs of input and output data. The first 600 data were used to train the network and the remaining 400 data were used to test the fitted model. The network was trained based on the following configuration:

v(t) = [ u(t-1)  y(t-1)  y(t-2) ]^T

All the network models have the same input v(t) and 7 hidden nodes but different training algorithms. The design parameters for the BP, RPE and MRPE algorithms were set as follows:

BP algorithm: \eta_w = \eta_b = and \alpha_w = \alpha_b = .

RPE algorithm: P(0) = 1000I, \alpha_m = 0.85, \alpha_g = 0.1, \lambda_0 = 0.99 and \lambda(0) = 0.95.

MRPE algorithm: P(0) = 1000I, b = 0.85, \alpha_g(0) = 0.2, a = 0.01, \lambda_0 = 0.99 and \lambda(0) = 0.95.

The MSE calculated over the training and testing data sets for the network models trained using the BP, RPE and MRPE algorithms are shown in Figures 2 and 3 respectively. In this example the network models trained using the MRPE and RPE algorithms produced much better MSE than the one trained using the BP algorithm. These figures also indicate that the MRPE and RPE algorithms produced about the same performance. The OSA tests over the training and testing data sets for the network models trained using BP, RPE and MRPE are shown in Figures 4, 5 and 6 respectively. These plots again show that RPE and MRPE have similar performance. However, the OSA test produced by the network trained using the BP algorithm is much worse than those of the network models trained using the RPE and MRPE algorithms; the OSA test in Figure 4 shows that the BP-trained network fails to predict properly.

Figure 2: MSE calculated over the training data set
Figure 3: MSE calculated over the testing data set
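The simulated system of Example 1 can be reproduced with a short script. The random seed and the noise standard deviation are illustrative assumptions of mine; the source fixes only a zero-mean Gaussian noise.

```python
import numpy as np

rng = np.random.default_rng(0)          # seed assumed, for reproducibility
N = 1000
u = rng.uniform(-1.0, 1.0, N)           # uniformly random input in (-1, +1)
e = rng.normal(0.0, 0.1, N)             # zero-mean Gaussian noise; std assumed

y = np.zeros(N)
for t in range(2, N):
    # Difference equation of Example 1
    y[t] = (0.3 * y[t - 1] + 0.6 * y[t - 2]
            + u[t - 1] ** 3 - 0.3 * u[t - 1] ** 2 - 0.4 * u[t - 1] + e[t])

# First 600 samples for training, remaining 400 for testing
y_train, y_test = y[:600], y[600:]
```

The network input at time t would then be the vector [u(t-1), y(t-1), y(t-2)] given in the configuration above.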
Figure 4: OSA test for the BP algorithm
Figure 5: OSA test for the RPE algorithm
Figure 6: OSA test for the MRPE algorithm

Example 2

The second data set was taken from a heat exchanger system and consists of 1000 samples. The first 500 data were used to train the network and the remaining 500 data were used to test the fitted network model. The network was trained using the following specification:

v(t) = [ u(t-1)  u(t-2)  y(t-1)  y(t-4)  e(t-3)  e(t-4)  e(t-5) ]^T and a bias input

All the network models have the same structure but different training algorithms. The design parameters for the BP, RPE and MRPE algorithms were set as follows:

BP algorithm: \eta_w = \eta_b = and \alpha_w = \alpha_b = 0.85.

RPE algorithm: P(0) = 1000I, \alpha_m = 0.85, \alpha_g = 0.1, \lambda_0 = 0.99 and \lambda(0) = 0.95.

MRPE algorithm: P(0) = 1000I, b = 0.85, \alpha_g(0) = 0.6, a = 0.01, \lambda_0 = 0.99 and \lambda(0) = 0.95.

The MSE calculated over the training and testing data sets for the network models trained using the BP and MRPE algorithms are shown in Figures 7 and 8 respectively. These figures show that the MRPE algorithm produced significantly better MSE than the RPE algorithm, and much better MSE than the BP algorithm.
Figure 7: MSE calculated over the training data set
Figure 8: MSE calculated over the testing data set

The OSA tests over the training and testing data sets for the network models trained using BP, RPE and MRPE are shown in Figures 9, 10 and 11 respectively. The result in Figure 9 reconfirms that the network trained using the BP algorithm cannot predict properly: its predictions over both the training and testing data sets are not satisfactory. The networks trained using the RPE and MRPE algorithms, on the other hand, predict very well over both the training and testing data sets. Referring to Figures 10 and 11, it can be said that the network trained using the MRPE algorithm gives significantly better prediction (OSA test) than the network trained using the RPE algorithm.

Figure 9: OSA test for the BP algorithm
Figure 10: OSA test for the RPE algorithm
Figure 11: OSA test for the MRPE algorithm
Example 3

A data set of 1000 input-output samples was taken from a tension leg platform; a description of the process can be found in Mashor (1995). The first 600 data samples were used for training and the next 400 data samples were used for testing. The network was trained using the following specification:

v(t) = [ u(t-1) ... u(t-8)  y(t-1) ... y(t-4)  e(t-3)  e(t-5) ]^T and a bias input

BP algorithm: \eta_w = \eta_b = 0.001 and \alpha_w = \alpha_b = .

RPE algorithm: P(0) = 1000I, \alpha_m = 0.85, \alpha_g = 0.07, \lambda_0 = 0.99 and \lambda(0) = 0.95.

MRPE algorithm: P(0) = 1000I, b = 0.85, \alpha_g(0) = 0.4, a = 0.01, \lambda_0 = 0.99 and \lambda(0) = 0.95.

The MSE calculated over the training and testing data sets for the network models trained using the BP, RPE and MRPE algorithms are shown in Figures 12 and 13 respectively. In this example the network models trained using the MRPE and RPE algorithms produced much better MSE than the one trained using the BP algorithm, and these figures also indicate that MRPE is significantly better than the RPE algorithm. Figures 12 and 13 also suggest that the network trained using the BP algorithm cannot learn properly, since the convergence of its MSE values is not significant.

Figure 12: MSE calculated over the training data set
Figure 13: MSE calculated over the testing data set
Figure 14: OSA test for the BP algorithm
Figure 15: OSA test for the RPE algorithm
Figure 16: OSA test for the MRPE algorithm

The OSA tests over the training and testing data sets for the network models trained using BP, RPE and MRPE are shown in Figures 14, 15 and 16 respectively. For this example it is quite hard to distinguish a performance advantage between the networks trained using the MRPE and RPE algorithms. However, the OSA tests produced by these two networks are much better than the one produced by the network model trained using the BP algorithm.

All three examples show that the MLP network trained using the BP algorithm could not learn properly, because the learning rate of BP is very slow. Further analysis was therefore carried out to check how well BP performs with more training epochs. Figures 17, 18 and 19 show the MSE plots over the training data set produced by the BP algorithm after 100 epochs for the systems in examples 1, 2 and 3 respectively. Each figure has three MSE plots for different learning rates. The network specifications were the same as in the previous analysis, except for the learning rates, which were assigned as indicated in the respective figures. Comparing these results with those in Figures 2, 7 and 12, it was found that the BP algorithm cannot train the MLP network as well as RPE and MRPE even after 100 training epochs. Therefore, it can be deduced that the learning rate of the BP algorithm is very slow compared to the MRPE and RPE algorithms.

Figure 17: MSE for the system in example 1
Figure 18: MSE for the system in example 2
Figure 19: MSE for the system in example 3

6. CONCLUSION

The MRPE algorithm is proposed to train MLP networks, and its performance was compared to the RPE and BP algorithms. Three data sets were used to test the performance of the algorithms. The MSE and OSA tests for the examples indicated that MRPE significantly improves the performance of the RPE algorithm, especially for the two real data sets. The results also showed that both the RPE and MRPE algorithms are much better than the BP algorithm. The performance of the BP algorithm with 100 training epochs still cannot compete with the MRPE algorithm with one training epoch. Hence, it can be concluded that MRPE has a much faster learning rate than the BP algorithm and does not require multiple training epochs.

REFERENCES

[1] Arad, N., Dyn, N., Reisfeld, D., and Yeshurun, Y., 1994, Image warping by radial basis functions: application to facial expressions, CVGIP: Graphical Models and Image Processing, 56(2).

[2] Chen, S., Cowan, C.F.N., Billings, S.A., and Grant, P.M., 1990, A parallel recursive prediction error algorithm for training layered neural networks, Int. J. Control, 51(6).

[3] Chen, S., and Billings, S.A., 1992, Neural networks for non-linear dynamic system modelling and identification, Int. J. of Control, 56(2).

[4] Cybenko, G., 1989, Approximations by superposition of a sigmoidal function, Mathematics of Control, Signals and Systems, 2.

[5] Funahashi, K., 1989, On the approximate realisation of continuous mappings by neural networks, Neural Networks, 2.

[6] Leontaritis, I.J., and Billings, S.A., 1985, Input-output parametric models for non-linear systems. Part I - Deterministic non-linear systems. Part II - Stochastic non-linear systems, Int. J. Control, 41.
[7] Linkens, D.A., and Nie, J., 1993, Fuzzified RBF network-based learning control: structure and self-construction, IEEE Int. Conf. on Neural Networks, 2.

[8] Ljung, L., and Soderstrom, T., 1983, Theory and Practice of Recursive Identification, MIT Press, Cambridge.

[9] Mashor, M.Y., 1995, System Identification Using Radial Basis Function Network, PhD Thesis, University of Sheffield, United Kingdom.

[10] Mashor, M.Y., 1999, Performance comparison between HMLP and MLP networks, Int. Conf. on Robotics, Vision & Parallel Processing for Automation (ROVPIA 99).

[11] Rosenblum, M., and Davis, L.S., 1994, An improved radial basis function network for visual autonomous road following, Proc. of the SPIE - The Int. Society for Optical Eng., 2103.

[12] Rumelhart, D.E., and McClelland, J.L., 1986, Parallel Distributed Processing: Explorations in the Microstructure of Cognition, MIT Press, Cambridge, MA.

[13] Tan, P.Y., Li, G., Chua, K., Wong, F.S., and Neo, S., 1992, Comparative studies among neural nets, radial basis functions and regression methods, ICARCV '92, Second Int. Conf. on Automation, Robotics and Computer Vision, NW-3.3/1-6.

[14] Yu, D.H., Jia, J., and Mori, S., 1993, A new neural network algorithm with the orthogonal optimised parameters to solve the optimal problems, IEICE Trans. on Fundamentals of Electronics, Comm. and Computer Sciences, E76-A(9).

[15] Werbos, P.J., 1974, Beyond Regression: New Tools for Prediction and Analysis in the Behavioural Sciences, Ph.D. Thesis, Harvard University.
More informationLECTURE :FACTOR ANALYSIS
LCUR :FACOR ANALYSIS Rta Osadchy Based on Lecture Notes by A. Ng Motvaton Dstrbuton coes fro MoG Have suffcent aount of data: >>n denson Use M to ft Mture of Gaussans nu. of tranng ponts If
More informationy new = M x old Feature Selection: Linear Transformations Constraint Optimization (insertion)
Feature Selecton: Lnear ransforatons new = M x old Constrant Optzaton (nserton) 3 Proble: Gven an objectve functon f(x) to be optzed and let constrants be gven b h k (x)=c k, ovng constants to the left,
More informationQuantum Particle Motion in Physical Space
Adv. Studes Theor. Phys., Vol. 8, 014, no. 1, 7-34 HIKARI Ltd, www.-hkar.co http://dx.do.org/10.1988/astp.014.311136 Quantu Partcle Moton n Physcal Space A. Yu. Saarn Dept. of Physcs, Saara State Techncal
More informationIntelligent Systems: Reasoning and Recognition. Artificial Neural Networks
Intelligent Systes: Reasoning and Recognition Jaes L. Crowley MOSIG M1 Winter Seester 2018 Lesson 7 1 March 2018 Outline Artificial Neural Networks Notation...2 Introduction...3 Key Equations... 3 Artificial
More informationPattern Recognition and Machine Learning. Artificial Neural networks
Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lessons 7 20 Dec 2017 Outline Artificial Neural networks Notation...2 Introduction...3 Key Equations... 3 Artificial
More informationDescription of the Force Method Procedure. Indeterminate Analysis Force Method 1. Force Method con t. Force Method con t
Indeternate Analyss Force Method The force (flexblty) ethod expresses the relatonshps between dsplaceents and forces that exst n a structure. Prary objectve of the force ethod s to deterne the chosen set
More informationQuick Visit to Bernoulli Land
Although we have een the Bernoull equaton and een t derved before, th next note how t dervaton for an uncopreble & nvcd flow. The dervaton follow that of Kuethe &Chow ot cloely (I lke t better than Anderon).
More informationTransfer Functions. Convenient representation of a linear, dynamic model. A transfer function (TF) relates one input and one output: ( ) system
Transfer Functons Convenent representaton of a lnear, dynamc model. A transfer functon (TF) relates one nput and one output: x t X s y t system Y s The followng termnology s used: x y nput output forcng
More informationCOMP th April, 2007 Clement Pang
COMP 540 12 th Aprl, 2007 Cleent Pang Boostng Cobnng weak classers Fts an Addtve Model Is essentally Forward Stagewse Addtve Modelng wth Exponental Loss Loss Functons Classcaton: Msclasscaton, Exponental,
More informationFermi-Dirac statistics
UCC/Physcs/MK/EM/October 8, 205 Fer-Drac statstcs Fer-Drac dstrbuton Matter partcles that are eleentary ostly have a type of angular oentu called spn. hese partcles are known to have a agnetc oent whch
More informationOptimal Marketing Strategies for a Customer Data Intermediary. Technical Appendix
Optal Marketng Strateges for a Custoer Data Interedary Techncal Appendx oseph Pancras Unversty of Connectcut School of Busness Marketng Departent 00 Hllsde Road, Unt 04 Storrs, CT 0669-04 oseph.pancras@busness.uconn.edu
More informationSlobodan Lakić. Communicated by R. Van Keer
Serdca Math. J. 21 (1995), 335-344 AN ITERATIVE METHOD FOR THE MATRIX PRINCIPAL n-th ROOT Slobodan Lakć Councated by R. Van Keer In ths paper we gve an teratve ethod to copute the prncpal n-th root and
More informationOn Pfaff s solution of the Pfaff problem
Zur Pfaff scen Lösung des Pfaff scen Probles Mat. Ann. 7 (880) 53-530. On Pfaff s soluton of te Pfaff proble By A. MAYER n Lepzg Translated by D. H. Delpenc Te way tat Pfaff adopted for te ntegraton of
More information1 Definition of Rademacher Complexity
COS 511: Theoretcal Machne Learnng Lecturer: Rob Schapre Lecture #9 Scrbe: Josh Chen March 5, 2013 We ve spent the past few classes provng bounds on the generalzaton error of PAClearnng algorths for the
More informationLecture Notes on Linear Regression
Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume
More informationPHYS 1443 Section 002 Lecture #20
PHYS 1443 Secton 002 Lecture #20 Dr. Jae Condtons for Equlbru & Mechancal Equlbru How to Solve Equlbru Probles? A ew Exaples of Mechancal Equlbru Elastc Propertes of Solds Densty and Specfc Gravty lud
More informationSmall-Sample Equating With Prior Information
Research Report Sall-Saple Equatng Wth Pror Inforaton Sauel A Lvngston Charles Lews June 009 ETS RR-09-5 Lstenng Learnng Leadng Sall-Saple Equatng Wth Pror Inforaton Sauel A Lvngston and Charles Lews ETS,
More informationAdaptive RFID Indoor Positioning Technology for Wheelchair Home Health Care Robot. T. C. Kuo
Adaptve RFID Indoor Postonng Technology for Wheelchar Home Health Care Robot Contents Abstract Introducton RFID Indoor Postonng Method Fuzzy Neural Netor System Expermental Result Concluson -- Abstract
More informationOrthonormal Basis and Radial Basis Functions in Modeling and Identification of Nonlinear Block-Oriented Systems
6 Orthonoral Bass and Radal Bass Functons n odelng and Identfcaton of Nonlnear Block-Orented Systes Rafał Stansławsk and Krzysztof J. Latawec Departent of Electrcal, Control and Coputer Engneerng Opole
More informationRecap: the SVM problem
Machne Learnng 0-70/5-78 78 Fall 0 Advanced topcs n Ma-Margn Margn Learnng Erc Xng Lecture 0 Noveber 0 Erc Xng @ CMU 006-00 Recap: the SVM proble We solve the follong constraned opt proble: a s.t. J 0
More informationSeveral generation methods of multinomial distributed random number Tian Lei 1, a,linxihe 1,b,Zhigang Zhang 1,c
Internatonal Conference on Appled Scence and Engneerng Innovaton (ASEI 205) Several generaton ethods of ultnoal dstrbuted rando nuber Tan Le, a,lnhe,b,zhgang Zhang,c School of Matheatcs and Physcs, USTB,
More informationLecture 3. Camera Models 2 & Camera Calibration. Professor Silvio Savarese Computational Vision and Geometry Lab. 13- Jan- 15.
Lecture Caera Models Caera Calbraton rofessor Slvo Savarese Coputatonal Vson and Geoetry Lab Slvo Savarese Lecture - - Jan- 5 Lecture Caera Models Caera Calbraton Recap of caera odels Caera calbraton proble
More informationCIS526: Machine Learning Lecture 3 (Sept 16, 2003) Linear Regression. Preparation help: Xiaoying Huang. x 1 θ 1 output... θ M x M
CIS56: achne Learnng Lecture 3 (Sept 6, 003) Preparaton help: Xaoyng Huang Lnear Regresson Lnear regresson can be represented by a functonal form: f(; θ) = θ 0 0 +θ + + θ = θ = 0 ote: 0 s a dummy attrbute
More informationFlux-Uncertainty from Aperture Photometry. F. Masci, version 1.0, 10/14/2008
Flux-Uncertanty from Aperture Photometry F. Masc, verson 1.0, 10/14/008 1. Summary We derve a eneral formula for the nose varance n the flux of a source estmated from aperture photometry. The 1-σ uncertanty
More informationStatistical analysis of Accelerated life testing under Weibull distribution based on fuzzy theory
Statstcal analyss of Accelerated lfe testng under Webull dstrbuton based on fuzzy theory Han Xu, Scence & Technology on Relablty & Envronental Engneerng Laboratory, School of Relablty and Syste Engneerng,
More informationANALYSIS OF SIMULATION EXPERIMENTS BY BOOTSTRAP RESAMPLING. Russell C.H. Cheng
Proceedngs of the 00 Wnter Sulaton Conference B. A. Peters, J. S. Sth, D. J. Mederos, and M. W. Rohrer, eds. ANALYSIS OF SIMULATION EXPERIMENTS BY BOOTSTRAP RESAMPLING Russell C.H. Cheng Departent of Matheatcs
More informationInternational Journal of Mathematical Archive-9(3), 2018, Available online through ISSN
Internatonal Journal of Matheatcal Archve-9(3), 208, 20-24 Avalable onlne through www.ja.nfo ISSN 2229 5046 CONSTRUCTION OF BALANCED INCOMPLETE BLOCK DESIGNS T. SHEKAR GOUD, JAGAN MOHAN RAO M AND N.CH.
More informationAn Optimal Bound for Sum of Square Roots of Special Type of Integers
The Sxth Internatonal Syposu on Operatons Research and Its Applcatons ISORA 06 Xnang, Chna, August 8 12, 2006 Copyrght 2006 ORSC & APORC pp. 206 211 An Optal Bound for Su of Square Roots of Specal Type
More informationChapter 2 - The Simple Linear Regression Model S =0. e i is a random error. S β2 β. This is a minimization problem. Solution is a calculus exercise.
Chapter - The Smple Lnear Regresson Model The lnear regresson equaton s: where y + = β + β e for =,..., y and are observable varables e s a random error How can an estmaton rule be constructed for the
More informationChapter 12 Lyes KADEM [Thermodynamics II] 2007
Chapter 2 Lyes KDEM [Therodynacs II] 2007 Gas Mxtures In ths chapter we wll develop ethods for deternng therodynac propertes of a xture n order to apply the frst law to systes nvolvng xtures. Ths wll be
More informationMultilayer Perceptron (MLP)
Multlayer Perceptron (MLP) Seungjn Cho Department of Computer Scence and Engneerng Pohang Unversty of Scence and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjn@postech.ac.kr 1 / 20 Outlne
More informationLinear Regression Analysis: Terminology and Notation
ECON 35* -- Secton : Basc Concepts of Regresson Analyss (Page ) Lnear Regresson Analyss: Termnology and Notaton Consder the generc verson of the smple (two-varable) lnear regresson model. It s represented
More informationEstimation of Reliability in Multicomponent Stress-Strength Based on Generalized Rayleigh Distribution
Journal of Modern Appled Statstcal Methods Volue 13 Issue 1 Artcle 4 5-1-014 Estaton of Relablty n Multcoponent Stress-Strength Based on Generalzed Raylegh Dstrbuton Gadde Srnvasa Rao Unversty of Dodoa,
More informationEstimating the Odometry Error of a Mobile Robot during Navigation
Estatng the Odoetry Error of a Moble Robot durng Navgaton Agostno Martnell and Roland Segwart Autonoous Systes Lab Swss Federal Insttute of Technology Lausanne (EPFL) CH-5 Lausanne, Swtzerland e-al: agostno.artnell,
More informationCOS 511: Theoretical Machine Learning
COS 5: Theoretcal Machne Learnng Lecturer: Rob Schapre Lecture #0 Scrbe: José Sões Ferrera March 06, 203 In the last lecture the concept of Radeacher coplexty was ntroduced, wth the goal of showng that
More informationarxiv:cs.cv/ Jun 2000
Correlaton over Decomposed Sgnals: A Non-Lnear Approach to Fast and Effectve Sequences Comparson Lucano da Fontoura Costa arxv:cs.cv/0006040 28 Jun 2000 Cybernetc Vson Research Group IFSC Unversty of São
More informationOnline Multivariable Identification of a MIMO Distillation Column Using Evolving Takagi-Sugeno Fuzzy Model
Proceedngs of e 6 Chnese Control Conference July 6-3, 7, Zhangae, Hunan, Chna Onlne ultvarable Identfcaton of a IO Dstllaton Colun Usng Evolvng aag-sugeno Fuzzy odel olaze Sananda Borhan, Salahshoor Kar,.
More informationStudy of Classification Methods Based on Three Learning Criteria and Two Basis Functions
Study of Classfcaton Methods Based on hree Learnng Crtera and wo Bass Functons Jae Kyu Suhr Abstract - hs paper nvestgates several classfcaton ethods based on the three learnng crtera and two bass functons.
More informationβ0 + β1xi. You are interested in estimating the unknown parameters β
Ordnary Least Squares (OLS): Smple Lnear Regresson (SLR) Analytcs The SLR Setup Sample Statstcs Ordnary Least Squares (OLS): FOCs and SOCs Back to OLS and Sample Statstcs Predctons (and Resduals) wth OLS
More informationMMA and GCMMA two methods for nonlinear optimization
MMA and GCMMA two methods for nonlnear optmzaton Krster Svanberg Optmzaton and Systems Theory, KTH, Stockholm, Sweden. krlle@math.kth.se Ths note descrbes the algorthms used n the author s 2007 mplementatons
More informationODD HARMONIOUS LABELINGS OF CYCLIC SNAKES
ODD HARMONIOUS LABELINGS OF CYCLIC SNAKES ABSTRACT In [8] Lang and Ba hae shon that the M. E. Abdel-Aal Departent of Matheatcs, Faculty of Scence, Benha Unersty, Benha 58, Egypt In ths paper e generalze
More informationLocal operations on labelled dot patterns
Pattern Reconton Letters 9 (1989) 225 232 May 1989 North-Holland Local operatons on labelled dot patterns Azrel RSENFEL and Jean-Mchel JLN Coputer Vson Laboratory, Center J~r Autoaton Research, Unversty
More informationCentroid Uncertainty Bounds for Interval Type-2 Fuzzy Sets: Forward and Inverse Problems
Centrod Uncertanty Bounds for Interval Type-2 Fuzzy Sets: Forward and Inverse Probles Jerry M. Mendel and Hongwe Wu Sgnal and Iage Processng Insttute Departent of Electrcal Engneerng Unversty of Southern
More informationPattern Recognition and Machine Learning. Artificial Neural networks
Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2016 Lessons 7 14 Dec 2016 Outline Artificial Neural networks Notation...2 1. Introduction...3... 3 The Artificial
More informationNon-linear Canonical Correlation Analysis Using a RBF Network
ESANN' proceedngs - European Smposum on Artfcal Neural Networks Bruges (Belgum), 4-6 Aprl, d-sde publ., ISBN -97--, pp. 57-5 Non-lnear Canoncal Correlaton Analss Usng a RBF Network Sukhbnder Kumar, Elane
More informationFlatness Intelligent Control Based on T-S Cloud Inference Neural Network
ISIJ Internatonal, Vol. 54 (04), No., pp. 608 67 Flatness Intellent Control Based on T-S Cloud Inference Neural Networ Xuln ZHANG,,) * Lan ZHAO, ) Jayn ZANG, ) Honn FAN ) and Lon CHENG ) ) Key Laboratory
More informationRevision: December 13, E Main Suite D Pullman, WA (509) Voice and Fax
.9.1: AC power analyss Reson: Deceber 13, 010 15 E Man Sute D Pullan, WA 99163 (509 334 6306 Voce and Fax Oerew n chapter.9.0, we ntroduced soe basc quanttes relate to delery of power usng snusodal sgnals.
More informationCHALMERS, GÖTEBORGS UNIVERSITET. SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS. COURSE CODES: FFR 135, FIM 720 GU, PhD
CHALMERS, GÖTEBORGS UNIVERSITET SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS COURSE CODES: FFR 35, FIM 72 GU, PhD Tme: Place: Teachers: Allowed materal: Not allowed: January 2, 28, at 8 3 2 3 SB
More informationNeural Networks. Perceptrons and Backpropagation. Silke Bussen-Heyen. 5th of Novemeber Universität Bremen Fachbereich 3. Neural Networks 1 / 17
Neural Networks Perceptrons and Backpropagaton Slke Bussen-Heyen Unverstät Bremen Fachberech 3 5th of Novemeber 2012 Neural Networks 1 / 17 Contents 1 Introducton 2 Unts 3 Network structure 4 Snglelayer
More information, are assumed to fluctuate around zero, with E( i) 0. Now imagine that this overall random effect, , is composed of many independent factors,
Part II. Contnuous Spatal Data Analyss 3. Spatally-Dependent Rando Effects Observe that all regressons n the llustratons above [startng wth expresson (..3) n the Sudan ranfall exaple] have reled on an
More informationarxiv: v2 [math.co] 3 Sep 2017
On the Approxate Asyptotc Statstcal Independence of the Peranents of 0- Matrces arxv:705.0868v2 ath.co 3 Sep 207 Paul Federbush Departent of Matheatcs Unversty of Mchgan Ann Arbor, MI, 4809-043 Septeber
More informationNumerical Heat and Mass Transfer
Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and
More informationIntroducing Entropy Distributions
Graubner, Schdt & Proske: Proceedngs of the 6 th Internatonal Probablstc Workshop, Darstadt 8 Introducng Entropy Dstrbutons Noel van Erp & Peter van Gelder Structural Hydraulc Engneerng and Probablstc
More informationModel of Neurons. CS 416 Artificial Intelligence. Early History of Neural Nets. Cybernetics. McCulloch-Pitts Neurons. Hebbian Modification.
Page 1 Model of Neurons CS 416 Artfcal Intellgence Lecture 18 Neural Nets Chapter 20 Multple nputs/dendrtes (~10,000!!!) Cell body/soma performs computaton Sngle output/axon Computaton s typcally modeled
More informationThe Parity of the Number of Irreducible Factors for Some Pentanomials
The Party of the Nuber of Irreducble Factors for Soe Pentanoals Wolfra Koepf 1, Ryul K 1 Departent of Matheatcs Unversty of Kassel, Kassel, F. R. Gerany Faculty of Matheatcs and Mechancs K Il Sung Unversty,
More informationDepartment of Quantitative Methods & Information Systems. Time Series and Their Components QMIS 320. Chapter 6
Department of Quanttatve Methods & Informaton Systems Tme Seres and Ther Components QMIS 30 Chapter 6 Fall 00 Dr. Mohammad Zanal These sldes were modfed from ther orgnal source for educatonal purpose only.
More informatione i is a random error
Chapter - The Smple Lnear Regresson Model The lnear regresson equaton s: where + β + β e for,..., and are observable varables e s a random error How can an estmaton rule be constructed for the unknown
More informationWeek 9: Multivibrators, MOSFET Amplifiers
ELE 2110A Electronc Crcuts Week 9: Multbrators, MOSFET Aplfers Lecture 09-1 Multbrators Topcs to coer Snle-stae MOSFET aplfers Coon-source aplfer Coon-dran aplfer Coon-ate aplfer eadn Assnent: Chap 14.1-14.5
More informationOn the Eigenspectrum of the Gram Matrix and the Generalisation Error of Kernel PCA (Shawe-Taylor, et al. 2005) Ameet Talwalkar 02/13/07
On the Egenspectru of the Gra Matr and the Generalsaton Error of Kernel PCA Shawe-aylor, et al. 005 Aeet alwalar 0/3/07 Outlne Bacground Motvaton PCA, MDS Isoap Kernel PCA Generalsaton Error of Kernel
More informationtotal If no external forces act, the total linear momentum of the system is conserved. This occurs in collisions and explosions.
Lesson 0: Collsons, Rotatonal netc Energy, Torque, Center o Graty (Sectons 7.8 Last te we used ewton s second law to deelop the pulse-oentu theore. In words, the theore states that the change n lnear oentu
More informationSolving Nonlinear Differential Equations by a Neural Network Method
Solvng Nonlnear Dfferental Equatons by a Neural Network Method Luce P. Aarts and Peter Van der Veer Delft Unversty of Technology, Faculty of Cvlengneerng and Geoscences, Secton of Cvlengneerng Informatcs,
More information