An insect-inspired model for visual binding II: functional analysis and visual attention


Biol Cybern (2017) 111, DOI 10.1007/s z
ORIGINAL ARTICLE
Brandon D. Northcutt · Charles M. Higgins
Received: 2 April 2016 / Accepted: 27 February 2017 / Published online: 16 March 2017
© Springer-Verlag Berlin Heidelberg 2017

Abstract We have developed a neural network model capable of performing visual binding, inspired by neuronal circuitry in the optic glomeruli of flies: a brain area that lies just downstream of the optic lobes, where early visual processing is performed. This visual binding model is able to detect objects in dynamic image sequences and bind together their respective characteristic visual features, such as color, motion, and orientation, by taking advantage of their common temporal fluctuations. Visual binding is represented in the form of an inhibitory weight matrix which learns over time which features originate from a given visual object. In the present work, we show that information represented implicitly in this weight matrix can be used to explicitly count the number of objects present in the visual image, to enumerate their specific visual characteristics, and even to create an enhanced image in which one particular object is emphasized over others, thus implementing a simple form of visual attention. Further, we present a detailed analysis which reveals the function and theoretical limitations of the visual binding network, and in this context describe a novel network learning rule which is optimized for visual binding.

Brandon D. Northcutt, brandon@northcutt.net · Charles M. Higgins, higgins@neurobio.arizona.edu
1 Department of Electrical and Computer Engineering, University of Arizona, E. Speedway Blvd., Tucson, AZ 85721, USA
2 Departments of Neuroscience and Electrical/Computer Eng., University of Arizona, 14 E. 4th St., Tucson, AZ 85721, USA

Keywords Neural networks · Blind source separation · Visual binding · Object perception · Visual attention · Artificial intelligence

1 Introduction

Visual binding is the process of grouping together the visual characteristics of one object while differentiating them from the characteristics of other objects (Malsburg 1999), without regard to the spatial position of the objects. Based on recently identified structures termed optic glomeruli in the brains of flies and bees (Strausfeld et al. 2007; Strausfeld and Okamura 2007; Okamura and Strausfeld 2007; Paulk et al. 2009; Mu et al. 2012), we have developed a neural network model of visual binding (Northcutt et al. 2017), which encodes the visual binding in a pattern of inhibitory weights. This model's genesis was in the anatomical similarity of insect olfactory and optic glomeruli, and it was inspired by the work of Hopfield (1991), who modeled olfactory binding based on temporal fluctuations in the mammalian olfactory bulb, and by the seminal work of Herault and Jutten (1986) on blind source separation (BSS). A pattern of inhibitory weights is learned by the binding network from visual experience based upon common temporal fluctuations of spatially global visual characteristics, with the underlying suppositions being that the visual characteristics of any given object fluctuate together, and differently from those of other objects. An example would be when an automobile passes behind an occluding tree: its color, motion, form, and orientation disappear and then reappear together, thus undergoing a common temporal fluctuation. A detailed description and demonstration of the function of this model is given in a companion paper (Northcutt et al. 2017). In the present work, we describe the essentials of the model, present one representative demonstration of its function in visual binding, and then show how to explicitly interpret the binding output, which the network encodes only implicitly in the pattern of inhibitory weights. We show how to count the number of objects present in the input image sequence and enumerate their characteristics and relative strength. Further, we show how to use this information to create an enhanced image, emphasizing one particular object while de-emphasizing all others, thereby implementing a form of visual attention. Finally, we present a theoretical analysis of the functional limitations and capabilities of this neural network model to provide a better understanding of the network and its potential.

2 The visual binding model

Model simulations were performed in MATLAB (The MathWorks, Natick, MA) using a simulation time step of Δt = 1 ms and image sizes of 50 × 50 pixels. Figure 1 shows a diagram of the full two-stage neural network used. This network begins with a first stage comprised of three separate, fully connected recurrent inhibitory neural networks used for refining the representation of motion, orientation, and color. The outputs of the first stage are fed into a second stage, a fully connected ten-unit inhibitory neural network which performs visual binding. Despite their apparently disparate purposes, all four neural networks comprising the overall model operate using exactly the same temporal evolution and learning rules, which are given below.

Fig. 1 The two-stage network for visual submodality refinement and visual binding. Large circles represent units in the neural network. Unshaded half-circles at connections indicate excitation, and filled half-circles indicate inhibition. During the first phase of training, neurons in the three first-stage networks learn to mutually inhibit one another, thus refining the representation of motion, orientation, and color and resulting in more selective tunings in each submodality. In the second phase of training, with a stimulus comprised of moving bars, the second-stage visual binding network learns the relative strengths of visual object features based on common temporal fluctuations and develops an inhibitory weight matrix to produce outputs that reflect the temporal fluctuations unique to each object.

Each of the four recurrent neural networks used in the model (three in the first stage processing individual visual submodalities, and one in the second stage performing visual binding) learned, by using common temporal fluctuations, an inhibitory weight matrix that indicated the degree of inhibition between neurons within each network. We describe the essentials of the operation and training of these networks below; see Northcutt et al. (2017) for full details.

2.1 Computation of inputs to the network

Despite the fact that fly color vision is based on green, blue, and ultraviolet photoreceptors (Snyder 1979), for ease of human visualization and computer representation, and without loss of generality, our color images had red, green, and blue (RGB) color planes. Since this network is based on a model of the insect brain, an elaborated version of the Hassenstein–Reichardt motion detection algorithm (Hassenstein and Reichardt 1956; Santen and Sperling 1985), the standard model of insect elementary motion computation, was used to compute motion inputs, and difference-of-Gaussian (DoG) filters, the best existing model of early insect orientation processing (Rivera-Alvidrez et al. 2011), were convolved with the image to compute orientation.

Inputs to the neural network were computed as diagrammed in Fig. 2. For achromatic motion and orientation processing, each RGB image was first converted to grayscale by taking the average of the 3 color components.
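The Hassenstein–Reichardt detector named above is, at its core, a delay-and-correlate opponent circuit. The paper used an elaborated MATLAB version whose details are not given here; the following Python sketch (function name, parameter values, and the use of a first-order low-pass filter as the "delay" are our illustrative choices, not the authors' implementation) shows the basic correlator along one image row:

```python
import numpy as np

def hr_motion_1d(frames, dt=0.001, tau=0.02):
    # Basic Hassenstein-Reichardt correlator along one image row.
    # Each output is lp[x]*I[x+1] - I[x]*lp[x+1], where lp is a
    # first-order low-pass filtered (i.e., delayed) copy of the signal;
    # by this convention the mean output is positive for rightward motion.
    n_t, n_x = frames.shape
    alpha = dt / (tau + dt)                   # low-pass filter coefficient
    lp = np.zeros(n_x)
    out = np.zeros((n_t, n_x - 1))
    for t in range(n_t):
        lp += alpha * (frames[t] - lp)        # delayed copy of the row
        out[t] = lp[:-1] * frames[t, 1:] - frames[t, :-1] * lp[1:]
    return out
```

Feeding a drifting sinusoidal pattern through this sketch yields a time-averaged output whose sign follows the direction of motion, which is the property the model relies on.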

The Hassenstein–Reichardt motion algorithm was iterated across rows and columns of the grayscale image, resulting in vertical and horizontal motion images containing signed local motion outputs. These were further subdivided into nonnegative leftward, rightward, downward, and upward motion feature images by selecting components of each motion image with a particular sign. DoG filters selective for three orientations (0°, 60°, and 120°) were convolved with the grayscale images to create three orientation feature images. Finally, the red, green, and blue color planes were used as color feature images. Each of these ten two-dimensional (2D) feature images was spatially summed over both dimensions to produce ten scalar measures of full-image feature content. Each group of scalar signals corresponding to a given visual submodality (motion, orientation, or color) was then normalized to a maximum value of unity, allowing features from different submodalities to be comparable while preserving ratios of features within each group. This normalization divided each group of signals by scalar factors n_m(t), n_o(t), and n_c(t), computed as the maximum value of any signal in the group during the last 2 s, thereby automatically scaling inputs from different submodalities to become comparable with one another and simultaneously adapting the signals to changing visual conditions. The ten normalized scalar values were provided at each simulation time step as inputs i_1(t) through i_10(t) to the neural network of Fig. 1.

Fig. 2 Diagram of wide-field visual input computation from input images. Each input RGB image was converted to grayscale (G) for orientation and motion processing. Image motion was computed by the Hassenstein–Reichardt (HR) elementary motion detection model in the horizontal (I_H) and vertical (I_V) directions and then separated into four feature images I_left, I_right, I_down, and I_up, respectively containing strictly positive leftward (H < 0), rightward (H > 0), downward (V < 0), and upward (V > 0) components. Three orientation-selective DoG filter kernels were convolved with the grayscale image to produce three orientation feature images I_0, I_60, and I_120. The individual red, green, and blue color planes were taken as the final three feature images I_red, I_grn, and I_blu. Each of these ten 2D feature images was then spatially summed to create a scalar value representing wide-field image feature content. The inputs i_1(t) through i_10(t) became input to the neural network model of Fig. 1.

2.2 Network temporal evolution

All inputs to each of the four recurrent networks in the model were high-pass filtered before being processed. This ensured that static features of the visual scene, such as an unchanging background, never became input to the network. As described in detail in a companion paper (Northcutt et al. 2017), the activation of each neuron in the model, which may be positive or negative, represents for non-spiking neurons the graded potential of the neuron relative to its resting potential, and for spiking neurons the average firing rate relative to the spontaneous rate. The activation o_n(t) of neuron n in a recurrent inhibitory neural network may be modeled as

o_n(t) = i′_n(t) − Σ_{k=1}^{N} W_{n,k} o_k(t − τ_i)    (1)

where i′_n(t) represents a first-order temporal high-pass filtered version of the input i_n(t) using time constant τ_HI = 1.0 s, W_{n,k} represents the strength of the inhibitory synaptic pathway from neuron k to neuron n, o_k(t) represents the activation of a different neuron k in the network, and τ_i represents the small but finite delay required to produce inhibition.
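Both the filtered inputs i′_n(t) and the filtered outputs used later in the learning rule are first-order temporal high-pass filters. As a rough sketch (Python rather than the paper's MATLAB; the Euler discretization and function name are our choices), such a filter can be written as a signal minus its low-pass filtered copy:

```python
import numpy as np

def high_pass(x, dt=0.001, tau=1.0):
    # First-order temporal high-pass filter: y = x - lowpass(x).
    # tau = 1.0 s matches tau_HI for network inputs; tau = 0.5 s
    # would match tau_HO for the outputs used in the learning rule.
    alpha = dt / (tau + dt)
    lp = 0.0
    y = np.empty_like(x)
    for t, xt in enumerate(x):
        lp += alpha * (xt - lp)   # running low-pass estimate
        y[t] = xt - lp            # high-pass = input minus low-pass
    return y
```

A constant (static-background) input decays toward zero at the filter output, which is exactly why static scene features never drive the network.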
This set of equations can be expressed in matrix form as

o(t) = i′(t) − W o(t − τ_i)    (2)

where lowercase bold symbols indicate N-element column vectors and uppercase bold symbols represent N × N matrices. Since the biophysical details of optic glomeruli, and thus τ_i, are unknown, but the existence of a finite delay is crucial (as detailed in Northcutt and Higgins 2017), in our simulations we formulate the network temporal dynamics as

o(t) = i′(t) − W o(t − Δt)    (3)

where Δt is the simulation time step. By using τ_i = Δt, we provide the smallest finite inhibition delay possible in our simulation. This equation was used in all simulations. When the time scale of input and output changes is much larger than the simulation time step Δt, (3) may be approximated as

o(t) = i′(t) − W o(t)    (4)

Apart from the use of high-pass filtered inputs, (4) is a common formulation for a fully connected recurrent inhibitory neural network used in BSS (Herault and Jutten 1986; Jutten and Herault 1991; Cichocki et al. 1997). However, (4) represents an idealized system, for which the outputs may be instantaneously computed from the inputs so long as the matrix [I + W]^{−1} exists (I being the identity matrix). This system of equations can be singular but, quite unlike any realistic recurrent neuronal network, cannot be temporally unstable. The use of (3) for temporal dynamics, instead of the more common and seemingly quite reasonable approximation of (4), allows for modeling of the temporal instability of recurrent neuronal networks, which, as shown below, is crucial to understanding their function, while still allowing network temporal evolution to be approximated by (4) when required to make theoretical analysis tractable and relate the present network to previous studies. As detailed in a companion paper (Northcutt et al. 2017), the temporal stability of linear systems such as the one described by (3) has long been well understood (Trentelman et al. 2012), and stability of such a network may be maintained by simply requiring that the magnitude of all eigenvalues of the weight matrix W be less than unity.
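The discrete-time dynamics of (3) and the eigenvalue stability condition can be illustrated with a short sketch (Python rather than the paper's MATLAB; function names are ours). With all eigenvalue magnitudes of W below unity the iteration settles to the fixed point [I + W]^{−1} i′ of (4); with an eigenvalue magnitude above unity it diverges:

```python
import numpy as np

def step(o_prev, i_hp, W):
    # One time step of Eq. (3): o(t) = i'(t) - W o(t - dt),
    # with a constant high-pass-filtered input i_hp for illustration.
    return i_hp - W @ o_prev

def is_stable(W):
    # Stability condition: all eigenvalues of W have magnitude < 1.
    return np.max(np.abs(np.linalg.eigvals(W))) < 1.0
```

Iterating `step` with a mutual-inhibition matrix of weight 0.4 converges to the solution of (4), while weight 1.2 (eigenvalue magnitude 1.2) grows without bound, which is the instability that formulation (3) makes visible.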
2.3 Network learning rule

The Hebbian-style network learning rule used to generate inhibitory weight matrices based on common temporal fluctuations of the inputs was modified from that of Cichocki et al. (1997). This spatially asymmetric multiplicative weight update rule was chosen to support the representation of neuronal activation described in Sect. 2.2, which does not explicitly represent neuronal action potentials, and to leverage existing theoretical work on neural network solutions to BSS problems, in awareness of the fact that the underlying biological basis of this learning is likely to be spike-timing-dependent plasticity (Markram et al. 1997), as discussed in Northcutt et al. (2017). Weight matrices W were initialized to zero, so that the state of the network was o(t) = i′(t) and thus network outputs were initially identical to high-pass filtered inputs. The network learning rule is formulated as

dW_{n,k}/dt = γ μ(t) g(ō_n) f(ō_k)    (5)

where n ≠ k are neuron indices and γ is a scalar learning rate. Diagonal elements of the weight matrix always remained at zero, preventing self-inhibition. Any element of W that became negative after a learning rule update was set to zero, thereby enforcing that network weights were strictly inhibitory. μ(t) is a learning onset function that has a value of zero at the start of training and rises asymptotically to unity with a time constant of 2 s. Our formulation of μ(t), quite unlike the identically named function used by Cichocki et al., is used to gradually turn on the learning rule at the start of training, to avoid a powerful transient in weights based solely on the initial phase of the inputs. ō_n(t) and ō_k(t), respectively, indicate first-order temporal high-pass filtered versions of network outputs o_n and o_k with time constant τ_HO = 0.5 s. Note that the use of high-pass filtered outputs in the learning rule, rather than simply the outputs, makes the network's learning dependent on temporal fluctuations of the inputs: specifically, those fluctuations with temporal frequencies greater than the cutoff frequency of the output high-pass filter.
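Using the filtered outputs ō and the activation functions f and g defined in this section, one discrete-time update step of (5) might be sketched as follows (Python; the function name and Euler discretization are our illustrative choices):

```python
import numpy as np

def competitive_update(W, o_hp, gamma, mu, dt):
    # One Euler step of Eq. (5):
    #   dW[n,k]/dt = gamma * mu(t) * g(o_hp[n]) * f(o_hp[k])
    # with g(x) = tanh(pi*x) applied to rows n and f(x) = x**3 to
    # columns k -- the "competitive" (exchanged) placement.
    g = np.tanh(np.pi * o_hp)      # compressive, indexed by row n
    f = o_hp ** 3                  # expansive, indexed by column k
    W = W + gamma * mu * dt * np.outer(g, f)
    np.fill_diagonal(W, 0.0)       # no self-inhibition
    return np.maximum(W, 0.0)      # weights remain strictly inhibitory
```

Note that the rectification clips any update whose sign would make a weight negative, and the zeroed diagonal enforces the n ≠ k restriction of (5).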
We used network activation functions f(x) = x³ and g(x) = tanh(πx), similar to those used by previous authors (Jutten and Herault 1991; Cichocki et al. 1997), to introduce higher-order statistics of the filtered outputs into the learning rule (Hyvärinen and Oja 1998), although the positions of these two functions with respect to rows n and columns k of the weight matrix are exchanged in (5) as compared to the conventional BSS learning rule. This exchange is crucial to the function of our model, and both the role of these activation functions and the requirement that they be exchanged for our model are addressed in detail in Sect. 5. For reasons that will become apparent later, we will refer to the learning rule with activation functions in their conventional positions as the cooperative learning rule. In contrast, we will refer to the learning rule of (5), with exchanged activation functions, as the competitive learning rule.

2.4 Training of the model

Before training of any network began, a visual input was presented and all linear filters and the input adaptive scaling

algorithm were allowed to reach steady state, to eliminate artifactual startup transients. Training began with each of the three first-stage networks using a learning rate of γ₁ = 5. This training resulted in the first-stage motion, orientation, and color networks, respectively, learning weight matrices M, O, and C. To rapidly give the first-stage networks sufficient experience to refine each visual submodality, an artificial visual stimulus (detailed in Northcutt et al. 2017) was presented that provided simultaneous temporal fluctuations in all colors, orientations, and directions of motion. This stimulus provided near-identical signals to each input of every network, effectively reducing the learning rule of (5) to a purely Hebbian one (Hebb 1949). This input resulted in uniform symmetric (zero-diagonal) weight matrices, indicating uniform lateral inhibition: a well-known technique for sensory refinement (Linster and Smith 1997). However, with this symmetric stimulus, weight matrices never converge to a stable state, but rather increase in value as long as training continues. For this reason, learning for each first-stage network was terminated by setting its respective learning rate γ₁ to zero when the maximum magnitude of any eigenvalue of the network weight matrix reached a value of V_{1,max} = 0.9. This procedure allowed us to rapidly learn strong lateral inhibition in the first stage while avoiding temporally unstable recurrent networks. During this first-stage training period, the learning rate γ₂ for the second stage was set to zero. While not strictly necessary, this isolation of the two stages allowed for rapid training of the first-stage networks and a clear demonstration of second-stage function. After first-stage learning was complete, the second-stage learning rate was set to γ₂ = 0.5, after which visual stimuli composed of multiple objects and intended to demonstrate visual binding were presented, as shown in the next section.
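The two eigenvalue-based stability controls used during training, terminating first-stage learning at V_{1,max} = 0.9 and rescaling the second-stage matrix at V_{2,max} = 0.95 (described in the next paragraph), can be sketched as follows (an illustrative Python fragment with our own function names, not the authors' MATLAB code):

```python
import numpy as np

def max_eig_mag(W):
    # Largest eigenvalue magnitude of a weight matrix.
    return np.max(np.abs(np.linalg.eigvals(W)))

def first_stage_gate(W, gamma1, v1_max=0.9):
    # Terminate first-stage learning (learning rate -> 0) once the
    # largest eigenvalue magnitude reaches v1_max.
    return 0.0 if max_eig_mag(W) >= v1_max else gamma1

def second_stage_rescale(T, v2_max=0.95):
    # If T's largest eigenvalue magnitude V exceeds v2_max, multiply
    # T by v2_max / V, pinning the maximum eigenvalue at v2_max.
    v = max_eig_mag(T)
    return T * (v2_max / v) if v > v2_max else T
```

For a uniform zero-diagonal mutual-inhibition matrix with off-diagonal weight w, the largest eigenvalue grows with w, so both controls engage exactly as the strength of learned inhibition approaches the instability boundary.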
In this second phase of training, the second-stage network learned an inhibitory connection matrix T indicating the binding among visual submodalities. To avoid temporal instability of the second-stage network, if any weight matrix update resulted in T having an eigenvalue with a magnitude V greater than V_{2,max} = 0.95, the weight matrix was multiplied by a scalar factor V_{2,max}/V, thus holding the maximum eigenvalue magnitude at V_{2,max} and maintaining network temporal stability.

3 An example of visual binding

To demonstrate the operation of the visual binding network, an artificial visual stimulus composed of moving 5 × 12-pixel bars on a black background was presented. This stimulus consisted of a red bar that started near the upper left corner of the image and moved down and right at 300°, and a green bar that started near the upper right and moved down and left at 210°. Both bars were oriented with their longest axis orthogonal to the direction of motion and moved at 5 pixels per second. The bars moved through a fixed pattern of multiplicative horizontal sinusoidal shadow with a spatial period of 50 pixels, a mean value of 0.5, and an amplitude of 0.25. As the bars moved independently through this pattern of shadows, the features of each bar fluctuated together, allowing the model to learn their individual characteristics. Bars wrapped around toroidally to re-enter the image, thus putting no time limitation on network training.

Figure 3a and c, respectively, shows the time course of network outputs and the final weight matrix when the network was trained for 15 s with this visual stimulus using the competitive learning rule of (5). For comparison, Fig. 3b and d shows the same data using the cooperative learning rule, in which the activation functions f() and g() of (5) are placed in their conventional position in the BSS literature (Cichocki et al. 1997), with the expansive function f() then applying to row elements, and the compressive function g() to column elements. The results of Fig. 3a and c using the competitive learning rule correspond to desired operation of the visual binding network. The red and green output neurons clearly come to dominate all others, and (neglecting very small weights) the columns of the final weight matrix correctly indicate the characteristics of the individual objects which comprised the stimulus. Reading the weights from the red column, the bar moved to the right, and down to a lesser extent, and got a roughly equal response from the 0° and 120° orientation filters, indicating an approximate orientation of −30° or, equivalently, 150°. From the green column, the bar moved to the left, downward to a lesser extent, and had an orientation of approximately 30° or, equivalently, 210°. In stark contrast, the results shown in Fig. 3b and d reveal clearly that the cooperative learning rule is not applicable to the visual binding model. The reasons for this are detailed in Sect. 5. This single example suffices to illustrate the methods of weight matrix interpretation and network functional analysis presented below, but many further experiments are described and full details given in a companion paper (Northcutt et al. 2017).

4 Extracting object-level information

While we have presented and discussed the second-stage network so far as if it learned the features of a static set of objects and then stabilized, the temporal dynamics of learning are in general far more complicated. After initial training of the first stage, the learning of the second stage never stops. This is

Fig. 3 Outputs and weights of the second-stage network as it trained with a visual stimulus comprised of two bars moving through sinusoidal shadow. A legend to identify each trace in panels a and b is shown at upper left. a Network outputs when using the competitive learning rule of (5), in which the activation functions f() and g() are switched in position relative to conventional learning rules for BSS, and thus emphasize patterns of column over row weights. Note that over the time of training, the red and green outputs come to inhibit all others. b Network outputs when using the conventional cooperative learning rule, which emphasizes patterns of row over column weights. No pattern of outputs is evident apart from nearly uniform inhibition. c Final weight matrix T using the competitive learning rule of (5). Brighter colors indicate stronger inhibition. The strongest weights are in the red and green columns, and clearly indicate the features of each bar. d Final weight matrix T using the cooperative learning rule.

desirable because it allows the network to continuously adapt to dynamic visual scenery. However, should a set of objects in the visual scene persist sufficiently long, the matrix T will converge to a particular set of weights representing the characteristics of the objects, and the number of outputs o(t) that are significantly nonzero will come to correspond to the number of objects.
The convergence of second-stage learning for a given visual stimulus, and thus the validity of what has been learned, may most simply be determined at the current time t by requiring that the sum of all absolute weight matrix

changes over a recent period of time τ_s declines below a threshold S_thr:

∫_{t−τ_s}^{t} ( Σ_{n=1}^{N} Σ_{k=1}^{N} |dT_{n,k}/dt| ) dt < S_thr    (6)

Once the weight matrix has stabilized, the representation of objects in the image developed by the visual binding network is implicit in the activity of the outputs o(t) and the connection matrix T.

4.1 The number of objects and their features

This representation may easily be made more explicit for human interpretation, both making network operation easier to understand and giving practical utility to the model. The weight matrix T may be simplified by normalizing it to its maximum value over all rows and columns and then removing weights less than a given threshold, which may be expressed as

T^norm = T / max_{n,k=1,…,N} (T_{n,k})    (7)

T^simp_{n,k} = { T^norm_{n,k}  if T^norm_{n,k} ≥ v_min ;  0  if T^norm_{n,k} < v_min }    (8)

We used a threshold value of v_min = 0.33 (1/3 of the maximum weight value). An example of this simplified weight matrix, generated from the raw weight matrix shown in Fig. 3c, is shown in Fig. 4a. Here the column-oriented pattern of weights is even more evident, and the relative strength of each feature may be read out directly from the matrix.

In fact, it is possible to make the representation of objects and their features even more explicit. By summing T^simp vertically, a ten-element row vector t^sum results:

t^sum_k = Σ_{n=1}^{N} T^simp_{n,k},  k = 1, …, 10    (9)

Each element t^sum_k represents the sum of all inhibitory weights to other neurons from any neuron k. Using this information, we can create an object matrix O that explicitly describes the number of objects and their features. Every neuron k for which the entry t^sum_k is above a threshold value (for which we used 0.6) is concluded to have accumulated sufficient inhibitory weight to represent an object.
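The weight-matrix post-processing of this section, normalization and thresholding, column sums, and the construction of the object matrix O, can be sketched in a few lines (Python; the function name and toy matrix are our illustrative choices, and the convergence test of (6) is assumed to have already passed):

```python
import numpy as np

def extract_objects(T, v_min=0.33, t_thr=0.6):
    # Normalize T to its maximum entry and zero out small weights
    # (Eqs. 7-8), sum each column (Eq. 9), then build an object matrix
    # with one row per neuron whose column sum exceeds t_thr, setting
    # that row's own (diagonal) entry to unity (Eqs. 10-11).
    T_norm = T / T.max()
    T_simp = np.where(T_norm >= v_min, T_norm, 0.0)
    t_sum = T_simp.sum(axis=0)
    r = np.flatnonzero(t_sum > t_thr)     # winning output indices
    O = T_simp[:, r].T                    # row i <- column k = r[i]
    O[np.arange(len(r)), r] = 1.0         # diagonal entry -> unity
    return O, r
```

With a toy 4-neuron matrix in which only column 2 has accumulated strong inhibitory weight, the sketch reports a single object whose feature strengths are read off that column.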
For each such neuron k, the weights from T^simp for column k may be used to create a new row i of the object matrix O as

O_{i,j} = { T^simp_{j,k}  if j ≠ k ;  1  if j = k },  j = 1, …, 10    (10)

r_i = k    (11)

where the vector r is used to store the index of the output neuron corresponding to each object matrix row i, and the zero diagonal element of T^simp is replaced with unity to represent the fact that this row of weights in the object matrix originated from neuron k (see Sect. 5 for a theoretical justification). An example of the resulting object matrix O is shown in Fig. 4b: the number of rows of O corresponds to the number of objects, the columns to each individual visual feature, and the weights in each column to the learned strength of each characteristic feature. In this case, the object matrix O accurately represents the motion, orientation, and color of the two objects in the example visual stimulus.

Fig. 4 Extraction of object-level information. a Simplified visual binding matrix T^simp, from which object features can clearly be read. b Object matrix O extracted from T^simp using (10). Each row of the object matrix corresponds to a unique object. The associated vector r = [8 9] indicates that the first row of the object matrix corresponds to output 8 (red) and the second row to output 9 (green).

4.2 Elementary visual attention

As can be seen from the example data shown in Fig. 3a, the mutually inhibitory second-stage outputs corresponding

to objects in the visual scene fluctuate over time, with each having greater value in proportion to fluctuations of the characteristics of the represented object, measured in terms of the visual features input to the network. This overall measure of feature strength is closely related to computational models of visual saliency (Itti and Koch 2001), and so the model might reasonably be said to be switching its visual attention (Itti et al. 1998) from one object to another as their relative saliency changes. In fact, visual attention is often modeled as a winner-take-all phenomenon (Lee et al. 1999), which is quite akin to the mutually inhibitory competition of the second-stage network. The neuron that has the largest output value at any given time corresponds to the most salient object. This movement of the attentional spotlight from one object to another can be used to emphasize the currently attended object in the visual image and simultaneously de-emphasize unattended objects.

We may synthesize such an enhanced image by recombining the feature matrices created in Fig. 2 for every input image, using the row of weights from the object matrix O corresponding to the currently winning neuron. At any given time, let network output o_k(t) currently be the largest. Row i such that r_i = k of the object matrix O represents the visual features of this output. To make the raw feature matrices comparable in magnitude to one another, we make use of the input adaptive group normalization factors already computed and described in Sect. 2.1, which may be composed into a single ten-element vector

n(t) = [n_m n_m n_m n_m n_o n_o n_o n_c n_c n_c]

We may then combine this vector of normalization factors with the feature weights associated with this object and the neuron output to create a vector f(t) of weights to be applied to each feature matrix:

f_j(t) = |o_k(t)| O_{i,j} / n_j(t),  j = 1, …, 10    (12)

where the absolute value is used so that large values of the output o_k(t), regardless of sign, result in large contributions to the enhanced image.
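The weighting of (12) and the mask-and-multiply recombination described next can be sketched as follows (Python; function names are ours, and for brevity this sketch applies a single 2D mask to all three color planes rather than the per-plane routing of Fig. 5, in which color features contribute only to their own plane):

```python
import numpy as np

def attention_weights(o_k, O_row, n_groups):
    # Eq. (12): f_j = |o_k| * O[i, j] / n_j, where n_j is the adaptive
    # normalization factor for feature j's group (4 motion, 3
    # orientation, 3 color features, in that order).
    n_m, n_o, n_c = n_groups
    n = np.array([n_m] * 4 + [n_o] * 3 + [n_c] * 3)
    return np.abs(o_k) * O_row / n

def enhanced_image(rgb, feat_imgs, f):
    # Weight the (high-pass filtered, rectified) feature images, sum
    # them into a mask, normalize the mask to its maximum, and
    # multiply it with the input image.
    mask = sum(w * img for w, img in zip(f, feat_imgs))
    mask = mask / mask.max()
    return rgb * mask[..., None]
```

Regions rich in the attended object's features receive mask values near unity and survive the multiplication, while regions belonging to other objects are attenuated.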
The vector f(t) may then be used to create a linear recombination of the feature matrices computed from the current input image, as shown in Fig. 5, resulting in a normalized RGB mask identifying where salient features exist in the current image. By multiplying this mask with the current input image, an enhanced image is created in which the characteristics associated with the object represented by output k are emphasized, while other objects are effectively de-emphasized.

Fig. 5 Computational diagram for creating an enhanced image. Thin lines indicate scalar quantities; thick lines represent matrices. Starting at figure left, each of the ten 2D feature images created in Fig. 2 is first processed through a first-order high-pass filter (HPF) using the same time constant τ_HI as was used for network inputs, and the absolute value of each matrix is taken (ABS) so that both increases and decreases in features are represented in the output. Each of the resulting filtered matrices is then multiplied (Π) by the corresponding scalar weight f_j from (12). These weighted feature images are summed point-by-point (Σ) into a single RGB image, after which this image is normalized (NORM) by a scalar value corresponding to its maximum over all pixels and color planes. This results in an RGB mask that identifies where salient features of the input image exist. This mask is then multiplied point-by-point (Π) with the input RGB image, resulting in a 2D RGB enhanced image that emphasizes the most salient object. Note that motion and orientation feature images contribute equally to the red, green, and blue color planes, but red, green, and blue feature images contribute only to their own color plane.

Figure 6 presents a demonstration of the algorithm to enhance the most salient object in input images, using our example two-bar visual stimulus, the network outputs and final weight matrix of which are shown in Fig. 3a and c. Comparing the input and enhanced images of Fig. 6a and b, taken at 6.5 s into training, the red bar is clearly stronger relative to the green in the enhanced image than in the input. To quantify this effect, the ratio of the maximum red value to the maximum green value in the input image is approximately

unity, but in the enhanced image, the maximum red value is 6.2 times stronger than that of green. In contrast, comparing the input and enhanced images shown in Fig. 6c and d, taken at 7.2 s into training, the green bar is enhanced relative to the red; again, red and green have nearly the same maximum value in the input image, but green is 4.1 times stronger in the enhanced image. Looking more closely at Fig. 6, the effect is not that the characteristics associated with the largest neuron output, representing the most salient object, are greatly increased in the enhanced image; rather, the characteristics of all other objects, even background objects (due to the high-pass filtering of the feature matrices), are weakened. It is notable that this is very reminiscent of attentional effects observed in primate visual cortex (Moran and Desimone 1985): responses to attended objects are not increased by attention, but rather responses to unattended objects are relatively weakened. This algorithm provides a bottom-up method for automatically attending to the most salient object in an image sequence over time without needing any prior information about objects or their features.

Fig. 6 Demonstration of enhancement of the most salient object. Each of the four panels shows a pixel region cropped from the center of a 5 5 image, taken at times when the two moving bars passed near one another. a The input image at 6.5 s after the beginning of second-stage training, a time at which the red bar was the most salient (refer to red and green neuron outputs in Fig. 3a). The role of shadowing in saliency is evident here: the red bar is strongly visible in the image, almost maximally out of shadow, whereas the green bar is almost half shadowed. b The enhanced image resulting from the algorithm of Fig. 5, emphasizing the red bar while the green bar is barely visible. c The input image at 7.2 s into second-stage training, a time at which the green object was most salient (refer to Fig. 3a). d The resulting enhanced image, emphasizing the green bar while the red bar is barely visible.

5 Function and limitations of the model

We have presented one example set of experimental results above and a variety of experiments in a companion paper (Northcutt et al. 2017) practically demonstrating the utility of this two-stage neural network model for sensory refinement, visual binding, internal object-level representation, and even rudimentary visual attention. However, it is impossible to present an exhaustive set of visual stimuli. Under what conditions does this subtly complex neural network model actually perform visual binding, how is this accomplished, and what are its limitations? Intriguingly, the same recurrent neural network subunit is used four times for two distinct purposes in this model. The first-stage networks (refer to Fig. 1) take as their input feature sensors which may have significant overlap in their particular visual submodality, and the output signals show a significant reduction in overlap of the input sensors: for example, one of the first-stage networks takes inputs which are broadly orientation-tuned and refines them using lateral inhibition into outputs with narrower orientation tuning. In the first stage, we have applied this same recurrent network to three specific visual submodalities: color, orientation, and motion. Balanced mutual inhibition is a naturally stable state of a fully connected recurrent inhibitory neural network, reached automatically by the learning rule when given an appropriately balanced set of visual stimuli. In fact, our training of the first stage is specifically designed to elicit lateral inhibition.
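The sharpening effect of balanced mutual inhibition can be illustrated with a toy recurrent network. The Gaussian tuning widths, the inhibition strength, and the simple synchronous iteration below are illustrative assumptions, not the parameters of our model:

```python
import numpy as np

def broad_tuning(theta, prefs, width=40.0):
    """Broadly tuned responses to a bar at orientation theta (degrees)."""
    d = np.abs(((theta - prefs) + 90.0) % 180.0 - 90.0)  # angular distance mod 180
    return np.exp(-(d / width) ** 2)

def refine(i, c=0.3, steps=200):
    """Iterate o <- max(0, i - c * (inhibition from all other units))."""
    o = i.copy()
    for _ in range(steps):
        o = np.maximum(0.0, i - c * (o.sum() - o))  # each unit inhibited by the rest
    return o

prefs = np.array([0.0, 45.0, 90.0, 135.0])  # preferred orientations of four units
i = broad_tuning(30.0, prefs)               # overlapping, broadly tuned inputs
o = refine(i)

# The preferred unit survives while weakly driven units are pushed to zero,
# so the population response is effectively narrowed.
print(i / i.max())
print(o / o.max())
```

After convergence, only the two units nearest the stimulus orientation remain active, and the weaker of those is suppressed relatively more than the winner, which is the narrowing-of-tuning effect described above.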
Lateral inhibition has long been understood to provide refinement of sensory inputs at many levels, its effects having been studied as early as the mid-19th century (Mach 1866), and a number of sensory systems have been proposed to function using this mechanism, including vision (Blakemore and Tobin 1972), audition (Shamma 1985), and olfaction (Linster and Smith 1997). While sensory refinement via lateral inhibition may well be a useful function in many neural systems, does the first stage contribute anything to the operation of the visual binding network? In fact, we have shown experimentally in a companion paper (Northcutt et al. 2017) that strong mutual inhibition in the first stage is essential to the rapid learning demonstrated in Fig. 3a. Without strong first-stage inhibition, second-stage learning proceeded very slowly, and network weights would have been very unlikely ever to reach the stable state shown in Fig. 3c.

While first-stage refinement of selectivity in motion direction, orientation, and color inputs undoubtedly aids the visual binding network in distinguishing objects, the first stage's most essential function for second-stage learning arises from a more subtle effect. The strong inhibition between first-stage outputs creates a tendency for them to become mutually exclusive, much like a weak winner-take-all function. Thus, when an object passes out of shadow and becomes more salient, the first-stage network aids in causing all of its visual attributes to increase simultaneously by inhibiting weaker attributes in each visual submodality. Since the learning rule of (5) develops inhibitory weights based on simultaneous increases or decreases of visual attributes, the synchronization of visual features from the same object caused by first-stage inhibition greatly speeds the learning of second-stage network weights. Although the network structure, temporal evolution, and learning rule of the second-stage network (refer to Fig. 1) are the same as the three first-stage networks, the operation of the second stage is much more obscure. The second-stage network learns by associating input signals which are temporally correlated, just as the first-stage networks do. However, as has been proposed for mammalian cortex (Douglas et al. 1989), it is the variety of dissimilar inputs to this network, not its internal structure, that makes its function differ from the first stage. In the present work, these signals are full-field spatial summations of elementary visual features. Based on the results we have shown, the empirical result of this stage's operation is to detect signals which originate from each independently fluctuating object in the input and to quantify how strongly that object expresses each visual feature, thereby providing a solution to the visual binding problem.

5.1 Visual binding as blind source separation

The problem that the second-stage network must solve is really that of blind source separation (BSS).
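Concretely, the full-field spatial summation just described turns per-pixel feature images into a handful of scalar signals that mix the objects' contributions linearly, which is what reduces binding to BSS. A minimal numerical sketch follows; the feature strengths in M, the object masks, and the brightness-fluctuation signals are all invented for illustration:

```python
import numpy as np

# Two objects on disjoint pixel regions, three feature channels.
# M[f, k] is how strongly object k expresses feature f per unit brightness.
rng = np.random.default_rng(0)
M = np.array([[1.0, 0.2],
              [0.1, 0.9],
              [0.5, 0.4]])                     # 3 features x 2 objects (assumed)
T_steps, H, W = 50, 8, 8
masks = np.zeros((2, H, W))
masks[0, :, :4] = 1.0                          # object 1 occupies the left half
masks[1, :, 4:] = 1.0                          # object 2 occupies the right half
s = rng.uniform(0.2, 1.0, size=(2, T_steps))   # per-object brightness fluctuations

i_sum = np.empty((3, T_steps))
for t in range(T_steps):
    # Per-pixel feature images: each object contributes its feature strengths,
    # scaled by its current fluctuation, on its own pixels (area-normalized).
    imgs = np.einsum('fk,khw,k->fhw', M, masks, s[:, t]) / masks[0].sum()
    i_sum[:, t] = imgs.sum(axis=(1, 2))        # global spatial summation

# The global sums are exactly the linear instantaneous mixture M @ s.
print("global sums match M @ s:", np.allclose(i_sum, M @ s))
```

Summation over pixels commutes with the linear per-pixel contributions, so the scalar inputs to the second stage are a static linear mixture of the objects' hidden fluctuation signals.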
The canonical example of BSS is typically a linear mixture of auditory sources: several spatially separated microphones listening to a mixture of spatially distinct, independent audio sources can be used to recover the individual audio sources. The use of recurrent neural networks for BSS is a very well-studied area, with much of the research stemming from the seminal work of Herault and Jutten (1986). Mathematically, the problem of BSS can be stated as follows. Consider a column vector of N time-varying sources s(t). These sources are combined in an unknown (but static) linear mixture described by an N × N mixing matrix M to provide a vector of N mixed inputs

i(t) = M s(t)   (13)

The challenge of BSS, as it is generally considered, is to recover the hidden sources s(t) given only the mixed observations i(t). The mixing matrix M is recovered implicitly in this process, but is often of little interest: for example, in the auditory case, M would describe the relative microphone locations, which are usually known. Rather, it is most commonly the hidden source signals s(t) that contain the desired information. Given the inputs i(t), the fully connected inhibitory neural network used in this paper (as described by the instantaneous update rule of (4), and neglecting the high-pass filter on the inputs) has been shown to converge (given a proper learning rule, and under certain conditions, addressed below) to a state in which the outputs o(t) become scaled versions of the hidden sources s(t) (Jutten and Herault 1991), thus revealing each source separately. The technique we have developed for processing 2D images into a small number of highly meaningful scalar signals, shown in Fig. 2, allows the problem of visual binding to be reduced to one of BSS. Each of the full-resolution visual feature images is summed into a single time-varying signal which is a linear instantaneous summation of the visual feature contributions of all objects in the scene. In the case of visual binding, quite opposite to that of auditory BSS, we are not particularly interested in recovering the hidden sources s(t).
These sources correspond to temporal fluctuations of the visual feature signals from a given object, caused, for example, by changes in lighting, movement through occlusions, or changes in distance, and are of little practical value. Rather, we are interested in determining how many independent sources exist in the scene and with which visual features they correspond. This implies that, in visual binding, the mixing matrix M is of primary interest, because it reveals the features of objects in the scene and can be used to enumerate them.

5.2 Assumptions of the visual binding model

For any BSS problem, the number of distinct hidden sources and inputs and the details of their mixture determine whether that problem can be solved. After all, solving the BSS problem is not possible in every case: consider separating many auditory sources given microphones all located at the same spatial location! An analysis of the requirements for the isolated second-stage BSS network to function properly is given in the next section. However, due to the vastly greater complexity of the visual binding problem, which takes a sequence of 2D RGB images rather than a time-varying vector of scalar signals as input, the additional assumptions required for our model to perform properly include the following, all of which we assert are reasonable in virtually any practical situation.

1. The visual scene is dynamic. In order for the visual binding model to detect and differentiate objects from the background and other objects, the visual scene must change, as observed in the feature space measured by the inputs to the network. This is in direct contrast to visual BSS models proposed by other authors that are designed for separating mixtures of static images (Jutten and Herault 1991; Guo and Garland 2006). Instead, our model takes as input global spatial summations of local feature detector circuits and relies on changes over time in the scene to produce temporal fluctuations in these signals. Due to the temporal high-pass filters in the input pathway (3) and in the learning rule (5), there would be no input to the network and no changes to the weight matrix of this system in response to static images, and thus no learning. This assumption implies an inherent interest in novelty detection, ignoring static background features and making the network prefer objects which are more active in the visual scene. The assumption of a dynamic visual scene is hardly restrictive; in fact, it is unavoidable in almost every practical situation. Specifically, to be useful for binding, the temporal frequency of fluctuations in the features of an object must not be lower than the cutoff frequency set by the high-pass filter time constant τ_HO used in the learning rule (5), or they will be filtered out. This time constant, of course, can be adjusted to fit the rate of change of the features of an object of interest. Related to this assumption is a subtle dependence on spatial image resolution. Since the inputs to our network are scalar global spatial sums, it is not obvious what 2D spatial resolution of input images is required to support visual binding. In the limit of very tiny images, no matter what the visual features are, they will have little or no temporal fluctuation due to the fact that there are not enough image pixels.
As image size increases, feature fluctuations will become smoother, and thus the quality of features provided to the network will increase. However, this quality will saturate as the image resolution becomes sufficient to clearly visualize objects in a given scene. The optimal spatial resolution will be highly dependent on the scene in which objects are to be observed.

2. Visual features of an object are correlated. For the visual binding network to function properly, visual features originating from the same object, which then become inputs to the network, must have temporal correlation with each other. This requirement simply means that visual features of any given object measured by the network (in our case, motion, color, and orientation) must vary together over time, as they would in any natural situation. A wide variety of common scene changes satisfy this assumption. Occlusion of an object by another visual scene element causes a rapid decrease in all local visual features of the occluded object, and therefore a reduction in the wide-field summation of these signals; there is a corresponding increase if the object then reappears. Similarly, when an object passes in and out of shadows, or becomes nearer to or more distant from a light source, the visual measures of the object will vary together in proportion to the resulting brightness fluctuations. Also, as an object draws nearer to the camera, the object's size increases in the visual image. Since a closer object stimulates a greater number of local feature detection circuits, its wide-field signals grow in inverse proportion to its distance.

3. Fluctuations from different objects are distinct. Herault and Jutten's original work on BSS (Herault and Jutten 1986) inspired many related methods: independent component analysis (Comon 1994), information maximization (Bell and Sejnowski 1995), and mutual information minimization (Yang and Si 1997). The majority of these methods require statistical independence for sources to be separated.
However, we observe that while statistically independent sources can be guaranteed to separate, this assumption is not strictly necessary. For example, as we have shown in Fig. 3, two sinusoidal hidden sources with the same frequency but different phase can be separated despite not being statistically independent; these sources are nevertheless sufficiently distinct to be separable. Implicit in the requirement of statistical independence is that visual feature fluctuations are stochastic. This will generally be true in practical situations, due to the fact that these fluctuations derive from objects moving through shadows, past occlusions, or moving unpredictably with respect to the camera. The assumption that fluctuations from different objects will be statistically independent is also reasonable in practical cases: visual features from different objects will be independent simply because they result from independent physical processes in the world. However, the network does not simply fail if this assumption is not satisfied. If the features of two objects are not statistically independent, nor even distinct, the visual binding network will bind these signals together to represent a single object, which in perspective is not an unreasonable conclusion given a set of visual features that are highly correlated with one another.

4. Object features are persistent. If a given visual feature (the color of an object, for example) is to be used in binding, an object must retain that feature over a period of time sufficient for the network to learn a pattern of weights based upon it. While the network relies on temporal fluctuations of these features for learning, we must assume that at least some subset of an object's features (chosen from the color, motion, and orientation submodalities for the present network) are relatively persistent. For example, the motion feature inputs from a car driving rapidly in a circle, and thus constantly changing direction, would not be useful in solving the visual binding problem,

although its color would likely remain constant as it turned, and would be useful in binding. The upper limit on the speed at which visual features of an object may change is set by the learning rate γ in the learning rule of (5). Features of a given object that appear and disappear so quickly that the integration of their effect over time never creates a significant change in the weight matrix T will not affect the output. Within the bounds of numerical stability, the learning rate can be adjusted to fit the time course of feature changes for a given visual stimulus.

5. Measured visual features must be diverse. In order to separate out a variety of different objects, the visual features measured by the model must be diverse and span a range of visual submodalities. Preferably, the feature inputs should span the full space of interest in each visual submodality. An example of this diversity is the feature set we have shown in Fig. 2, fully spanning motion, color, and orientation. If the visual features measured are not sufficient to separate the objects of interest (for an extreme example, imagine that all ten feature detectors were only sensitive to the amount of red color in the image), in general, visual binding will not be possible.

5.3 Analysis of a two-neuron BSS network

Given the assumptions of the previous section, we can reduce the visual binding problem solved by the model as a whole to one of blind source separation in the second stage. The network output and inhibitory weight temporal dynamics of the second-stage network that we have used for visual binding in the model are surprisingly complex, once equipped with the time-stepping neuron update rule of (3) and the learning rule of (5), both of which are potentially unstable. Outside the context of visual binding, the generalized analysis of an N-neuron BSS network with N(N-1) inhibitory weights is highly formalized and unrevealing, and has already been well addressed in the literature (Jutten and Herault 1991; Sorouchyari 1991; Joho et al. 2000).
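For a flavor of how such recurrent BSS networks adapt, the following sketch uses a Herault-Jutten-style anti-Hebbian rule with the classic odd nonlinearities f(x) = x^3 and g(x) = x. This is not the learning rule (5) of our model (which, for example, high-pass filters its terms), and the sources, mixing values, learning rate, and weight clipping here are all illustrative:

```python
import numpy as np

# Two hidden sources: a sinusoid and a square wave at unrelated frequencies.
t = np.arange(0.0, 60.0, 0.01)
s = np.vstack([np.sin(2 * np.pi * 1.0 * t),
               np.sign(np.sin(2 * np.pi * 0.31 * t))])
M = np.array([[1.0, 0.6],
              [0.4, 1.0]])          # assumed (nonsingular) mixing matrix
i = M @ s                           # mixed observations

T = np.zeros((2, 2))                # inhibitory weights; no self-inhibition
gamma = 0.001                       # illustrative learning rate
for k in range(t.size):
    # Instantaneous network solution of o = i - T o at this time step.
    o = np.linalg.solve(np.eye(2) + T, i[:, k])
    # Anti-Hebbian update of the off-diagonal weights; the clip simply
    # keeps (I + T) well-conditioned in this toy simulation.
    T[0, 1] = np.clip(T[0, 1] + gamma * o[0] ** 3 * o[1], -0.9, 0.9)
    T[1, 0] = np.clip(T[1, 0] + gamma * o[1] ** 3 * o[0], -0.9, 0.9)

print(T)   # off-diagonals adapt toward M12/M22 and M21/M11 under favorable conditions
```

The rule pushes the cross-moment E[f(o1) g(o2)] toward zero, which for sufficiently distinct sources drives the off-diagonal weights toward the separating values derived in the next section.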
However, the second stage is sufficiently complex that even a two-neuron network with only two inhibitory weights can generate extremely unexpected results, and an analysis of this minimal network better serves to clarify how the second stage of our model works, and how our novel changes to the network update and learning rule affect the model's performance. For this reason, we base our theoretical analysis of the second stage around a two-neuron network for which all solutions can be simply enumerated, and generalize our results to the full network wherever possible. We follow with demonstrations of the function of this two-neuron network that can be clearly understood in terms of the analysis shown, and make conclusions about the full ten-neuron second-stage network thereafter. Our analysis begins by enumerating the possible correct solutions to the BSS problem for a two-neuron network without any learning, using the instantaneous network update rule of (4), which for very small simulation time steps approximates the far less analytically tractable time-stepping update rule actually used in the model. For simplicity, in this section we assume that all sources (and thus inputs) have temporal frequencies sufficiently higher than the cutoff frequency of the input high-pass filter, so that i'(t) = i(t).

5.3.1 The typical case

The typical case in BSS is one in which the number of distinct, nonzero hidden sources is the same as the number of network inputs and outputs. In this case, as will be shown, there is no single correct solution but rather a family of closely related solutions.
For the two-neuron network, the set of scalar equations described by (13) is

i_1(t) = M_{1,1} s_1(t) + M_{1,2} s_2(t)   (14)
i_2(t) = M_{2,1} s_1(t) + M_{2,2} s_2(t)   (15)

The temporal evolution equation of (4) becomes

o_1(t) = i_1(t) - T_{1,2} o_2(t)   (16)
o_2(t) = i_2(t) - T_{2,1} o_1(t)   (17)

into which we can substitute (14) and (15) to get

o_1(t) = M_{1,1} s_1(t) + M_{1,2} s_2(t) - T_{1,2} o_2(t)   (18)
o_2(t) = M_{2,1} s_1(t) + M_{2,2} s_2(t) - T_{2,1} o_1(t)   (19)

For the blind source separation problem to be solved, the outputs o(t) must become equal to scaled versions of the hidden sources s(t) (Jutten and Herault 1991). As long as the mixing matrix M is nonsingular, there are exactly two possible solutions that can be learned in the inhibitory connection matrix T. As previous authors have shown (and as may easily be derived from the equations above), the first situation in which correct source separation occurs is when source indices are not permuted with respect to the outputs

o_1(t) = M_{1,1} s_1(t)   (20)
o_2(t) = M_{2,2} s_2(t)   (21)

which results in the connection matrix

T = [ 0                 M_{1,2}/M_{2,2} ]
    [ M_{2,1}/M_{1,1}   0               ]   (22)
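The non-permuted solution of (20)-(22) can be checked numerically. In this sketch the mixing values are arbitrary nonsingular examples, and the fixed point of (16)-(17) is obtained by a direct linear solve rather than by temporal iteration:

```python
import numpy as np

# Check that with T12 = M12/M22 and T21 = M21/M11 (Eq. 22), the fixed point
# of o1 = i1 - T12*o2, o2 = i2 - T21*o1 is o = (M11*s1, M22*s2).
M = np.array([[2.0, 0.5],
              [0.3, 1.5]])                        # arbitrary nonsingular mixing
T12 = M[0, 1] / M[1, 1]                           # Eq. (22), upper off-diagonal
T21 = M[1, 0] / M[0, 0]                           # Eq. (22), lower off-diagonal

rng = np.random.default_rng(1)
s = rng.standard_normal((2, 100))                 # two hidden source signals
i = M @ s                                         # mixed inputs, Eqs. (14)-(15)

# The fixed point of (16)-(17) satisfies [[1, T12], [T21, 1]] @ o = i.
A = np.array([[1.0, T12],
              [T21, 1.0]])
o = np.linalg.solve(A, i)

print("o1 == M11*s1:", np.allclose(o[0], M[0, 0] * s[0]))   # Eq. (20)
print("o2 == M22*s2:", np.allclose(o[1], M[1, 1] * s[1]))   # Eq. (21)
```

Substituting o = (M11 s1, M22 s2) into A o reproduces M s term by term, so the outputs are exactly the scaled hidden sources, as (20) and (21) require.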


More information

Slide04 (supplemental) Haykin Chapter 4 (both 2nd and 3rd ed): Multi-Layer Perceptrons

Slide04 (supplemental) Haykin Chapter 4 (both 2nd and 3rd ed): Multi-Layer Perceptrons Slide04 supplemental) Haykin Chapter 4 bth 2nd and 3rd ed): Multi-Layer Perceptrns CPSC 636-600 Instructr: Ynsuck Che Heuristic fr Making Backprp Perfrm Better 1. Sequential vs. batch update: fr large

More information

A Polarimetric Survey of Radio Frequency Interference in C- and X-Bands in the Continental United States using WindSat Radiometry

A Polarimetric Survey of Radio Frequency Interference in C- and X-Bands in the Continental United States using WindSat Radiometry A Plarimetric Survey f Radi Frequency Interference in C- and X-Bands in the Cntinental United States using WindSat Radimetry Steven W. Ellingsn Octber, Cntents Intrductin WindSat Methdlgy Analysis f RFI

More information

Technical Bulletin. Generation Interconnection Procedures. Revisions to Cluster 4, Phase 1 Study Methodology

Technical Bulletin. Generation Interconnection Procedures. Revisions to Cluster 4, Phase 1 Study Methodology Technical Bulletin Generatin Intercnnectin Prcedures Revisins t Cluster 4, Phase 1 Study Methdlgy Release Date: Octber 20, 2011 (Finalizatin f the Draft Technical Bulletin released n September 19, 2011)

More information

Sections 15.1 to 15.12, 16.1 and 16.2 of the textbook (Robbins-Miller) cover the materials required for this topic.

Sections 15.1 to 15.12, 16.1 and 16.2 of the textbook (Robbins-Miller) cover the materials required for this topic. Tpic : AC Fundamentals, Sinusidal Wavefrm, and Phasrs Sectins 5. t 5., 6. and 6. f the textbk (Rbbins-Miller) cver the materials required fr this tpic.. Wavefrms in electrical systems are current r vltage

More information

A Matrix Representation of Panel Data

A Matrix Representation of Panel Data web Extensin 6 Appendix 6.A A Matrix Representatin f Panel Data Panel data mdels cme in tw brad varieties, distinct intercept DGPs and errr cmpnent DGPs. his appendix presents matrix algebra representatins

More information

Chapter 16. Capacitance. Capacitance, cont. Parallel-Plate Capacitor, Example 1/20/2011. Electric Energy and Capacitance

Chapter 16. Capacitance. Capacitance, cont. Parallel-Plate Capacitor, Example 1/20/2011. Electric Energy and Capacitance summary C = ε A / d = πε L / ln( b / a ) ab C = 4πε 4πε a b a b >> a Chapter 16 Electric Energy and Capacitance Capacitance Q=CV Parallel plates, caxial cables, Earth Series and parallel 1 1 1 = + +..

More information

Kinetic Model Completeness

Kinetic Model Completeness 5.68J/10.652J Spring 2003 Lecture Ntes Tuesday April 15, 2003 Kinetic Mdel Cmpleteness We say a chemical kinetic mdel is cmplete fr a particular reactin cnditin when it cntains all the species and reactins

More information

8 th Grade Math: Pre-Algebra

8 th Grade Math: Pre-Algebra Hardin Cunty Middle Schl (2013-2014) 1 8 th Grade Math: Pre-Algebra Curse Descriptin The purpse f this curse is t enhance student understanding, participatin, and real-life applicatin f middle-schl mathematics

More information

This section is primarily focused on tools to aid us in finding roots/zeros/ -intercepts of polynomials. Essentially, our focus turns to solving.

This section is primarily focused on tools to aid us in finding roots/zeros/ -intercepts of polynomials. Essentially, our focus turns to solving. Sectin 3.2: Many f yu WILL need t watch the crrespnding vides fr this sectin n MyOpenMath! This sectin is primarily fcused n tls t aid us in finding rts/zers/ -intercepts f plynmials. Essentially, ur fcus

More information

Lab 11 LRC Circuits, Damped Forced Harmonic Motion

Lab 11 LRC Circuits, Damped Forced Harmonic Motion Physics 6 ab ab 11 ircuits, Damped Frced Harmnic Mtin What Yu Need T Knw: The Physics OK this is basically a recap f what yu ve dne s far with circuits and circuits. Nw we get t put everything tgether

More information

1996 Engineering Systems Design and Analysis Conference, Montpellier, France, July 1-4, 1996, Vol. 7, pp

1996 Engineering Systems Design and Analysis Conference, Montpellier, France, July 1-4, 1996, Vol. 7, pp THE POWER AND LIMIT OF NEURAL NETWORKS T. Y. Lin Department f Mathematics and Cmputer Science San Jse State University San Jse, Califrnia 959-003 tylin@cs.ssu.edu and Bereley Initiative in Sft Cmputing*

More information

CHAPTER 4 DIAGNOSTICS FOR INFLUENTIAL OBSERVATIONS

CHAPTER 4 DIAGNOSTICS FOR INFLUENTIAL OBSERVATIONS CHAPTER 4 DIAGNOSTICS FOR INFLUENTIAL OBSERVATIONS 1 Influential bservatins are bservatins whse presence in the data can have a distrting effect n the parameter estimates and pssibly the entire analysis,

More information

Pipetting 101 Developed by BSU CityLab

Pipetting 101 Developed by BSU CityLab Discver the Micrbes Within: The Wlbachia Prject Pipetting 101 Develped by BSU CityLab Clr Cmparisns Pipetting Exercise #1 STUDENT OBJECTIVES Students will be able t: Chse the crrect size micrpipette fr

More information

Computational modeling techniques

Computational modeling techniques Cmputatinal mdeling techniques Lecture 11: Mdeling with systems f ODEs In Petre Department f IT, Ab Akademi http://www.users.ab.fi/ipetre/cmpmd/ Mdeling with differential equatins Mdeling strategy Fcus

More information

NUMBERS, MATHEMATICS AND EQUATIONS

NUMBERS, MATHEMATICS AND EQUATIONS AUSTRALIAN CURRICULUM PHYSICS GETTING STARTED WITH PHYSICS NUMBERS, MATHEMATICS AND EQUATIONS An integral part t the understanding f ur physical wrld is the use f mathematical mdels which can be used t

More information

Checking the resolved resonance region in EXFOR database

Checking the resolved resonance region in EXFOR database Checking the reslved resnance regin in EXFOR database Gttfried Bertn Sciété de Calcul Mathématique (SCM) Oscar Cabells OECD/NEA Data Bank JEFF Meetings - Sessin JEFF Experiments Nvember 0-4, 017 Bulgne-Billancurt,

More information

I. Analytical Potential and Field of a Uniform Rod. V E d. The definition of electric potential difference is

I. Analytical Potential and Field of a Uniform Rod. V E d. The definition of electric potential difference is Length L>>a,b,c Phys 232 Lab 4 Ch 17 Electric Ptential Difference Materials: whitebards & pens, cmputers with VPythn, pwer supply & cables, multimeter, crkbard, thumbtacks, individual prbes and jined prbes,

More information

Admin. MDP Search Trees. Optimal Quantities. Reinforcement Learning

Admin. MDP Search Trees. Optimal Quantities. Reinforcement Learning Admin Reinfrcement Learning Cntent adapted frm Berkeley CS188 MDP Search Trees Each MDP state prjects an expectimax-like search tree Optimal Quantities The value (utility) f a state s: V*(s) = expected

More information

You need to be able to define the following terms and answer basic questions about them:

You need to be able to define the following terms and answer basic questions about them: CS440/ECE448 Sectin Q Fall 2017 Midterm Review Yu need t be able t define the fllwing terms and answer basic questins abut them: Intr t AI, agents and envirnments Pssible definitins f AI, prs and cns f

More information

Emphases in Common Core Standards for Mathematical Content Kindergarten High School

Emphases in Common Core Standards for Mathematical Content Kindergarten High School Emphases in Cmmn Cre Standards fr Mathematical Cntent Kindergarten High Schl Cntent Emphases by Cluster March 12, 2012 Describes cntent emphases in the standards at the cluster level fr each grade. These

More information

NUROP CONGRESS PAPER CHINESE PINYIN TO CHINESE CHARACTER CONVERSION

NUROP CONGRESS PAPER CHINESE PINYIN TO CHINESE CHARACTER CONVERSION NUROP Chinese Pinyin T Chinese Character Cnversin NUROP CONGRESS PAPER CHINESE PINYIN TO CHINESE CHARACTER CONVERSION CHIA LI SHI 1 AND LUA KIM TENG 2 Schl f Cmputing, Natinal University f Singapre 3 Science

More information

Multiple Source Multiple. using Network Coding

Multiple Source Multiple. using Network Coding Multiple Surce Multiple Destinatin Tplgy Inference using Netwrk Cding Pegah Sattari EECS, UC Irvine Jint wrk with Athina Markpulu, at UCI, Christina Fraguli, at EPFL, Lausanne Outline Netwrk Tmgraphy Gal,

More information

Department of Electrical Engineering, University of Waterloo. Introduction

Department of Electrical Engineering, University of Waterloo. Introduction Sectin 4: Sequential Circuits Majr Tpics Types f sequential circuits Flip-flps Analysis f clcked sequential circuits Mre and Mealy machines Design f clcked sequential circuits State transitin design methd

More information

Bootstrap Method > # Purpose: understand how bootstrap method works > obs=c(11.96, 5.03, 67.40, 16.07, 31.50, 7.73, 11.10, 22.38) > n=length(obs) >

Bootstrap Method > # Purpose: understand how bootstrap method works > obs=c(11.96, 5.03, 67.40, 16.07, 31.50, 7.73, 11.10, 22.38) > n=length(obs) > Btstrap Methd > # Purpse: understand hw btstrap methd wrks > bs=c(11.96, 5.03, 67.40, 16.07, 31.50, 7.73, 11.10, 22.38) > n=length(bs) > mean(bs) [1] 21.64625 > # estimate f lambda > lambda = 1/mean(bs);

More information

Methods for Determination of Mean Speckle Size in Simulated Speckle Pattern

Methods for Determination of Mean Speckle Size in Simulated Speckle Pattern 0.478/msr-04-004 MEASUREMENT SCENCE REVEW, Vlume 4, N. 3, 04 Methds fr Determinatin f Mean Speckle Size in Simulated Speckle Pattern. Hamarvá, P. Šmíd, P. Hrváth, M. Hrabvský nstitute f Physics f the Academy

More information

Relationship Between Amplifier Settling Time and Pole-Zero Placements for Second-Order Systems *

Relationship Between Amplifier Settling Time and Pole-Zero Placements for Second-Order Systems * Relatinship Between Amplifier Settling Time and Ple-Zer Placements fr Secnd-Order Systems * Mark E. Schlarmann and Randall L. Geiger Iwa State University Electrical and Cmputer Engineering Department Ames,

More information

Admissibility Conditions and Asymptotic Behavior of Strongly Regular Graphs

Admissibility Conditions and Asymptotic Behavior of Strongly Regular Graphs Admissibility Cnditins and Asympttic Behavir f Strngly Regular Graphs VASCO MOÇO MANO Department f Mathematics University f Prt Oprt PORTUGAL vascmcman@gmailcm LUÍS ANTÓNIO DE ALMEIDA VIEIRA Department

More information

Flipping Physics Lecture Notes: Simple Harmonic Motion Introduction via a Horizontal Mass-Spring System

Flipping Physics Lecture Notes: Simple Harmonic Motion Introduction via a Horizontal Mass-Spring System Flipping Physics Lecture Ntes: Simple Harmnic Mtin Intrductin via a Hrizntal Mass-Spring System A Hrizntal Mass-Spring System is where a mass is attached t a spring, riented hrizntally, and then placed

More information

Chemistry 20 Lesson 11 Electronegativity, Polarity and Shapes

Chemistry 20 Lesson 11 Electronegativity, Polarity and Shapes Chemistry 20 Lessn 11 Electrnegativity, Plarity and Shapes In ur previus wrk we learned why atms frm cvalent bnds and hw t draw the resulting rganizatin f atms. In this lessn we will learn (a) hw the cmbinatin

More information

Lecture 13: Electrochemical Equilibria

Lecture 13: Electrochemical Equilibria 3.012 Fundamentals f Materials Science Fall 2005 Lecture 13: 10.21.05 Electrchemical Equilibria Tday: LAST TIME...2 An example calculatin...3 THE ELECTROCHEMICAL POTENTIAL...4 Electrstatic energy cntributins

More information

Plan o o. I(t) Divide problem into sub-problems Modify schematic and coordinate system (if needed) Write general equations

Plan o o. I(t) Divide problem into sub-problems Modify schematic and coordinate system (if needed) Write general equations STAPLE Physics 201 Name Final Exam May 14, 2013 This is a clsed bk examinatin but during the exam yu may refer t a 5 x7 nte card with wrds f wisdm yu have written n it. There is extra scratch paper available.

More information

Perfrmance f Sensitizing Rules n Shewhart Cntrl Charts with Autcrrelated Data Key Wrds: Autregressive, Mving Average, Runs Tests, Shewhart Cntrl Chart

Perfrmance f Sensitizing Rules n Shewhart Cntrl Charts with Autcrrelated Data Key Wrds: Autregressive, Mving Average, Runs Tests, Shewhart Cntrl Chart Perfrmance f Sensitizing Rules n Shewhart Cntrl Charts with Autcrrelated Data Sandy D. Balkin Dennis K. J. Lin y Pennsylvania State University, University Park, PA 16802 Sandy Balkin is a graduate student

More information

Analysis on the Stability of Reservoir Soil Slope Based on Fuzzy Artificial Neural Network

Analysis on the Stability of Reservoir Soil Slope Based on Fuzzy Artificial Neural Network Research Jurnal f Applied Sciences, Engineering and Technlgy 5(2): 465-469, 2013 ISSN: 2040-7459; E-ISSN: 2040-7467 Maxwell Scientific Organizatin, 2013 Submitted: May 08, 2012 Accepted: May 29, 2012 Published:

More information

Activity Guide Loops and Random Numbers

Activity Guide Loops and Random Numbers Unit 3 Lessn 7 Name(s) Perid Date Activity Guide Lps and Randm Numbers CS Cntent Lps are a relatively straightfrward idea in prgramming - yu want a certain chunk f cde t run repeatedly - but it takes a

More information

Lyapunov Stability Stability of Equilibrium Points

Lyapunov Stability Stability of Equilibrium Points Lyapunv Stability Stability f Equilibrium Pints 1. Stability f Equilibrium Pints - Definitins In this sectin we cnsider n-th rder nnlinear time varying cntinuus time (C) systems f the frm x = f ( t, x),

More information

The standards are taught in the following sequence.

The standards are taught in the following sequence. B L U E V A L L E Y D I S T R I C T C U R R I C U L U M MATHEMATICS Third Grade In grade 3, instructinal time shuld fcus n fur critical areas: (1) develping understanding f multiplicatin and divisin and

More information

ENG2410 Digital Design Sequential Circuits: Part A

ENG2410 Digital Design Sequential Circuits: Part A ENG2410 Digital Design Sequential Circuits: Part A Fall 2017 S. Areibi Schl f Engineering University f Guelph Week #6 Tpics Sequential Circuit Definitins Latches Flip-Flps Delays in Sequential Circuits

More information

ENGI 4430 Parametric Vector Functions Page 2-01

ENGI 4430 Parametric Vector Functions Page 2-01 ENGI 4430 Parametric Vectr Functins Page -01. Parametric Vectr Functins (cntinued) Any nn-zer vectr r can be decmpsed int its magnitude r and its directin: r rrˆ, where r r 0 Tangent Vectr: dx dy dz dr

More information

(1.1) V which contains charges. If a charge density ρ, is defined as the limit of the ratio of the charge contained. 0, and if a force density f

(1.1) V which contains charges. If a charge density ρ, is defined as the limit of the ratio of the charge contained. 0, and if a force density f 1.0 Review f Electrmagnetic Field Thery Selected aspects f electrmagnetic thery are reviewed in this sectin, with emphasis n cncepts which are useful in understanding magnet design. Detailed, rigrus treatments

More information

SUPPLEMENTARY MATERIAL GaGa: a simple and flexible hierarchical model for microarray data analysis

SUPPLEMENTARY MATERIAL GaGa: a simple and flexible hierarchical model for microarray data analysis SUPPLEMENTARY MATERIAL GaGa: a simple and flexible hierarchical mdel fr micrarray data analysis David Rssell Department f Bistatistics M.D. Andersn Cancer Center, Hustn, TX 77030, USA rsselldavid@gmail.cm

More information

General Chemistry II, Unit I: Study Guide (part I)

General Chemistry II, Unit I: Study Guide (part I) 1 General Chemistry II, Unit I: Study Guide (part I) CDS Chapter 14: Physical Prperties f Gases Observatin 1: Pressure- Vlume Measurements n Gases The spring f air is measured as pressure, defined as the

More information

37 Maxwell s Equations

37 Maxwell s Equations 37 Maxwell s quatins In this chapter, the plan is t summarize much f what we knw abut electricity and magnetism in a manner similar t the way in which James Clerk Maxwell summarized what was knwn abut

More information

22.54 Neutron Interactions and Applications (Spring 2004) Chapter 11 (3/11/04) Neutron Diffusion

22.54 Neutron Interactions and Applications (Spring 2004) Chapter 11 (3/11/04) Neutron Diffusion .54 Neutrn Interactins and Applicatins (Spring 004) Chapter (3//04) Neutrn Diffusin References -- J. R. Lamarsh, Intrductin t Nuclear Reactr Thery (Addisn-Wesley, Reading, 966) T study neutrn diffusin

More information

CAUSAL INFERENCE. Technical Track Session I. Phillippe Leite. The World Bank

CAUSAL INFERENCE. Technical Track Session I. Phillippe Leite. The World Bank CAUSAL INFERENCE Technical Track Sessin I Phillippe Leite The Wrld Bank These slides were develped by Christel Vermeersch and mdified by Phillippe Leite fr the purpse f this wrkshp Plicy questins are causal

More information

Lecture 23: Lattice Models of Materials; Modeling Polymer Solutions

Lecture 23: Lattice Models of Materials; Modeling Polymer Solutions Lecture 23: 12.05.05 Lattice Mdels f Materials; Mdeling Plymer Slutins Tday: LAST TIME...2 The Bltzmann Factr and Partitin Functin: systems at cnstant temperature...2 A better mdel: The Debye slid...3

More information

Thermodynamics Partial Outline of Topics

Thermodynamics Partial Outline of Topics Thermdynamics Partial Outline f Tpics I. The secnd law f thermdynamics addresses the issue f spntaneity and invlves a functin called entrpy (S): If a prcess is spntaneus, then Suniverse > 0 (2 nd Law!)

More information

The Kullback-Leibler Kernel as a Framework for Discriminant and Localized Representations for Visual Recognition

The Kullback-Leibler Kernel as a Framework for Discriminant and Localized Representations for Visual Recognition The Kullback-Leibler Kernel as a Framewrk fr Discriminant and Lcalized Representatins fr Visual Recgnitin Nun Vascncels Purdy H Pedr Mren ECE Department University f Califrnia, San Dieg HP Labs Cambridge

More information

Biplots in Practice MICHAEL GREENACRE. Professor of Statistics at the Pompeu Fabra University. Chapter 13 Offprint

Biplots in Practice MICHAEL GREENACRE. Professor of Statistics at the Pompeu Fabra University. Chapter 13 Offprint Biplts in Practice MICHAEL GREENACRE Prfessr f Statistics at the Pmpeu Fabra University Chapter 13 Offprint CASE STUDY BIOMEDICINE Cmparing Cancer Types Accrding t Gene Epressin Arrays First published:

More information

Lecture 5: Equilibrium and Oscillations

Lecture 5: Equilibrium and Oscillations Lecture 5: Equilibrium and Oscillatins Energy and Mtin Last time, we fund that fr a system with energy cnserved, v = ± E U m ( ) ( ) One result we see immediately is that there is n slutin fr velcity if

More information

o o IMPORTANT REMINDERS Reports will be graded largely on their ability to clearly communicate results and important conclusions.

o o IMPORTANT REMINDERS Reports will be graded largely on their ability to clearly communicate results and important conclusions. BASD High Schl Frmal Lab Reprt GENERAL INFORMATION 12 pt Times New Rman fnt Duble-spaced, if required by yur teacher 1 inch margins n all sides (tp, bttm, left, and right) Always write in third persn (avid

More information

Instructional Plan. Representational/Drawing Level

Instructional Plan. Representational/Drawing Level Instructinal Plan Representatinal/Drawing Level Name f Math Skill/Cncept: Divisin Prcess and Divisin with Remainders Prerequisite Skills Needed: 1.) Mastery f dividing cncrete bjects int equal grups. 2.)

More information

Assume that the water in the nozzle is accelerated at a rate such that the frictional effect can be neglected.

Assume that the water in the nozzle is accelerated at a rate such that the frictional effect can be neglected. 1 HW #3: Cnservatin f Linear Mmentum, Cnservatin f Energy, Cnservatin f Angular Mmentum and Turbmachines, Bernulli s Equatin, Dimensinal Analysis, and Pipe Flws Prblem 1. Cnservatins f Mass and Linear

More information

Dead-beat controller design

Dead-beat controller design J. Hetthéssy, A. Barta, R. Bars: Dead beat cntrller design Nvember, 4 Dead-beat cntrller design In sampled data cntrl systems the cntrller is realised by an intelligent device, typically by a PLC (Prgrammable

More information

Lecture 02 CSE 40547/60547 Computing at the Nanoscale

Lecture 02 CSE 40547/60547 Computing at the Nanoscale PN Junctin Ntes: Lecture 02 CSE 40547/60547 Cmputing at the Nanscale Letʼs start with a (very) shrt review f semi-cnducting materials: - N-type material: Obtained by adding impurity with 5 valence elements

More information

Part One: Heat Changes and Thermochemistry. This aspect of Thermodynamics was dealt with in Chapter 6. (Review)

Part One: Heat Changes and Thermochemistry. This aspect of Thermodynamics was dealt with in Chapter 6. (Review) CHAPTER 18: THERMODYNAMICS AND EQUILIBRIUM Part One: Heat Changes and Thermchemistry This aspect f Thermdynamics was dealt with in Chapter 6. (Review) A. Statement f First Law. (Sectin 18.1) 1. U ttal

More information