On Some Mathematical Results of Neural Networks
1 On Some Mathematical Results of Neural Networks. Dongbin Xiu, Department of Mathematics, Ohio State University.
2 Overview
- (Short) introduction of neural networks (NNs): successes; basic mechanism
- (Incomplete) review of mathematical studies: universal approximator; constructive proofs; deep networks
- Some new results: constructive proof; parameter reduction for training
3 One Example of the Successes of Deep Learning: the game of Go
Complexity: typical games (~150 moves) give roughly 10^360 possibilities; upper and lower bounds on the number of possible games (longest games): between 10^(10^48) and 10^(10^171).
- In March 2016, AlphaGo from Google's DeepMind beat world champion Lee Sedol, 4:1.
- May 2017: beat the world No. 1 player, Ke Jie, 3:0.
- Dec 2017: ver. 2 recorded 60:0 wins against all top professional players online.
- Retired in 2018 with ver. 3 (self-learning without human influence, 100:0 vs ver. 2).
The mechanism behind AlphaGo is fundamentally different from traditional AI: a deep neural network with tree search (Silver et al., Nature, 2016).
4 BUT: it is alchemy
- We don't know why algorithms work or why they don't (no theory).
- Algorithms are developed through trial and error.
- Some results are hard to replicate (many hyperparameters).
- Finding good architectures relies on guesswork.
- Very deep networks (more than ~40 layers) are difficult to train with backpropagation.
- Algorithms are not robust to adversarial examples.
"Machine learning has become alchemy." — Ali Rahimi, NIPS 2017 Test of Time Award; Science Mag, May 2018. (Slide courtesy of Houman Owhadi, Caltech)
5 Short Introduction of NNs
A massively parallel distributed model to learn y = G(x), x ∈ R^n, y ∈ R^m.
Quintessential model: feedforward deep network, the multilayer perceptron (MLP). Inspired by how the brain functions, but not a model for the brain.
Many variations in structure: fully connected feedforward network, recurrent network, convolutional network, etc.
6 The Role of the Perceptron
(Very) loosely mimics neuron activity: assigns weights to input signals, takes the weighted sum, and decides whether to activate.
Choices of activation function: binary/step function, sigmoidal function, rectified linear unit (ReLU), and many variations.
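The perceptron's action above is a one-liner in code; a minimal sketch (the weights and inputs are invented for the example, not any particular library's API):

```python
import math

def perceptron(x, w, b, activation):
    """Weighted sum of the inputs plus a bias, passed through an activation."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return activation(z)

step = lambda z: 1.0 if z >= 0 else 0.0          # binary/step function
sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))   # sigmoidal function
relu = lambda z: max(0.0, z)                     # rectified linear unit
```

Swapping the `activation` argument switches among the choices listed on the slide without touching the rest of the model.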
7 Single-Layer Feedforward NN
For f : R^d → R,
N_n(x) = \sum_{j=1}^{n} c_j \sigma(a_j \cdot x + b_j), a_j ∈ R^d, b_j ∈ R.
Similar in form to standard approximation methods: polynomial series, radial basis series.
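The single-layer formula transcribes directly into code; the helper name and the example weights below are assumptions for illustration:

```python
def single_layer_nn(x, c, a, b, sigma):
    """Evaluate N_n(x) = sum_j c_j * sigma(a_j . x + b_j) for x in R^d."""
    return sum(cj * sigma(sum(aji * xi for aji, xi in zip(aj, x)) + bj)
               for cj, aj, bj in zip(c, a, b))
```

For instance, with σ = ReLU, a_1 = 1, a_2 = −1, b_j = 0, and c_1 = c_2 = 1, the two-neuron network N_2(x) = ReLU(x) + ReLU(−x) reproduces |x| exactly.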
8 Multiple-Layer Feedforward NN (Deep NN)
Propagation: y^{(1)} = x,
y^{(m)} = \sigma([W^{(m-1)}]^T y^{(m-1)} + b^{(m)}), m = 2, ..., M-1,
y^{(M)} = [W^{(M-1)}]^T y^{(M-1)}.
Componentwise:
y^{(m)}_j = \sigma([w^{(m-1)}_j]^T y^{(m-1)} + b^{(m)}_j), j = 1, ..., J_m, m = 2, ..., M.
Denoted as (d, J_1, J_2, ..., J_{M-1}, J_M).
Exceedingly large number of parameters: weights and thresholds.
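The propagation rule can be sketched as a plain forward pass; following the slide, the output layer is affine (no activation). The nested-list weight layout is an assumption of this sketch:

```python
def forward(x, weights, biases, sigma):
    """Forward pass: y^(1) = x; hidden layers apply sigma(W^T y + b);
    the final layer is affine (no activation), as on the slide.
    weights[m] is a (d_in x d_out) nested list: z_j = sum_i W[i][j]*y[i] + b[j].
    (A bias is kept on every layer for generality; set it to zero to match.)"""
    y = list(x)
    for m, (W, b) in enumerate(zip(weights, biases)):
        z = [sum(W[i][j] * y[i] for i in range(len(y))) + b[j]
             for j in range(len(b))]
        y = z if m == len(weights) - 1 else [sigma(v) for v in z]
    return y
```

As a check, a (1, 2, 1) network with ReLU hidden units and weights W^{(1)} = [1, −1], W^{(2)} = [1, 1]^T computes |x|.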
9 Network Training
All the parameters are trained using data, typically by minimizing a loss function: nonlinear optimization via gradient descent with back propagation (BP), stochastic gradient descent, the ADAM algorithm.
Challenges: exceedingly large number of parameters (weights, thresholds); local minima; overfitting.
The exceptional difficulty of the numerical optimization makes it hard to identify issues, reproduce results, and test ideas.
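Minimizing a loss by gradient descent, the core of the training loop described above, in its simplest form: a one-parameter-pair least-squares model (the learning rate and step count are arbitrary choices for the illustration):

```python
def gradient_descent(xs, ys, lr=0.1, steps=500):
    """Fit y ~ w*x + b by minimizing the mean squared loss
    L(w, b) = mean((w*x_i + b - y_i)^2) with full-batch gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * dw
        b -= lr * db
    return w, b
```

For this convex two-parameter loss the iteration converges reliably; the challenges on the slide arise because a deep network's loss has millions of parameters and is highly non-convex.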
10 Current State of Mathematics in NN
Most studies are on feedforward NNs with one hidden layer.
- Universal approximator: Barron 1993, Cybenko 1989, Funahashi 1989, Hornik 1991, Leshno et al. 1993, Pinkus 1999, etc.
- Constructive proof: explicit construction of the FNN.
  Cardaliaguet–Euvrard operator (1D operator): Anastassiou 1997, Chen & Cao 2009, Costarelli & Spigler 2013, Llanas & Sainz 2006.
  Ridgelet transform: Candès 1998, Sonoda & Murata 2018. Restrictions on the activation function.
Multiple-hidden-layer NNs:
- Kolmogorov representation theory: Kurkova 1992; constructive: Sprecher 2002.
- Multiresolution wavelets: Yarotsky 2017.
11 Universal Approximation of SLNN
N_n(x) = \sum_{j=1}^{n} c_j \sigma(a_j \cdot x + b_j), a_j ∈ R^d, b_j ∈ R, for f : R^d → R.
Universal approximator: Barron 1993, Cybenko 1989, Funahashi 1989, Hornik 1991, Leshno et al. 1993, Pinkus 1999, etc.
Ex: Pinkus 1999, Theorem 3.1 ([1], Theorem 1): Let \sigma be a function in L^\infty_{loc}(R) whose set of discontinuities has Lebesgue measure zero. Then the set N(\sigma; R^d, R) is dense in C(R^d), in the topology of uniform convergence on compact sets, if and only if \sigma is not an algebraic polynomial almost everywhere. Here
N(\sigma; \Theta, \Lambda) := span{ \sigma(w \cdot x + b) : w ∈ \Theta, b ∈ \Lambda }.
Many extensions/variations since.
12 SLNN Constructive Proof
Cardaliaguet & Euvrard operator (1992), for f : R → R:
f_n(x) = \sum_{k=-n^2}^{n^2} \frac{f(k/n)}{I n^\alpha} b\left(n^{1-\alpha}\left(x - \frac{k}{n}\right)\right), 0 < \alpha < 1,
where b is a bell-shaped function and I = \int_{-\infty}^{+\infty} b(t)\,dt.
Extensive analysis by Anastassiou et al.; several variations/extensions.
Costarelli & Spigler (2013), for f : [a, b] → R:
f_n(x) = \frac{\sum_{k=\lceil na \rceil}^{\lfloor nb \rfloor} f(k/n)\,\phi(nx - k)}{\sum_{k=\lceil na \rceil}^{\lfloor nb \rfloor} \phi(nx - k)}, n ∈ N^+ such that \lceil na \rceil \le \lfloor nb \rfloor,
where \phi(x) = \frac{1}{2}(\sigma(x+1) - \sigma(x-1)) (bell shaped).
At most 1st-order accuracy.
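The Costarelli–Spigler-type operator is easy to evaluate numerically; a minimal sketch, assuming the logistic function as the sigmoidal σ (any sigmoidal choice would do):

```python
import math

def phi(x):
    """Bell-shaped kernel phi(x) = (sigma(x+1) - sigma(x-1)) / 2,
    built here from the logistic sigmoid (an assumption of this sketch)."""
    sigma = lambda t: 1.0 / (1.0 + math.exp(-t))
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

def nn_operator(f, a, b, n):
    """Normalized NN operator f_n(x) on [a, b] with samples f(k/n),
    k = ceil(na), ..., floor(nb)."""
    ks = range(math.ceil(n * a), math.floor(n * b) + 1)
    def fn(x):
        ws = [phi(n * x - k) for k in ks]
        return sum(f(k / n) * w for k, w in zip(ks, ws)) / sum(ws)
    return fn
```

Because the weights are normalized, the operator reproduces constants exactly; for smooth f the pointwise error in the interior shrinks as n grows, consistent with the analysis on the next slide.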
13 Error Analysis of the Costarelli & Spigler Construction
2nd-order accuracy in the interior of the domain.
Theorem 4.9. Assume f ∈ C^1(D) has a uniformly bounded second-order derivative. Given any closed set \Omega \subset (0, 1), for sufficiently large n we have
E_n(f, x) \le 2 \|f''\|_{L^\infty(D)}\, n^{-2}, \forall x ∈ \Omega. (4.2)
Furthermore,
\|E_n(f, \cdot)\|_{L^p(\Omega)} \le 2^{1+1/p} \|f''\|_{L^\infty(D)}\, n^{-2}, 1 \le p \le +\infty. (4.3)
How to treat the boundaries is an open problem.
14 Results for Smooth and Discontinuous Functions
[Figure: approximation error E_n(f, \cdot) versus n for the NN operator F_n of (5.1) with boundary conditions, applied to the sine function; as expected from Theorem 5.1, the L^p errors converge at second-order rate, similar to the cosine results. Figure 6.1: approximation of the discontinuous function (6.1) by the NN operator with n = 256, plotted at random points drawn uniformly.]
Despite the lack of high accuracy, the construction can be effective (Wu & Xiu, 2018).
15 Constructive Proofs for Multivariate Functions
- Methods based on the Cardaliaguet–Euvrard operator: all have been extended to multiple dimensions; all are based on a tensor-product rule. Valid mathematically.
- Ridgelet transform: Candès (1998) does not include the standard activation functions; Sonoda & Murata (2018) relaxed the restriction (to a degree).
- Kolmogorov representation theory: Sprecher 2002, 2004. Mathematical complexity.
- Multiresolution wavelets: Yarotsky (2017). Tensor structure.
16 A More Flexible Constructive Proof
[Network diagram: inputs I_1, ..., I_d; first-hidden-layer nodes N^{(1)}_{k,j}; second-hidden-layer nodes N^{(2)}_k; output O.]
Two hidden layers: 1) n(n-1) neurons, where n is the number of data samples; 2) n neurons.
17 Output of the first hidden layer:
z^{(1)}_{k,j}(x) = s(w_{k,j} \cdot x - b_{k,j}), 1 \le k \le n, 1 \le j \le n, j \ne k,
where w_{k,j} = x^{(k)} - x^{(j)} and b_{k,j} = \frac{1}{2}(x^{(k)} - x^{(j)}) \cdot (x^{(k)} + x^{(j)}).
Output of the second hidden layer:
z^{(2)}_k(x) = s\left(\sum_{j \ne k} z^{(1)}_{k,j}(x) - b^{(2)}\right), k = 1, ..., n.
Final output:
y(x) = \sum_{k=1}^{n} f^{(k)} z^{(2)}_k(x).
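The construction can be sketched directly. This assumes s is the unit step function and that the second-layer threshold b^{(2)} is chosen so a node fires only when all n−1 first-layer comparisons succeed (the slide does not spell b^{(2)} out); with those choices, w_{k,j} \cdot x - b_{k,j} = \frac{1}{2}(|x - x^{(j)}|^2 - |x - x^{(k)}|^2), so node k fires exactly when x lies in the Voronoi cell of x^{(k)}:

```python
def build_nn(samples, values):
    """Two-hidden-layer construction (sketch): samples are points x^(k) in R^d,
    values are f(x^(k)). Returns the network output function y(x)."""
    s = lambda z: 1.0 if z > 0 else 0.0   # step activation (assumption)
    n, d = len(samples), len(samples[0])
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

    def predict(x):
        out = 0.0
        for k in range(n):
            fired = 0
            for j in range(n):
                if j == k:
                    continue
                w = [samples[k][i] - samples[j][i] for i in range(d)]
                b = 0.5 * dot(w, [samples[k][i] + samples[j][i] for i in range(d)])
                # fires iff x is strictly closer to x^(k) than to x^(j)
                fired += s(dot(w, x) - b)
            # second layer: fires only if all n-1 comparisons succeeded,
            # i.e. x lies in the Voronoi cell of x^(k)
            out += values[k] * s(fired - (n - 1) + 0.5)
        return out
    return predict
```

The output is then f(x^{(k)}) on the cell of x^{(k)}: a piecewise-constant, nearest-sample predictor built without any training.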
18 Theorem: the NN is a piecewise-constant approximation of the given function, based on the Voronoi tessellation of the domain by the data. Unstructured construction (Wu & Xiu).
[Figure: the exact function versus its piecewise-constant numerical approximation.]
19 On the Practical Side
Success comes from DEEP networks with many hidden layers, not the single-layer network
N_n(x) = \sum_{j=1}^{n} c_j \sigma(a_j \cdot x + b_j), a_j ∈ R^d, b_j ∈ R.
BUT network construction is based on training/optimization: an exceptionally large number of parameters, and local minima.
sha_base64="yf46xigic7sjt8b7paevboleppg=">aaacqxicdvfdb9mwfhxcyhfbsreelgokdynvcle4ifkkxcij7rjdkuos+q4n53o8nsgbl+t/xd3jll+b2exqbxmns8bnnhl8d542sfppkzxtfuhnr9p2nu779x88fnr//otaqrmbaqsk5xaurgcmehvmggnc5woo85n3y/7hnzbwtvnxdqw3xeyvikjohk+j/ymswapsraffkbthm8ykv33w/reww2zk7hqx+q7uk83ttknx2hkryrjn9y7dwp/nbwerri/rt7fxlfkg2hrftk9bxlb2/l+t8ga9qfjmfkvvq7sdgxiv3tz/xcratfqqfaobutrqcow5qcgw+xoldrcnfa7tacuuwc7ckllpxwamogvtwqmqrtjccetqudb+vyz3utyt/zu2wl6zovkluillh4qwwxpstvoouifataudcylarfufccihhm3usgjlp3hvb+rbkejwc6+bgz5gmw3q/gezudpf sha_base64="2vwfpvlqepd9oouuyuiyvoeyvru=">aaacqxicdvfdsxwxfm2m/bdbdcfx/osuhquyzijbxwqzbknshqs32ygtubngk5kxyahlzh/q3+lrf4nzdwhxbs8ets499+ryklacaxnfv4jw4dnzfy8xx7vev3n7bqm9vhkiyox6lnslgqqug2cf9a33agyvaqotawcphf7/7pfsjny+kbmvqwknrc8jwzajyvth+sa56b4side+jwiatmlm3tjdvapuxlrn73ovdd/ta5/dclbnk3ev5wfl8x2cuf+28glcsnpivpd6cti9yumtubysaybj7x9nbntxu5e3whw+cmig9bbtrl7d8kkktotbmuk2hcvszkaxkccbatuitoalsgo5h6gfbjeirnsxr8efpzdgvlt+fwtn2fsjsqfvepl453vk/7k3jf/wgtcl3rpyxvw2gypcp5bxapsttb8izv8cmmhhamej+v8zoqklm+m9skqxyilpbbov8kphjcj6ck6uhhxj48+dva <latexit sha_base64="azxz4z3+o5bgcqa3jxje/aifdc8=">aaab6xicbza7t8mwfivvyqueamvmsaiqmkqebuykfsyiyfuvjxj3lrwbseyhuqv9q+wsrmhfhmrvwsn7qatr7l6rw/r sha_base64="atibifcstvv6r5dosvcrwl5hpza=">aaaccnicbzdnsgmxfixvl9aq45udresgqsy4zxjbhxwcx+qkewjjnpqzozickizejgl/av3orenfgobnsm2bbbqo sha_base64="wkztmnlmphtr+emam6dcrjdfau=">aaacfxicbvc7tsmwfhv4lvikmmjgusexvukxefmlfsac6enqsmu7tmvvdilbqaqilvwev8akoxtizwbls3dasedlkswdn sha_base64="brmv93m6ykic5zr9k2giwrrpbqa=">aaacfxicbvdlsgmxfm3uv62vuze6cbbbvzkpgukq4mzlffuazlistkynzwsgjcouort/wl9wq3t34taw7/etdugth4in Parameter Reduction f : R d! R Single-layer FNN nx N n (x) = c j (a j x + b j ), a j 2 R d, b j 2 R j= Activation function with Scaling property: ( y) = ( ) (y), Two special cases: Rectified linear unite (ReLU) Binary/step function ( y) = (y), ( y) = (y), Proposition: N n (x) has an equivalent form en(x) = enx j= ec j ( ew j x + e b j ), k ew j k =
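The scaling property is easy to verify numerically. The sketch below (not from the talk; the function names are illustrative) checks that ReLU satisfies $\sigma(\lambda y) = \lambda\,\sigma(y)$ and the step function satisfies $\sigma(\lambda y) = \sigma(y)$ for $\lambda > 0$:

```python
import numpy as np

# Minimal numerical check of the scaling property sigma(lambda*y) = s(lambda)*sigma(y)
# for the two activations on the slide (names here are illustrative).
def relu(y):
    return np.maximum(y, 0.0)      # s(lambda) = lambda

def step(y):
    return (y > 0).astype(float)   # s(lambda) = 1

y = np.linspace(-3.0, 3.0, 101)
lam = 2.5                          # any lambda > 0

relu_ok = np.allclose(relu(lam * y), lam * relu(y))
step_ok = np.allclose(step(lam * y), step(y))
```

Both checks pass for any positive scaling factor, which is exactly what lets the weight norm be pulled out of (or dropped from) each hidden unit.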
21 Proposition 1: Any NN expression
$N(x) = \sum_{j=1}^{N} c_j\,\sigma(w_j \cdot x + b_j)$
has an equivalent form
$\widetilde{N}(x) = \sum_{j=1}^{\widetilde{N}} \tilde{c}_j\,\sigma(\tilde{w}_j \cdot x + \tilde{b}_j), \quad \|\tilde{w}_j\| = 1$

Proof:
$N(x) = \sum_{j=1}^{N} c_j\,\sigma(w_j \cdot x + b_j)$
$\quad = \sum_{j=1}^{N} c_j\,\sigma\!\left(\|w_j\|\left(\frac{w_j}{\|w_j\|} \cdot x + \frac{b_j}{\|w_j\|}\right)\right)$
$\quad = \sum_{j=1}^{N} c_j\, s(\|w_j\|)\,\sigma\!\left(\frac{w_j}{\|w_j\|} \cdot x + \frac{b_j}{\|w_j\|}\right)$
$\quad = \sum_{j=1}^{\widetilde{N}} \tilde{c}_j\,\sigma(\tilde{w}_j \cdot x + \tilde{b}_j)$

Remark: Training of the weights can be constrained on the unit sphere.
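For ReLU, where $s(\lambda) = \lambda$, the equivalence in Proposition 1 can be checked directly: normalize each weight vector, rescale the threshold by the same norm, and absorb the norm into the outer coefficient. A small numerical sketch (illustrative; not the paper's code):

```python
import numpy as np

# Proposition 1 for ReLU: N(x) with arbitrary weights equals the normalized form
# with w~_j = w_j/||w_j||, b~_j = b_j/||w_j||, c~_j = c_j*||w_j||.
rng = np.random.default_rng(0)
d, n = 3, 5
W = rng.normal(size=(n, d))      # rows w_j, arbitrary norms
b = rng.normal(size=n)
c = rng.normal(size=n)

def relu(y):
    return np.maximum(y, 0.0)

norms = np.linalg.norm(W, axis=1)
X = rng.normal(size=(100, d))    # random evaluation points

N_orig = relu(X @ W.T + b) @ c                                       # original network
N_norm = relu(X @ (W / norms[:, None]).T + b / norms) @ (c * norms)  # normalized form

match = np.allclose(N_orig, N_norm)
```

The two evaluations agree to machine precision, so nothing is lost by restricting the trained weights to the unit sphere.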
22 Proposition 2: Any NN expression
$N(x) = \sum_{j=1}^{N} c_j\,\sigma(w_j \cdot x + b_j)$
has an equivalent form
$\widehat{N}(x) = \sum_{j=1}^{\widehat{N}} \hat{c}_j\,\sigma(\hat{w}_j \cdot x + \hat{b}_j), \quad \|\hat{w}_j\| = 1, \quad -X_B \le \hat{b}_j \le X_B,$
where $X_B = \sup_{x \in D} \|x\|$.

Remark:
Training of the weights can be constrained on the unit sphere.
Training of the thresholds can be constrained in a bounded interval.
The reduced training space is (significantly) smaller.
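The intuition behind the threshold bound, for ReLU with $\|w\| = 1$ on a bounded domain $D$, is that a unit whose threshold lies outside $[-X_B, X_B]$ contributes nothing new: since $|w \cdot x| \le X_B$ on $D$, a unit with $b \ge X_B$ is purely affine on $D$, and one with $b \le -X_B$ is identically zero there. A hedged numerical sketch of this observation (not the paper's proof):

```python
import numpy as np

# Out-of-range thresholds add nothing on a bounded domain (ReLU, ||w|| = 1).
rng = np.random.default_rng(1)
d = 3
X_B = 1.0
X = rng.uniform(-1, 1, size=(1000, d))
X = X / np.maximum(np.linalg.norm(X, axis=1, keepdims=True), 1.0)  # D = unit ball

w = rng.normal(size=d)
w = w / np.linalg.norm(w)        # unit-sphere weight

def relu(y):
    return np.maximum(y, 0.0)

z = X @ w                        # |z| <= X_B on D
# b >= X_B: w.x + b >= 0 everywhere on D, so the ReLU unit is just the affine map.
affine_ok = np.allclose(relu(z + X_B), z + X_B)
# b <= -X_B: w.x + b <= 0 everywhere on D, so the unit is identically zero.
zero_ok = np.allclose(relu(z - X_B), 0.0)
```

Affine pieces can be absorbed elsewhere in the expression and zero units discarded, which is why the equivalent form needs only thresholds in $[-X_B, X_B]$.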
23 Universal Approximation Property

Definition: $N(\sigma; \Theta, B) := \mathrm{span}\{\sigma(w \cdot x + b) : w \in \Theta,\ b \in B\}$

Standard (unconstrained) NN: $N(\sigma; \mathbb{R}^d, \mathbb{R})$
NN with weight constraints: $N(\sigma; S^d, \mathbb{R})$
NN with weight and threshold constraints: $N_D(\sigma; S^d, [-X_B, X_B])$

Theorem [Pinkus 1993]: $N(\sigma; \mathbb{R}^d, \mathbb{R})$ is dense in $C(\mathbb{R}^d)$.

Theorem [Qin, Zhou, Xiu, 2018]: $N(\sigma; S^d, \mathbb{R}) = N(\sigma; \mathbb{R}^d, \mathbb{R})$, and is dense in $C(\mathbb{R}^d)$.

Theorem 3.3. Let $\sigma$ be the binary (2.3) or the ReLU (2.2) activation function. Let $x \in D \subset \mathbb{R}^d$, where $D$ is closed and bounded with $X_B = \sup_{x \in D} \|x\|$. Define $B = [-X_B, X_B]$; then
$N_D(\sigma; S^d, B) = N_D(\sigma; \mathbb{R}^d, \mathbb{R}).$
Furthermore, $N_D(\sigma; S^d, B)$ is dense in $C(D)$ in the topology of uniform convergence.
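The constrained density result can be illustrated in the simplest case $d = 1$, where the unit sphere degenerates to $\{-1, +1\}$ and the thresholds live in $[-X_B, X_B]$. The sketch below (illustrative, under those assumptions) draws a random dictionary of such constrained ReLU units and fits only the outer coefficients by least squares to a smooth target on $D = [-1, 1]$:

```python
import numpy as np

# d = 1 sketch of the constrained class: weights in S^0 = {-1, +1},
# thresholds in [-X_B, X_B], ReLU units; only coefficients are fitted.
rng = np.random.default_rng(2)
X_B = 1.0
x = np.linspace(-X_B, X_B, 400)
target = np.sin(3 * x)                     # a continuous function on D

n = 200
w = rng.choice([-1.0, 1.0], size=n)        # constrained weights
b = rng.uniform(-X_B, X_B, size=n)         # constrained thresholds
Phi = np.maximum(np.outer(x, w) + b, 0.0)  # ReLU dictionary, shape (400, n)

c, *_ = np.linalg.lstsq(Phi, target, rcond=None)
err = np.max(np.abs(Phi @ c - target))
```

Even with the weights and thresholds frozen inside the constrained sets, the span is rich enough to drive the uniform error on $D$ down, consistent with Theorem 3.3.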
More informationArtificial Neural Networks
Artificial Neural Networks Threshold units Gradient descent Multilayer networks Backpropagation Hidden layer representations Example: Face Recognition Advanced topics 1 Connectionist Models Consider humans:
More informationMultilayer Neural Networks
Multilayer Neural Networks Multilayer Neural Networks Discriminant function flexibility NON-Linear But with sets of linear parameters at each layer Provably general function approximators for sufficient
More informationFeed-forward Networks Network Training Error Backpropagation Applications. Neural Networks. Oliver Schulte - CMPT 726. Bishop PRML Ch.
Neural Networks Oliver Schulte - CMPT 726 Bishop PRML Ch. 5 Neural Networks Neural networks arise from attempts to model human/animal brains Many models, many claims of biological plausibility We will
More informationLab 5: 16 th April Exercises on Neural Networks
Lab 5: 16 th April 01 Exercises on Neural Networks 1. What are the values of weights w 0, w 1, and w for the perceptron whose decision surface is illustrated in the figure? Assume the surface crosses the
More informationNeural Networks (Part 1) Goals for the lecture
Neural Networks (Part ) Mark Craven and David Page Computer Sciences 760 Spring 208 www.biostat.wisc.edu/~craven/cs760/ Some of the slides in these lectures have been adapted/borrowed from materials developed
More informationNeural Networks and the Back-propagation Algorithm
Neural Networks and the Back-propagation Algorithm Francisco S. Melo In these notes, we provide a brief overview of the main concepts concerning neural networks and the back-propagation algorithm. We closely
More informationAdvanced statistical methods for data analysis Lecture 2
Advanced statistical methods for data analysis Lecture 2 RHUL Physics www.pp.rhul.ac.uk/~cowan Universität Mainz Klausurtagung des GK Eichtheorien exp. Tests... Bullay/Mosel 15 17 September, 2008 1 Outline
More informationNeural Networks and Deep Learning
Neural Networks and Deep Learning Professor Ameet Talwalkar November 12, 2015 Professor Ameet Talwalkar Neural Networks and Deep Learning November 12, 2015 1 / 16 Outline 1 Review of last lecture AdaBoost
More informationNeural Networks: Introduction
Neural Networks: Introduction Machine Learning Fall 2017 Based on slides and material from Geoffrey Hinton, Richard Socher, Dan Roth, Yoav Goldberg, Shai Shalev-Shwartz and Shai Ben-David, and others 1
More informationArtificial Neural Networks
Artificial Neural Networks 鮑興國 Ph.D. National Taiwan University of Science and Technology Outline Perceptrons Gradient descent Multi-layer networks Backpropagation Hidden layer representations Examples
More informationNeural Networks, Computation Graphs. CMSC 470 Marine Carpuat
Neural Networks, Computation Graphs CMSC 470 Marine Carpuat Binary Classification with a Multi-layer Perceptron φ A = 1 φ site = 1 φ located = 1 φ Maizuru = 1 φ, = 2 φ in = 1 φ Kyoto = 1 φ priest = 0 φ
More informationLecture 3 Feedforward Networks and Backpropagation
Lecture 3 Feedforward Networks and Backpropagation CMSC 35246: Deep Learning Shubhendu Trivedi & Risi Kondor University of Chicago April 3, 2017 Things we will look at today Recap of Logistic Regression
More informationDeep Feedforward Networks. Lecture slides for Chapter 6 of Deep Learning Ian Goodfellow Last updated
Deep Feedforward Networks Lecture slides for Chapter 6 of Deep Learning www.deeplearningbook.org Ian Goodfellow Last updated 2016-10-04 Roadmap Example: Learning XOR Gradient-Based Learning Hidden Units
More informationFoundations of Artificial Intelligence
Foundations of Artificial Intelligence 14. Deep Learning An Overview Joschka Boedecker and Wolfram Burgard and Bernhard Nebel Guest lecturer: Frank Hutter Albert-Ludwigs-Universität Freiburg July 14, 2017
More informationIntroduction to Neural Networks
CUONG TUAN NGUYEN SEIJI HOTTA MASAKI NAKAGAWA Tokyo University of Agriculture and Technology Copyright by Nguyen, Hotta and Nakagawa 1 Pattern classification Which category of an input? Example: Character
More informationNeural networks COMS 4771
Neural networks COMS 4771 1. Logistic regression Logistic regression Suppose X = R d and Y = {0, 1}. A logistic regression model is a statistical model where the conditional probability function has a
More informationCS:4420 Artificial Intelligence
CS:4420 Artificial Intelligence Spring 2018 Neural Networks Cesare Tinelli The University of Iowa Copyright 2004 18, Cesare Tinelli and Stuart Russell a a These notes were originally developed by Stuart
More informationReading Group on Deep Learning Session 1
Reading Group on Deep Learning Session 1 Stephane Lathuiliere & Pablo Mesejo 2 June 2016 1/31 Contents Introduction to Artificial Neural Networks to understand, and to be able to efficiently use, the popular
More informationNONLINEAR CLASSIFICATION AND REGRESSION. J. Elder CSE 4404/5327 Introduction to Machine Learning and Pattern Recognition
NONLINEAR CLASSIFICATION AND REGRESSION Nonlinear Classification and Regression: Outline 2 Multi-Layer Perceptrons The Back-Propagation Learning Algorithm Generalized Linear Models Radial Basis Function
More informationGlobal Optimality in Matrix and Tensor Factorization, Deep Learning & Beyond
Global Optimality in Matrix and Tensor Factorization, Deep Learning & Beyond Ben Haeffele and René Vidal Center for Imaging Science Mathematical Institute for Data Science Johns Hopkins University This
More informationLecture 3 Feedforward Networks and Backpropagation
Lecture 3 Feedforward Networks and Backpropagation CMSC 35246: Deep Learning Shubhendu Trivedi & Risi Kondor University of Chicago April 3, 2017 Things we will look at today Recap of Logistic Regression
More informationBackpropagation Introduction to Machine Learning. Matt Gormley Lecture 12 Feb 23, 2018
10-601 Introduction to Machine Learning Machine Learning Department School of Computer Science Carnegie Mellon University Backpropagation Matt Gormley Lecture 12 Feb 23, 2018 1 Neural Networks Outline
More informationDeep Learning. Hung-yi Lee 李宏毅
Deep Learning Hung-yi Lee 李宏毅 Deep learning attracts lots of attention. I believe you have seen lots of exciting results before. Deep learning trends at Google. Source: SIGMOD 206/Jeff Dean 958: Perceptron
More informationUnderstanding Neural Networks : Part I
TensorFlow Workshop 2018 Understanding Neural Networks Part I : Artificial Neurons and Network Optimization Nick Winovich Department of Mathematics Purdue University July 2018 Outline 1 Neural Networks
More informationMachine Learning for Computer Vision 8. Neural Networks and Deep Learning. Vladimir Golkov Technical University of Munich Computer Vision Group
Machine Learning for Computer Vision 8. Neural Networks and Deep Learning Vladimir Golkov Technical University of Munich Computer Vision Group INTRODUCTION Nonlinear Coordinate Transformation http://cs.stanford.edu/people/karpathy/convnetjs/
More informationMultilayer Neural Networks
Multilayer Neural Networks Introduction Goal: Classify objects by learning nonlinearity There are many problems for which linear discriminants are insufficient for minimum error In previous methods, the
More informationARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD
ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD WHAT IS A NEURAL NETWORK? The simplest definition of a neural network, more properly referred to as an 'artificial' neural network (ANN), is provided
More informationCSE 190 Fall 2015 Midterm DO NOT TURN THIS PAGE UNTIL YOU ARE TOLD TO START!!!!
CSE 190 Fall 2015 Midterm DO NOT TURN THIS PAGE UNTIL YOU ARE TOLD TO START!!!! November 18, 2015 THE EXAM IS CLOSED BOOK. Once the exam has started, SORRY, NO TALKING!!! No, you can t even say see ya
More informationArtificial Neural Networks
Introduction ANN in Action Final Observations Application: Poverty Detection Artificial Neural Networks Alvaro J. Riascos Villegas University of los Andes and Quantil July 6 2018 Artificial Neural Networks
More informationBits of Machine Learning Part 1: Supervised Learning
Bits of Machine Learning Part 1: Supervised Learning Alexandre Proutiere and Vahan Petrosyan KTH (The Royal Institute of Technology) Outline of the Course 1. Supervised Learning Regression and Classification
More information2018 EE448, Big Data Mining, Lecture 5. (Part II) Weinan Zhang Shanghai Jiao Tong University
2018 EE448, Big Data Mining, Lecture 5 Supervised Learning (Part II) Weinan Zhang Shanghai Jiao Tong University http://wnzhang.net http://wnzhang.net/teaching/ee448/index.html Content of Supervised Learning
More informationChapter 4 Neural Networks in System Identification
Chapter 4 Neural Networks in System Identification Gábor HORVÁTH Department of Measurement and Information Systems Budapest University of Technology and Economics Magyar tudósok körútja 2, 52 Budapest,
More informationECE 471/571 - Lecture 17. Types of NN. History. Back Propagation. Recurrent (feedback during operation) Feedforward
ECE 47/57 - Lecture 7 Back Propagation Types of NN Recurrent (feedback during operation) n Hopfield n Kohonen n Associative memory Feedforward n No feedback during operation or testing (only during determination
More informationMachine Learning Lecture 5
Machine Learning Lecture 5 Linear Discriminant Functions 26.10.2017 Bastian Leibe RWTH Aachen http://www.vision.rwth-aachen.de leibe@vision.rwth-aachen.de Course Outline Fundamentals Bayes Decision Theory
More informationDeep Feedforward Networks. Seung-Hoon Na Chonbuk National University
Deep Feedforward Networks Seung-Hoon Na Chonbuk National University Neural Network: Types Feedforward neural networks (FNN) = Deep feedforward networks = multilayer perceptrons (MLP) No feedback connections
More informationAN INTRODUCTION TO NEURAL NETWORKS. Scott Kuindersma November 12, 2009
AN INTRODUCTION TO NEURAL NETWORKS Scott Kuindersma November 12, 2009 SUPERVISED LEARNING We are given some training data: We must learn a function If y is discrete, we call it classification If it is
More informationFeed-forward Network Functions
Feed-forward Network Functions Sargur Srihari Topics 1. Extension of linear models 2. Feed-forward Network Functions 3. Weight-space symmetries 2 Recap of Linear Models Linear Models for Regression, Classification
More informationJakub Hajic Artificial Intelligence Seminar I
Jakub Hajic Artificial Intelligence Seminar I. 11. 11. 2014 Outline Key concepts Deep Belief Networks Convolutional Neural Networks A couple of questions Convolution Perceptron Feedforward Neural Network
More informationMultilayer Perceptrons and Backpropagation
Multilayer Perceptrons and Backpropagation Informatics 1 CG: Lecture 7 Chris Lucas School of Informatics University of Edinburgh January 31, 2017 (Slides adapted from Mirella Lapata s.) 1 / 33 Reading:
More informationArtificial Neural Networks D B M G. Data Base and Data Mining Group of Politecnico di Torino. Elena Baralis. Politecnico di Torino
Artificial Neural Networks Data Base and Data Mining Group of Politecnico di Torino Elena Baralis Politecnico di Torino Artificial Neural Networks Inspired to the structure of the human brain Neurons as
More informationIntroduction to Convolutional Neural Networks (CNNs)
Introduction to Convolutional Neural Networks (CNNs) nojunk@snu.ac.kr http://mipal.snu.ac.kr Department of Transdisciplinary Studies Seoul National University, Korea Jan. 2016 Many slides are from Fei-Fei
More informationArtificial Neural Networks
0 Artificial Neural Networks Based on Machine Learning, T Mitchell, McGRAW Hill, 1997, ch 4 Acknowledgement: The present slides are an adaptation of slides drawn by T Mitchell PLAN 1 Introduction Connectionist
More informationDeep Feedforward Networks
Deep Feedforward Networks Liu Yang March 30, 2017 Liu Yang Short title March 30, 2017 1 / 24 Overview 1 Background A general introduction Example 2 Gradient based learning Cost functions Output Units 3
More informationMachine Learning (CSE 446): Neural Networks
Machine Learning (CSE 446): Neural Networks Noah Smith c 2017 University of Washington nasmith@cs.washington.edu November 6, 2017 1 / 22 Admin No Wednesday office hours for Noah; no lecture Friday. 2 /
More informationOptimization Methods for Machine Learning Decomposition methods for FFN
Optimization Methods for Machine Learning Laura Palagi http://www.dis.uniroma1.it/ palagi Dipartimento di Ingegneria informatica automatica e gestionale A. Ruberti Sapienza Università di Roma Via Ariosto
More informationOn the complexity of shallow and deep neural network classifiers
On the complexity of shallow and deep neural network classifiers Monica Bianchini and Franco Scarselli Department of Information Engineering and Mathematics University of Siena Via Roma 56, I-53100, Siena,
More informationDeep Neural Networks and Partial Differential Equations: Approximation Theory and Structural Properties. Philipp Christian Petersen
Deep Neural Networks and Partial Differential Equations: Approximation Theory and Structural Properties Philipp Christian Petersen Joint work Joint work with: Helmut Bölcskei (ETH Zürich) Philipp Grohs
More informationCSC 411 Lecture 10: Neural Networks
CSC 411 Lecture 10: Neural Networks Roger Grosse, Amir-massoud Farahmand, and Juan Carrasquilla University of Toronto UofT CSC 411: 10-Neural Networks 1 / 35 Inspiration: The Brain Our brain has 10 11
More informationCSE446: Neural Networks Spring Many slides are adapted from Carlos Guestrin and Luke Zettlemoyer
CSE446: Neural Networks Spring 2017 Many slides are adapted from Carlos Guestrin and Luke Zettlemoyer Human Neurons Switching time ~ 0.001 second Number of neurons 10 10 Connections per neuron 10 4-5 Scene
More informationMultilayer Perceptrons (MLPs)
CSE 5526: Introduction to Neural Networks Multilayer Perceptrons (MLPs) 1 Motivation Multilayer networks are more powerful than singlelayer nets Example: XOR problem x 2 1 AND x o x 1 x 2 +1-1 o x x 1-1
More informationNonparametric regression using deep neural networks with ReLU activation function
Nonparametric regression using deep neural networks with ReLU activation function Johannes Schmidt-Hieber February 2018 Caltech 1 / 20 Many impressive results in applications... Lack of theoretical understanding...
More informationNeural networks and support vector machines
Neural netorks and support vector machines Perceptron Input x 1 Weights 1 x 2 x 3... x D 2 3 D Output: sgn( x + b) Can incorporate bias as component of the eight vector by alays including a feature ith
More informationECE521 Lectures 9 Fully Connected Neural Networks
ECE521 Lectures 9 Fully Connected Neural Networks Outline Multi-class classification Learning multi-layer neural networks 2 Measuring distance in probability space We learnt that the squared L2 distance
More informationMultilayer Neural Networks. (sometimes called Multilayer Perceptrons or MLPs)
Multilayer Neural Networks (sometimes called Multilayer Perceptrons or MLPs) Linear separability Hyperplane In 2D: w x + w 2 x 2 + w 0 = 0 Feature x 2 = w w 2 x w 0 w 2 Feature 2 A perceptron can separate
More informationClassification goals: Make 1 guess about the label (Top-1 error) Make 5 guesses about the label (Top-5 error) No Bounding Box
ImageNet Classification with Deep Convolutional Neural Networks Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton Motivation Classification goals: Make 1 guess about the label (Top-1 error) Make 5 guesses
More informationIntroduction to Deep Neural Networks
Introduction to Deep Neural Networks Presenter: Chunyuan Li Pattern Classification and Recognition (ECE 681.01) Duke University April, 2016 Outline 1 Background and Preliminaries Why DNNs? Model: Logistic
More informationLast update: October 26, Neural networks. CMSC 421: Section Dana Nau
Last update: October 26, 207 Neural networks CMSC 42: Section 8.7 Dana Nau Outline Applications of neural networks Brains Neural network units Perceptrons Multilayer perceptrons 2 Example Applications
More information9 Classification. 9.1 Linear Classifiers
9 Classification This topic returns to prediction. Unlike linear regression where we were predicting a numeric value, in this case we are predicting a class: winner or loser, yes or no, rich or poor, positive
More informationMultilayer Neural Networks. (sometimes called Multilayer Perceptrons or MLPs)
Multilayer Neural Networks (sometimes called Multilayer Perceptrons or MLPs) Linear separability Hyperplane In 2D: w 1 x 1 + w 2 x 2 + w 0 = 0 Feature 1 x 2 = w 1 w 2 x 1 w 0 w 2 Feature 2 A perceptron
More information