TENSOR LAYERS FOR COMPRESSION OF DEEP LEARNING NETWORKS. Cris Cecka, Senior Research Scientist, NVIDIA. GTC 2018
2 AGENDA: Tensor Computations and the GPU; Tensor Networks and Decompositions; Tensor Layers in Deep Learning
3 TENSOR COMPUTATIONS AND THE GPU Modern data is inherently multi-dimensional.
4 TENSOR CONTRACTIONS The core primitive of multilinear algebra: BLAS level 3, with unbounded compute intensity.
5 TENSOR LIBRARIES Explicit permutation (transpose) costs dominate. Y. Shi, U. N. Niranjan, A. Anandkumar and C. Cecka, "Tensor Contractions with Extended BLAS Kernels on CPU and GPU," 2016 IEEE 23rd International Conference on High Performance Computing (HiPC), Hyderabad, 2016.
6 CONTRACTIONS: Single GEMM (provided a compact layout)
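The "single GEMM" claim can be sketched in numpy (standing in for the deck's cuBLAS calls): when the free modes of one operand are contiguous in memory, a contraction such as C_mnp = sum_k A_mk B_knp is exactly one matrix-matrix multiply over a reshaped view, with no data movement.

```python
import numpy as np

# Sketch: a 3-index contraction computed as one GEMM by flattening
# the contiguous free modes (n, p) of B into a single matrix dimension.
rng = np.random.default_rng(0)
m, k, n, p = 3, 4, 5, 6
A = rng.random((m, k))
B = rng.random((k, n, p))

C = (A @ B.reshape(k, n * p)).reshape(m, n, p)  # single GEMM, zero copies

ref = np.einsum('mk,knp->mnp', A, B)            # reference contraction
err = np.max(np.abs(C - ref))
```

The reshape is free (a view) precisely because n and p are adjacent in the layout; with a non-compact layout this trick fails, which motivates the batched variant on the next slide.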
7 BATCHED MATRIX-MATRIX MULTIPLY cublas<t>gemmStridedBatched
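A strided-batched GEMM performs many independent GEMMs whose operands sit at a fixed stride from one another. The semantics can be emulated in numpy (the real call here would be cuBLAS's cublas<t>gemmStridedBatched; this is only a behavioral sketch):

```python
import numpy as np

# Sketch of strided-batched GEMM semantics: `batch` independent GEMMs,
# with consecutive matrices separated by a fixed stride in memory.
rng = np.random.default_rng(0)
batch, m, k, n = 7, 3, 4, 5
A = rng.random((batch, m, k))   # stride m*k between consecutive A matrices
B = rng.random((batch, k, n))   # stride k*n between consecutive B matrices

C = A @ B                       # numpy broadcasts over the batch dimension

ref = np.stack([A[i] @ B[i] for i in range(batch)])  # explicit loop
err = np.max(np.abs(C - ref))
```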
8 CONTRACTIONS: Single SB-GEMM (any layout)
9 APPLICATION: FFT. Components: Tensor/FFT (vendor-optimized cublas<t>gemmStridedBatched), custom kernels, FMM communication. StridedBatchedGEMM accounts for 75%+ of the runtime: 1.5x over cuFFT on 2xV100, with further speedup over cuFFT on 8xV100. Cris Cecka, "Low Communication FMM-Accelerated FFT on GPUs," in Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis (SC '17), ACM, New York, NY, USA.
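Why an FFT maps onto batched GEMMs: the four-step Cooley-Tukey factorization splits a length-N DFT (N = N1*N2) into a batch of length-N2 DFTs, a twiddle-factor scaling, and a batch of length-N1 DFTs. A numpy sketch (not the paper's implementation) of that factorization:

```python
import numpy as np

# Four-step Cooley-Tukey FFT as two batched small GEMMs plus twiddles.
rng = np.random.default_rng(0)
N1, N2 = 4, 8
N = N1 * N2
x = rng.random(N) + 1j * rng.random(N)

A = x.reshape(N2, N1).T  # A[n1, n2] = x[n1 + N1*n2]
F1 = np.exp(-2j * np.pi * np.outer(np.arange(N1), np.arange(N1)) / N1)
F2 = np.exp(-2j * np.pi * np.outer(np.arange(N2), np.arange(N2)) / N2)

inner = A @ F2                 # N1 DFTs of length N2 (one batched GEMM)
inner *= np.exp(-2j * np.pi * np.outer(np.arange(N1), np.arange(N2)) / N)
out = F1 @ inner               # N2 DFTs of length N1 (second batched GEMM)

X = out.reshape(N)             # output index k = N2*k1 + k2
err = np.max(np.abs(X - np.fft.fft(x)))
```

Each of the two multiplies is exactly the shape of work a StridedBatchedGEMM call handles, which is why that primitive dominates the runtime.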
10 WHY TENSORS?
11 DENSITY AND SPARSITY H. Anzt, S. Tomov, J. Dongarra, "Energy Efficiency and Performance Frontiers for Sparse Computations on GPU Supercomputers," PMAM.
12 DENSITY AND SPARSITY In general, a sparse representation needs fewer than ~5% nonzero entries for a computational win. Solutions: block-sparse formats; structures that are locally dense and globally sparse.
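The ~5% threshold follows from a roofline-style argument: dense GEMM runs near peak compute throughput while SpMV-style kernels are memory-bound at a far lower effective rate, so sparsity must cut the flop count by at least that throughput ratio. A back-of-envelope sketch, with throughput numbers that are illustrative assumptions rather than measurements:

```python
# Break-even density for sparse vs dense products.
# Both throughput figures below are assumed for illustration.
dense_gflops = 7000.0    # assumed dense GEMM throughput (compute-bound)
sparse_gflops = 150.0    # assumed sparse-kernel throughput (memory-bound)

# Dense: 2*N^2 flops at dense speed. Sparse: 2*density*N^2 flops at
# sparse speed. Sparse wins when density < sparse_gflops / dense_gflops.
break_even_density = sparse_gflops / dense_gflops
print(f"sparse wins below ~{100 * break_even_density:.1f}% nonzeros")
```

With these assumed rates the break-even point lands near 2% nonzeros, consistent with the slide's "< 5%" rule of thumb.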
13 TENSOR DECOMPOSITIONS Decompositions for data-sparse representations.
14 TENSOR NETWORKS Notation and Visualization. Each tensor is drawn as a node with one edge per index: A = scalar (no edges); a_i = vector (one edge); A_ij = matrix (two edges); A_ijkl = tensor (four edges).
15 TENSOR NETWORKS Notation and Visualization. Matricize: A_ijkl -> A_(ij)(kl) = A_pq. Vectorize: A_ijkl -> A_(ijkl) = A_m. Tensorize: A_ij = A_(pq)(mn) -> a higher-order tensor.
16 TENSOR NETWORKS Notation and Visualization: inner product, outer product, SVD.
17 CP DECOMPOSITION Canonical Polyadic Decomposition: A_{i1 i2 ... in} = sum_r C^(1)_{i1 r} C^(2)_{i2 r} ... C^(n)_{in r}
18 CP DECOMPOSITION A_{i1 i2 ... in} = sum_r C^(1)_{i1 r} C^(2)_{i2 r} ... C^(n)_{in r}
19 CP DECOMPOSITION Properties. Analogous to the SVD for tensors. The rank is the size of the diagonal core tensor in its CP form. Finding the minimal rank is NP-hard. "The truncated SVD gives the best rank-k approximation" does NOT hold for tensors. The factors are unique (under mild conditions), whereas matrix decompositions are not.
20 CP DECOMPOSITION Properties. A vector's CP decomposition is itself. A matrix's CP decomposition is the SVD. Use CP-ALS to compute a CP decomposition of a 3D+ tensor: choose a rank, fix all but one factor matrix, solve a linear least-squares problem to fit that factor, and continue cycling through the factors.
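The alternating least-squares loop described above can be sketched in a few lines of numpy. This is a bare-bones illustration (fixed iteration count, no convergence test or normalization), fit to a tensor built with exact CP rank so the solve is well posed; each update is a linear least-squares problem because the other factors are held fixed:

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, r = 6, 7, 8, 3
# Target tensor of exact CP rank r, so ALS can recover it.
U, V, W = rng.random((I, r)), rng.random((J, r)), rng.random((K, r))
T = np.einsum('ir,jr,kr->ijk', U, V, W)

def khatri_rao(X, Y):
    # Column-wise Kronecker product: rows indexed by (row_X, row_Y).
    return np.einsum('ir,jr->ijr', X, Y).reshape(-1, X.shape[1])

A, B, C = (rng.random((d, r)) for d in (I, J, K))
for _ in range(200):
    # Fix two factors, solve least squares for the third, and cycle.
    A = np.linalg.lstsq(khatri_rao(B, C), T.reshape(I, -1).T, rcond=None)[0].T
    B = np.linalg.lstsq(khatri_rao(A, C),
                        T.transpose(1, 0, 2).reshape(J, -1).T, rcond=None)[0].T
    C = np.linalg.lstsq(khatri_rao(A, B),
                        T.transpose(2, 0, 1).reshape(K, -1).T, rcond=None)[0].T

rel_err = (np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - T)
           / np.linalg.norm(T))
```

Each mode update unfolds T so its rows correspond to the free index and its columns to the Khatri-Rao product of the fixed factors; real implementations add normalization and a stopping criterion.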
21 TUCKER DECOMPOSITION A_{i1 i2 ... in} = sum_{r1 r2 ... rn} G_{r1 r2 ... rn} C^(1)_{i1 r1} C^(2)_{i2 r2} ... C^(n)_{in rn}
22 TUCKER DECOMPOSITION $A_{i_1 i_2 \cdots i_n} = G_{r_1 r_2 \cdots r_n} C^{(1)}_{i_1 r_1} C^{(2)}_{i_2 r_2} \cdots C^{(n)}_{i_n r_n}$ CP is a special case of Tucker when the core is superdiagonal and $R_1 = R_2 = \cdots = R_n$ Called the Higher-Order SVD (HOSVD/MLSVD) when the core and factor tensors are orthogonal 22
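The "direct HOSVD" route mentioned on the next slide can be sketched minimally in NumPy: each factor matrix comes from the SVD of a mode unfolding, and the core is the tensor contracted with the factor transposes. Helper names here are my own:

```python
import numpy as np

def unfold(x, mode):
    # Mode-n unfolding: move axis `mode` to the front and flatten the rest.
    return np.moveaxis(x, mode, 0).reshape(x.shape[mode], -1)

def hosvd(x, ranks):
    """Truncated HOSVD: factor matrices from mode-wise SVDs, then the core."""
    factors = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(x, mode), full_matrices=False)
        factors.append(u[:, :r])  # leading left singular vectors of the unfolding
    # Core: contract each mode of x with the transpose of its factor.
    g = x
    for mode, u in enumerate(factors):
        g = np.moveaxis(np.tensordot(u.T, g, axes=(1, mode)), 0, mode)
    return g, factors

# Round trip: with full ranks the decomposition is exact.
x = np.random.default_rng(0).standard_normal((4, 5, 6))
g, (c1, c2, c3) = hosvd(x, ranks=(4, 5, 6))
xr = np.einsum('pqr,ip,jq,kr->ijk', g, c1, c2, c3)
print(np.allclose(x, xr))  # True
```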
23 TUCKER DECOMPOSITION Algorithms Computing a Tucker Decomposition from a tensor 23
24 TENSOR RING DECOMPOSITION $A_{i_1 i_2 \cdots i_n} = C^{(1)}_{r_1 i_1 r_2} C^{(2)}_{r_2 i_2 r_3} \cdots C^{(n)}_{r_n i_n r_1}$ (summation over the ring indices $r_1, \ldots, r_n$ implied) 24
25 TENSOR RING DECOMPOSITION Often found as the Tensor Train: $A_{i_1 i_2 \cdots i_n} = C^{(1)}_{i_1 r_2} C^{(2)}_{r_2 i_2 r_3} \cdots C^{(n)}_{r_n i_n}$ (the boundary ranks are fixed to 1, breaking the ring into a train) 25
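The Tensor Train form can be computed by sequential SVDs of reshaped unfoldings (the TT-SVD algorithm). A compact NumPy sketch, with my own helper names and conventions (cores shaped `(r_left, mode, r_right)`):

```python
import numpy as np

def tt_svd(x, ranks):
    """Tensor Train via sequential truncated SVDs; `ranks` are the internal ranks."""
    n = x.ndim
    cores, r_prev = [], 1
    m = x.reshape(r_prev * x.shape[0], -1)
    for k in range(n - 1):
        u, s, vt = np.linalg.svd(m, full_matrices=False)
        r = min(ranks[k], len(s))
        cores.append(u[:, :r].reshape(r_prev, x.shape[k], r))
        # Carry the remainder S @ V^T forward, merging in the next mode.
        m = (np.diag(s[:r]) @ vt[:r]).reshape(r * x.shape[k + 1], -1)
        r_prev = r
    cores.append(m.reshape(r_prev, x.shape[-1], 1))
    return cores

def tt_to_full(cores):
    # Contract the train back into a dense tensor.
    out = cores[0]
    for c in cores[1:]:
        out = np.tensordot(out, c, axes=(-1, 0))
    return out.squeeze(axis=(0, -1))

x = np.random.default_rng(0).standard_normal((3, 4, 5))
cores = tt_svd(x, ranks=(3, 20))  # ranks large enough -> exact
print(np.allclose(x, tt_to_full(cores)))  # True
```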
26 TENSOR RING DECOMPOSITION Algorithms Similar algorithms for computation from a tensor: Direct HOSVD; Iterative ALS; Adaptive-rank ALS; Block-wise adaptive-rank ALS. Q. Zhao, G. Zhou, S. Xie, L. Zhang, A. Cichocki. Tensor Ring Decomposition. arXiv [cs.NA], Jun
27 TENSOR RING DECOMPOSITION Properties Interpretation as a hierarchical method: Relation to kernel methods; Hierarchical (H-matrix) decomposition; Translational invariance. E. Corona, A. Rahimian, D. Zorin. A Tensor Train Accelerated Solver for Integral Equations in Complex Geometries. Journal of Computational Physics, Volume 334,
28 KRONECKER DECOMPOSITION $A_{i_1 i_2 \cdots i_n} = C^{(1)}_{a_1 a_2 \cdots a_n} \otimes C^{(2)}_{b_1 b_2 \cdots b_n} \otimes \cdots \otimes C^{(m)}_{d_1 d_2 \cdots d_n}$ 28
29 KRONECKER DECOMPOSITION Algorithms and Properties Similar algorithms for computation from a tensor: Direct SVD Iterative ALS (KPCA) Iterative Lanczos Relation to Perfect shuffles + Z-order curves Butterfly algorithms and factorizations 29
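The "Direct SVD" route is the Van Loan and Pitsianis idea: rearrange A so that each r×s block becomes a row, and the best single Kronecker term then falls out of a rank-1 SVD. A sketch in NumPy, with illustrative names and sizes:

```python
import numpy as np

def nearest_kronecker(a, shape_b, shape_c):
    """Best single-term Kronecker approximation A ~= B (x) C via the SVD of a
    block rearrangement of A (Van Loan and Pitsianis)."""
    p, q = shape_b
    r, s = shape_c
    # Rearrange A (pr x qs) so each row holds one r x s block of A.
    ra = a.reshape(p, r, q, s).transpose(0, 2, 1, 3).reshape(p * q, r * s)
    u, sv, vt = np.linalg.svd(ra, full_matrices=False)
    b = np.sqrt(sv[0]) * u[:, 0].reshape(p, q)
    c = np.sqrt(sv[0]) * vt[0].reshape(r, s)
    return b, c

# If A is exactly a Kronecker product, it is recovered exactly
# (up to a harmless simultaneous sign flip of B and C).
rng = np.random.default_rng(0)
b0, c0 = rng.standard_normal((2, 3)), rng.standard_normal((4, 5))
a = np.kron(b0, c0)
b, c = nearest_kronecker(a, (2, 3), (4, 5))
print(np.allclose(np.kron(b, c), a))  # True
```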
30 KRONECKER DECOMPOSITION Properties 30
31 EXOTIC DECOMPOSITIONS Hierarchical Tucker Decomposition Any tensor network without cycles where all nodes have degree 3 or less. 31
32 EXOTIC DECOMPOSITIONS Other Compositions Construct arbitrarily rich structure that reflects any a priori knowledge of the structure of inputs and outputs 32
33 TENSOR DECOMPOSITIONS IN DEEP LEARNING Compress and accelerate layers in Deep Learning 33
34 DEEP NETWORKS Fully connected layers take up a lot of space!
            TOTAL PARAMS   FC PARAMS     % FC LAYER
AlexNet     61,100,840     58,631,144    96%
VGG-19      143,667,240    123,642,856   86%
ResNet-50   25,557,032     2,049,000     8%
ResNet-101  44,549,160     2,049,000     4.6%
34
35 DEEP NETWORKS Compression A general tensor requires O(I^N) storage, where I is the maximum mode dimension and N is the tensor order. Exponential savings in storage and compute. 35
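A back-of-the-envelope example of the savings (my own illustrative numbers, not figures from the talk): a 4096×4096 fully connected weight stored as a rank-4 TT-matrix over 8×8×8×8 input and output modes needs only a few thousand parameters.

```python
# Dense fully connected weight vs. a TT-matrix with cores of shape
# (r_k, m_k, n_k, r_{k+1}); modes and ranks chosen for illustration.
dense = 4096 * 4096
m = n = (8, 8, 8, 8)          # 8*8*8*8 = 4096 on each side
ranks = (1, 4, 4, 4, 1)       # boundary ranks are 1
tt = sum(ranks[k] * m[k] * n[k] * ranks[k + 1] for k in range(4))
print(dense, tt, dense // tt)  # 16777216 2560 6553
```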
36 DEEP NETWORKS Observation: In CNNs (and other networks) Fully Connected Layers flatten data 36
37 CP DECOMPOSITIONS in machine learning Application in latent variable models: Single topic models; Gaussian mixture models (GMM); Latent Dirichlet allocation (LDA); Hidden Markov models (HMM). But in Deep Learning? E. Allman, C. Matias, J. Rhodes. Identifiability of parameters in latent structure models with many observed variables. Ann. Stat. 37 (2009). A. Anandkumar, R. Ge, D. Hsu, S. Kakade, M. Telgarsky. Tensor Decompositions for Learning Latent Variable Models. Journal of Machine Learning Research 15, Jan
38 CP DECOMPOSITIONS in deep learning Compact representation for matrices and tensors. Efficient application of linear algebra operations. Replace a fully connected layer with a CP decomposed layer Initialize from a trained network or randomly Fine-tune Match the modal structure of the input and output 38
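The efficiency of the factored apply is easiest to see in the simplest matrix case: with W ≈ A Bᵀ of rank R, the layer never materializes the dense W, dropping the cost from O(IJ) to O(R(I+J)). Sizes below are illustrative:

```python
import numpy as np

# Dense fully connected weight W (I x J) replaced by rank-R factors
# A (I x R), B (J x R) with W ~= A @ B.T.
rng = np.random.default_rng(0)
i, j, r = 512, 512, 16
a, b = rng.standard_normal((i, r)), rng.standard_normal((j, r))
x = rng.standard_normal(j)

y_dense = (a @ b.T) @ x     # materializes the I x J matrix: O(I*J) work
y_factored = a @ (b.T @ x)  # factored apply: O(R*(I+J)) work
print(np.allclose(y_dense, y_factored))  # True
```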
39 TCL LAYER Special case of CP Layer with R = 1 and input/output mode fusion (Figure: Fully Connected; R = 1 Matricization; TCL Application)
40 TCL LAYER AlexNet, CIFAR-100 J. Kossaifi, A. Khanna, Z. Lipton, T. Furlanello, A. Anandkumar. Tensor Contraction Layers for Parsimonious Deep Nets. IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Honolulu, HI, 2017, pp 40
41 TCL LAYER VGG-19, CIFAR-100 J. Kossaifi, A. Khanna, Z. Lipton, T. Furlanello, A. Anandkumar. Tensor Contraction Layers for Parsimonious Deep Nets. IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Honolulu, HI, 2017, pp 41
42 CP LAYER Resnet-32, CIFAR-10 X. Cao, G. Rabusseau, J. Pineau. Tensor Regression Networks with various Low-Rank Tensor Approximations. arxiv: [cs.lg] Dec
43 CP LAYER Training ImageNet 43
44 CP LAYER Initialization from Pretrained ImageNet Demonstrates initialization from existing networks to gain a large head-start in training Fine-tuning still very important 44
45 TUCKER LAYERS A nice geometric interpretation: reveal latent features in each mode. The core tensor G yields the relative importance of all combined features. $A_{i_1 i_2 \cdots i_n} = G_{r_1 r_2 \cdots r_n} C^{(1)}_{i_1 r_1} C^{(2)}_{i_2 r_2} \cdots C^{(n)}_{i_n r_n}$ Straightforward compression and application: the core tensor is of the same order. 45
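Applying a Tucker-factored weight to a multi-mode input can likewise avoid the dense matrix: project each input mode down with its factor, contract the small core, then expand the output mode. A NumPy sketch with illustrative sizes and names:

```python
import numpy as np

# Tucker-factored weight acting on a two-mode input x (8 x 9), producing a
# 10-dimensional output.
rng = np.random.default_rng(0)
g = rng.standard_normal((3, 3, 4))  # small core: two input ranks, one output rank
c1, c2, c3 = (rng.standard_normal((n, r)) for n, r in ((8, 3), (9, 3), (10, 4)))
x = rng.standard_normal((8, 9))

# Dense reference: W[(i,j),k] = sum_{pqr} G[p,q,r] C1[i,p] C2[j,q] C3[k,r]
w = np.einsum('pqr,ip,jq,kr->ijk', g, c1, c2, c3).reshape(8 * 9, 10)
y_ref = w.T @ x.reshape(-1)

# Factored apply: project input modes down, contract the core, expand the output.
small = np.einsum('ij,ip,jq->pq', x, c1, c2)
y = np.einsum('pq,pqr,kr->k', small, g, c3)
print(np.allclose(y, y_ref))  # True
```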
46 TUCKER LAYERS Replace Fully Connected Resnet-32, CIFAR-10 X. Cao, G. Rabusseau, J. Pineau. Tensor Regression Networks with various Low-Rank Tensor Approximations. arxiv: [cs.lg] Dec
47 TUCKER LAYERS Replace Fully Connected TCL (CP) J. Kossaifi, Z. Lipton, A. Khanna, T. Furlanello, A. Anandkumar. Tensor Contraction and Regression Networks. arXiv [cs.LG], Nov 2017 TRL (Tucker) 47
48 TUCKER LAYERS Performance and Compression Resnet-101 on ImageNet: Compression of the Fully Connected Layer J. Kossaifi, Z. Lipton, A. Khanna, T. Furlanello, A. Anandkumar. Tensor Contraction and Regression Networks. arXiv [cs.LG], Nov 48
49 TENSOR RING LAYERS Compression for fully connected layers and convolutional layers 49
50 TENSOR TRAIN LAYERS Replace Fully Connected Resnet-32, CIFAR-10 X. Cao, G. Rabusseau, J. Pineau. Tensor Regression Networks with various Low-Rank Tensor Approximations. arxiv: [cs.lg] Dec
51 TENSOR TRAIN LAYERS Replace Fully Connected VGG-16/19, ImageNet A. Novikov, D. Podoprikhin, A. Osokin, D. Vetrov. Tensorizing Neural Networks. arxiv: [cs.lg] Dec
52 TENSOR TRAIN LAYERS Replace Fully Connected ImageNet 52
53 TENSOR TRAIN LAYERS Replace Convolutional Resnet-like, CIFAR-10 VGG-like, CIFAR-10 T. Garipov, D. Podoprikhin, A. Novikov, D. Vetrov. Ultimate Tensorization: Compressing Convolutional and FC Layers Alike. arxiv: [cs.lg] Nov
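A tensor-train layer applies its weight in factored form: reshape the input vector into its modes and contract one TT-matrix core at a time. A two-core NumPy sketch with illustrative sizes (core layout `(r_left, out_mode, in_mode, r_right)` is my own convention):

```python
import numpy as np

rng = np.random.default_rng(0)
# W (m1*m2 x n1*n2) in TT-matrix form: G1 (1, m1, n1, r), G2 (r, m2, n2, 1)
m1, n1, m2, n2, r = 3, 4, 5, 6, 2
g1 = rng.standard_normal((1, m1, n1, r))
g2 = rng.standard_normal((r, m2, n2, 1))
x = rng.standard_normal(n1 * n2)

# Dense reference: W[(i1,i2),(j1,j2)] = sum_a G1[0,i1,j1,a] G2[a,i2,j2,0]
w = np.einsum('aijb,bklc->ikjl', g1, g2).reshape(m1 * m2, n1 * n2)
y_ref = w @ x

# TT apply: contract the cores against the input reshaped into its modes,
# never forming the dense matrix.
xt = x.reshape(n1, n2)
y = np.einsum('aijb,bklc,jl->ik', g1, g2, xt).reshape(m1 * m2)
print(np.allclose(y, y_ref))  # True
```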
54 KRONECKER LAYERS Compact representation of advanced linear operators Preservation of properties of linear operators 54
55 KRONECKER LAYERS Fully Connected Layer S. Zhou, J. Wu. Compression of Fully-Connected Layer in Neural Network by Kronecker Product. Advanced Computational Intelligence (ICACI) IEEE
56 KRONECKER LAYERS RNN Layer Kronecker product preserves unitarity Control RNN vanishing/exploding gradient problem on small Kronecker factors $A_{i_1 i_2} = C^{(1)}_{a_1 a_2} \otimes C^{(2)}_{b_1 b_2} \otimes \cdots \otimes C^{(m)}_{d_1 d_2}$ with soft unitarity regularization $L(\theta) + \lambda \sum_i \| C^{(i)} C^{(i)\mathsf{T}} - I \|_2^2$ Complex-valued factors for compact unitary set! 56
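The soft unitarity constraint on small Kronecker factors can be sketched as a penalty ‖C Cᵀ − I‖² summed over the factors. A hypothetical helper in NumPy, real-valued for simplicity (the Kronecker Recurrent Units work uses complex factors):

```python
import numpy as np

def unitarity_penalty(factors, lam):
    # Penalize each small factor's deviation from orthogonality/unitarity.
    return lam * sum(np.linalg.norm(c @ c.conj().T - np.eye(c.shape[0]))**2
                     for c in factors)

# An orthogonal factor contributes (numerically) zero penalty; a scaled one does not.
q, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((4, 4)))
print(np.isclose(unitarity_penalty([q], lam=1.0), 0.0))  # True
print(unitarity_penalty([2.0 * q], lam=1.0) > 0)         # True
```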
57 KRONECKER LAYERS RNN Layer C. Jose, M. Cisse, F. Fleuret. Kronecker Recurrent Units. arxiv: [cs.lg] Dec
Tensor Contractions with Extended BLAS Kernels on CPU and GPU
Tensor Contractions with Extended BLAS Kernels on CPU and GPU Cris Cecka Senior Research Scientist NVIDIA Research, Santa Clara, California Joint work with Yang Shi, U.N. Niranjan, and Animashree Anandkumar
More informationarxiv: v1 [cs.lg] 26 Jul 2017
Tensor Regression Networks Jean Kossaifi Amazon AI Imperial College London jean.kossaifi@imperial.ac.uk Zachary Lipton Amazon AI University of California, San Diego zlipton@cs.ucsd.edu arxiv:1707.08308v1
More informationOrthogonal tensor decomposition
Orthogonal tensor decomposition Daniel Hsu Columbia University Largely based on 2012 arxiv report Tensor decompositions for learning latent variable models, with Anandkumar, Ge, Kakade, and Telgarsky.
More informationMath 671: Tensor Train decomposition methods
Math 671: Eduardo Corona 1 1 University of Michigan at Ann Arbor December 8, 2016 Table of Contents 1 Preliminaries and goal 2 Unfolding matrices for tensorized arrays The Tensor Train decomposition 3
More informationNeural Network Approximation. Low rank, Sparsity, and Quantization Oct. 2017
Neural Network Approximation Low rank, Sparsity, and Quantization zsc@megvii.com Oct. 2017 Motivation Faster Inference Faster Training Latency critical scenarios VR/AR, UGV/UAV Saves time and energy Higher
More informationConvolutional Neural Network Architecture
Convolutional Neural Network Architecture Zhisheng Zhong Feburary 2nd, 2018 Zhisheng Zhong Convolutional Neural Network Architecture Feburary 2nd, 2018 1 / 55 Outline 1 Introduction of Convolution Motivation
More informationComputational Linear Algebra
Computational Linear Algebra PD Dr. rer. nat. habil. Ralf-Peter Mundani Computation in Engineering / BGU Scientific Computing in Computer Science / INF Winter Term 2018/19 Part 6: Some Other Stuff PD Dr.
More informationarxiv: v1 [stat.ml] 15 Dec 2017
BT-Nets: Simplifying Deep Neural Networks via Block Term Decomposition Guangxi Li 1, Jinmian Ye 1, Haiqin Yang 2, Di Chen 1, Shuicheng Yan 3, Zenglin Xu 1, 1 SMILE Lab, School of Comp. Sci. and Eng., Univ.
More informationCheng Soon Ong & Christian Walder. Canberra February June 2018
Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 Outlines Overview Introduction Linear Algebra Probability Linear Regression
More informationCVPR A New Tensor Algebra - Tutorial. July 26, 2017
CVPR 2017 A New Tensor Algebra - Tutorial Lior Horesh lhoresh@us.ibm.com Misha Kilmer misha.kilmer@tufts.edu July 26, 2017 Outline Motivation Background and notation New t-product and associated algebraic
More informationPRUNING CONVOLUTIONAL NEURAL NETWORKS. Pavlo Molchanov Stephen Tyree Tero Karras Timo Aila Jan Kautz
PRUNING CONVOLUTIONAL NEURAL NETWORKS Pavlo Molchanov Stephen Tyree Tero Karras Timo Aila Jan Kautz 2017 WHY WE CAN PRUNE CNNS? 2 WHY WE CAN PRUNE CNNS? Optimization failures : Some neurons are "dead":
More informationMagmaDNN High-Performance Data Analytics for Manycore GPUs and CPUs
MagmaDNN High-Performance Data Analytics for Manycore GPUs and CPUs Lucien Ng The Chinese University of Hong Kong Kwai Wong The Joint Institute for Computational Sciences (JICS), UTK and ORNL Azzam Haidar,
More informationPostgraduate Course Signal Processing for Big Data (MSc)
Postgraduate Course Signal Processing for Big Data (MSc) Jesús Gustavo Cuevas del Río E-mail: gustavo.cuevas@upm.es Work Phone: +34 91 549 57 00 Ext: 4039 Course Description Instructor Information Course
More informationNumerical Linear and Multilinear Algebra in Quantum Tensor Networks
Numerical Linear and Multilinear Algebra in Quantum Tensor Networks Konrad Waldherr October 20, 2013 Joint work with Thomas Huckle QCCC 2013, Prien, October 20, 2013 1 Outline Numerical (Multi-) Linear
More informationhttps://goo.gl/kfxweg KYOTO UNIVERSITY Statistical Machine Learning Theory Sparsity Hisashi Kashima kashima@i.kyoto-u.ac.jp DEPARTMENT OF INTELLIGENCE SCIENCE AND TECHNOLOGY 1 KYOTO UNIVERSITY Topics:
More informationPattern Recognition and Machine Learning
Christopher M. Bishop Pattern Recognition and Machine Learning ÖSpri inger Contents Preface Mathematical notation Contents vii xi xiii 1 Introduction 1 1.1 Example: Polynomial Curve Fitting 4 1.2 Probability
More informationDonald Goldfarb IEOR Department Columbia University UCLA Mathematics Department Distinguished Lecture Series May 17 19, 2016
Optimization for Tensor Models Donald Goldfarb IEOR Department Columbia University UCLA Mathematics Department Distinguished Lecture Series May 17 19, 2016 1 Tensors Matrix Tensor: higher-order matrix
More informationLecture 4. Tensor-Related Singular Value Decompositions. Charles F. Van Loan
From Matrix to Tensor: The Transition to Numerical Multilinear Algebra Lecture 4. Tensor-Related Singular Value Decompositions Charles F. Van Loan Cornell University The Gene Golub SIAM Summer School 2010
More informationMatrix-Tensor and Deep Learning in High Dimensional Data Analysis
Matrix-Tensor and Deep Learning in High Dimensional Data Analysis Tien D. Bui Department of Computer Science and Software Engineering Concordia University 14 th ICIAR Montréal July 5-7, 2017 Introduction
More informationMARCH 24-27, 2014 SAN JOSE, CA
MARCH 24-27, 2014 SAN JOSE, CA Sparse HPC on modern architectures Important scientific applications rely on sparse linear algebra HPCG a new benchmark proposal to complement Top500 (HPL) To solve A x =
More informationDimensionality Reduction and Principle Components Analysis
Dimensionality Reduction and Principle Components Analysis 1 Outline What is dimensionality reduction? Principle Components Analysis (PCA) Example (Bishop, ch 12) PCA vs linear regression PCA as a mixture
More informationMulti-Linear Mappings, SVD, HOSVD, and the Numerical Solution of Ill-Conditioned Tensor Least Squares Problems
Multi-Linear Mappings, SVD, HOSVD, and the Numerical Solution of Ill-Conditioned Tensor Least Squares Problems Lars Eldén Department of Mathematics, Linköping University 1 April 2005 ERCIM April 2005 Multi-Linear
More informationIdentifiability and Learning of Topic Models: Tensor Decompositions under Structural Constraints
Identifiability and Learning of Topic Models: Tensor Decompositions under Structural Constraints Anima Anandkumar U.C. Irvine Joint work with Daniel Hsu, Majid Janzamin Adel Javanmard and Sham Kakade.
More informationarxiv: v2 [cs.lg] 9 May 2018
TensorLy: Tensor Learning in Python arxiv:1610.09555v2 [cs.lg] 9 May 2018 Jean Kossaifi 1 jean.kossaifi@imperial.ac.uk Yannis Panagakis 1,2 i.panagakis@imperial.ac.uk Anima Anandkumar 3,4 anima@amazon.com
More informationarxiv: v1 [cs.lg] 31 Oct 2018
LO-RANK EMBEDDING OF KERNELS IN CONVOLUTIONAL NEURAL NETORKS UNDER RANDOM SUFFLING 1 Chao Li, 1 Zhun Sun, 1,2 Jinshi Yu, 1 Ming ou and 1 Qibin Zhao 1 RIKEN Center for Advanced Intelligence Project (AIP),
More informationTensor networks and deep learning
Tensor networks and deep learning I. Oseledets, A. Cichocki Skoltech, Moscow 26 July 2017 What is a tensor Tensor is d-dimensional array: A(i 1,, i d ) Why tensors Many objects in machine learning can
More informationChannel Pruning and Other Methods for Compressing CNN
Channel Pruning and Other Methods for Compressing CNN Yihui He Xi an Jiaotong Univ. October 12, 2017 Yihui He (Xi an Jiaotong Univ.) Channel Pruning and Other Methods for Compressing CNN October 12, 2017
More informationTruncation Strategy of Tensor Compressive Sensing for Noisy Video Sequences
Journal of Information Hiding and Multimedia Signal Processing c 2016 ISSN 207-4212 Ubiquitous International Volume 7, Number 5, September 2016 Truncation Strategy of Tensor Compressive Sensing for Noisy
More informationHigher-Order Singular Value Decomposition (HOSVD) for structured tensors
Higher-Order Singular Value Decomposition (HOSVD) for structured tensors Definition and applications Rémy Boyer Laboratoire des Signaux et Système (L2S) Université Paris-Sud XI GDR ISIS, January 16, 2012
More informationMatrix-Product-States/ Tensor-Trains
/ Tensor-Trains November 22, 2016 / Tensor-Trains 1 Matrices What Can We Do With Matrices? Tensors What Can We Do With Tensors? Diagrammatic Notation 2 Singular-Value-Decomposition 3 Curse of Dimensionality
More informationMore on Neural Networks
More on Neural Networks Yujia Yan Fall 2018 Outline Linear Regression y = Wx + b (1) Linear Regression y = Wx + b (1) Polynomial Regression y = Wφ(x) + b (2) where φ(x) gives the polynomial basis, e.g.,
Three right directions and three wrong directions for tensor research Michael W. Mahoney Stanford University (for more info, see http://cs.stanford.edu/people/mmahoney/ or Google on Michael Mahoney)
Faster Machine Learning via Low-Precision Communication & Computation Dan Alistarh (IST Austria & ETH Zurich), Hantian Zhang (ETH Zurich) 2 How many bits do you need to represent a single number in machine
Non-convex Robust PCA: Provable Bounds Anima Anandkumar U.C. Irvine Joint work with Praneeth Netrapalli, U.N. Niranjan, Prateek Jain and Sujay Sanghavi. Learning with Big Data High Dimensional Regime Missing
Lecture 1: Introduction to low-rank tensor representation/approximation Alexander Litvinenko, Center for Uncertainty Quantification, KAUST http://sri-uq.kaust.edu.sa/ Figure: KAUST campus, 5 years old, approx. 7000 people (include 1400
Sajid Anwar, Kyuyeon Hwang and Wonyong Sung Department of Electrical and Computer Engineering Seoul National University Seoul, 08826 Korea Email: sajid@dsp.snu.ac.kr, khwang@dsp.snu.ac.kr, wysung@snu.ac.kr
Tensor Decompositions for Machine Learning G. Roeder Department of Computer Science University of British Columbia UBC Machine Learning Reading Group, June 15 2016 Contact information
Math 671: Tensor Train decomposition methods II Eduardo Corona 1 1 University of Michigan at Ann Arbor December 13, 2016 Table of Contents 1 What we've talked about so far: 2 The Tensor Train decomposition
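The Tensor Train decomposition referenced above factors a d-way array into a chain of 3-way cores via sequential SVDs. A hedged sketch of the standard TT-SVD recipe (the helper name `tt_svd` is our own, not from the course notes):

```python
import numpy as np

def tt_svd(X, eps=1e-12):
    """TT-SVD: factor a d-way tensor into a train of 3-way cores."""
    dims = X.shape
    cores, r = [], 1
    C = X.reshape(r * dims[0], -1)
    for n in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        rank = max(1, int(np.sum(s > eps * s[0])))   # truncation rank
        cores.append(U[:, :rank].reshape(r, dims[n], rank))
        # Push the remaining factor to the right and refold for the next mode.
        C = (s[:rank, None] * Vt[:rank]).reshape(rank * dims[n + 1], -1)
        r = rank
    cores.append(C.reshape(r, dims[-1], 1))
    return cores

X = np.random.default_rng(1).standard_normal((2, 3, 4))
cores = tt_svd(X)
# Contract the train back together to verify the factorization.
Y = cores[0]
for G in cores[1:]:
    Y = np.tensordot(Y, G, axes=([-1], [0]))
assert np.allclose(X, Y.reshape(X.shape))
```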
Coupled Matrix/Tensor Decompositions: An Introduction Laurent Sorber Mikael Sørensen Marc Van Barel Lieven De Lathauwer KU Leuven Belgium Lieven.DeLathauwer@kuleuven-kulak.be 1 Canonical Polyadic Decomposition
A new truncation strategy for the higher-order singular value decomposition Nick Vannieuwenhoven K.U.Leuven, Belgium Workshop on Matrix Equations and Tensor Techniques RWTH Aachen, Germany November 21,
ECE 598: Representation Learning: Algorithms and Models Fall 2017 Lecture 1: Tensor Methods in Machine Learning Lecturer: Pramod Viswanathan Scribe: Bharath V Raghavan, Oct 3, 2017 11 Introduction Tensors
Exploring the Granularity of Sparsity in Convolutional Neural Networks Anonymous TMCV submission Abstract Sparsity helps reducing the computation complexity of DNNs by skipping the multiplication with
Dictionary Learning Using Tensor Methods Anima Anandkumar U.C. Irvine Joint work with Rong Ge, Majid Janzamin and Furong Huang. Feature learning as cornerstone of ML ML Practice Feature learning as cornerstone
Compressed Sensing and Neural Networks Jan Vybíral (Charles University & Czech Technical University Prague, Czech Republic) NOMAD Summer Berlin, September 25-29, 2017 Outline Lasso & Introduction Notation Training the network Applications
Image Registration Lecture 2: Vectors and Matrices Prof. Charlene Tsai Lecture Overview Vectors Matrices Basics Orthogonal matrices Singular Value Decomposition (SVD) 2 1 Preliminary Comments Some of this
BOLTZMANN MACHINES Generative Models Graphical Models A graph contains a set of nodes (vertices) connected by links (edges or arcs) In a probabilistic graphical model, each node represents a random variable,
Very Deep Residual Networks with Maxout for Plant Identification in the Wild Milan Šulc, Dmytro Mishkin, Jiří Matas Center for Machine Perception Department of Cybernetics Faculty of Electrical Engineering
Accelerating linear algebra computations with hybrid GPU-multicore systems. Marc Baboulin INRIA/Université Paris-Sud joint work with Jack Dongarra (University of Tennessee and Oak Ridge National Laboratory)
Extension of the Semi-Algebraic Framework for Approximate CP Decompositions via Simultaneous Matrix Diagonalization to the Efficient Calculation of Coupled CP Decompositions Kristina Naskovska and Martin
BIL722 - Deep Learning for Computer Vision Spatial Transformer Networks Max Jaderberg Andrew Zisserman Karen Simonyan Koray Kavukcuoglu Contents Introduction to Spatial Transformers Related Works Spatial
Linear Algebra & Geometry why is linear algebra useful in computer vision? References: -Any book on linear algebra! -[HZ] chapters 2, 4 Some of the slides in this lecture are courtesy to Prof. Octavia
A New Generation of Brain-Computer Interfaces Driven by Discovery of Latent EEG-fMRI Linkages Using Tensor Decomposition Gopikrishna Deshpande AU MRI Research Center AU Department of Electrical and Computer
Numerical tensor methods and their applications 8 May 2013 All lectures 4 lectures, 2 May, 08:00-10:00: Introduction: ideas, matrix results, history. 7 May, 08:00-10:00: Novel tensor formats (TT, HT, QTT).
High Performance Parallel Tucker Decomposition of Sparse Tensors Oguz Kaya INRIA and LIP, ENS Lyon, France SIAM PP 16, April 14, 2016, Paris, France Joint work with: Bora Uçar, CNRS and LIP, ENS Lyon,
Wafer Pattern Recognition Using Tucker Decomposition Ahmed Wahba, Li-C. Wang, Zheng Zhang UC Santa Barbara Nik Sumikawa NXP Semiconductors Abstract In production test data analytics, it is often that an
<Special Topics in VLSI> Learning for Deep Neural Networks (Back-propagation) Outline Summary of Previous Stanford Lecture Universal Approximation Theorem Inference vs Training Gradient Descent Back-Propagation
An Introduction to Hierarchical (H) Rank and TT Rank of Tensors with Examples Lars Grasedyck and Wolfgang Hackbusch Bericht Nr. 329 August 2011 Key words: MSC: hierarchical Tucker tensor rank tensor approximation
Sparse and Low-Rank Tensor Recovery via Cubic-Sketching Guang Cheng Department of Statistics Purdue University www.science.purdue.edu/bigdata CCAM@Purdue Math Oct. 27, 2017 Joint work with Botao Hao and Anru
Randomized Algorithms Sanjiv Kumar, Google Research, NY EECS-6898, Columbia University - Fall, 2010 Sanjiv Kumar 9/13/2010 EECS6898 Large Scale Machine Learning 1 Curse of Dimensionality Gaussian Mixture Models
CSC321 Lecture 16: ResNets and Attention Roger Grosse Roger Grosse CSC321 Lecture 16: ResNets and Attention 1 / 24 Overview Two topics for today: Topic 1: Deep Residual Networks (ResNets) This is the state-of-the
Deep Residual Network and Its Variations Diyu Yang (Originally prepared by Kaiming He from Microsoft Research) Advantages of Depth Degradation Problem Possible Causes? Vanishing/Exploding Gradients. Overfitting
Parallel Tensor Compression for Large-Scale Scientific Data Woody Austin, Grey Ballard, Tamara G. Kolda April 14, 2016 SIAM Conference on Parallel Processing for Scientific Computing MS 44/52: Parallel
Solving PDEs with CUDA Jonathan Cohen jocohen@nvidia.com NVIDIA Research PDEs (Partial Differential Equations) Big topic Some common strategies Focus on one type of PDE in this talk Poisson Equation Linear
This work has been submitted to ChesterRep, the University of Chester's online research repository: http://chesterrep.openrepository.com Author(s): Daniel Tock Title: Tensor decomposition and its applications
Lect. 4. Toward MLA in tensor-product formats B. Khoromskij, Leipzig 2007(L4) 1 Contents of Lecture 4 1. Structured representation of high-order tensors revisited. - Tucker model. - Canonical (PARAFAC)
EECS 275 Matrix Computation Ming-Hsuan Yang Electrical Engineering and Computer Science University of California at Merced Merced, CA 95344 http://faculty.ucmerced.edu/mhyang Lecture 22 1 / 21 Overview
1 Brief Introduction of Machine Learning Techniques for Content Analysis Wei-Ta Chu 2008/11/20 Outline 2 Overview Gaussian Mixture Model (GMM) Hidden Markov Model (HMM) Support Vector Machine (SVM) Overview
Yiqing Yang UW Madison breakds@cs.wisc.edu A Multi-Affine Model for Tensor Decomposition Hongrui Jiang UW Madison hongrui@engr.wisc.edu Li Zhang UW Madison lizhang@cs.wisc.edu Chris J. Murphy UC Davis
Chapter 3 Fundamentals of Multilinear Subspace Learning The previous chapter covered background materials on linear subspace learning. From this chapter on, we shall proceed to multiple dimensions with
J. Leskovec, A. Rajaraman, J. Ullman: Mining of Massive Datasets, http://www.mmds.org 1 CS60021: Scalable Data Mining Dimensionality Reduction Sourangshu Bhattacharya Assumption: Data lies on or near a
STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 11 Project
Index. Copyright (c) 2007 The Society for Industrial and Applied Mathematics. From: Matrix Methods in Data Mining and Pattern Recognition, by Lars Eldén. 1-norm, 15 matrix, 17 vector, 15 2-norm, 15, 59 matrix, 17 vector, 15 3-mode array, 91 absolute error, 15 adjacency matrix, 158 Aitken extrapolation, 157 algebra, multi-linear, 91 all-orthogonality,
Structured Matrix Computations from Structured Tensors Lecture 4. CP and KSVD Representations Charles F. Van Loan Cornell University CIME-EMS Summer School June 22-26, 2015 Cetraro, Italy Structured Matrix
Numerical Linear Algebra Primer Ryan Tibshirani Convex Optimization 10-725/36-725 Last time: proximal gradient descent Consider the problem min g(x) + h(x) with g, h convex, g differentiable, and h simple
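The primer's snippet sets up min g(x) + h(x) with g differentiable and h simple. As a worked illustration (our own example, not taken from the primer), proximal gradient descent on the lasso alternates a gradient step on g with soft-thresholding, the prox of h:

```python
import numpy as np

# min_x g(x) + h(x) with g(x) = 0.5 * ||Ax - b||^2 (smooth) and
# h(x) = lam * ||x||_1 (simple: its prox is soft-thresholding).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
lam = 0.1
t = 1.0 / np.linalg.norm(A, 2) ** 2          # step size 1/L, L = ||A||_2^2

x = np.zeros(10)
for _ in range(500):
    z = x - t * (A.T @ (A @ x - b))          # gradient step on g
    x = np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)  # prox step on h

obj = 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))
obj0 = 0.5 * np.sum(b ** 2)                  # objective at the starting point x = 0
assert obj < obj0                            # monotone descent with step 1/L
```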
From Matrix to Tensor Charles F. Van Loan Department of Computer Science January 28, 2016 What is a Tensor? Instead of just A(i, j) it's A(i, j, k) or
Frequency-Domain Dynamic Pruning for Convolutional Neural Networks Zhenhua Liu 1, Jizheng Xu 2, Xiulian Peng 2, Ruiqin Xiong 1 1 Institute of Digital Media, School of Electronic Engineering and Computer
Machine Learning Michaelmas 2017 James Worrell Principal Component Analysis 1 Introduction 1.1 Goals of PCA Principal components analysis (PCA) is a dimensionality reduction technique that can be used
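As the snippet above notes, PCA is a dimensionality reduction technique. A compact sketch (illustrative data; NumPy assumed) computing it via the SVD of the centered data matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 5))  # correlated data

# Center the data; the right singular vectors give the principal axes.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2
Z = Xc @ Vt[:k].T                      # scores: projection onto top-k components
X_hat = Z @ Vt[:k] + X.mean(axis=0)    # rank-k reconstruction of X

# Fraction of variance explained by the top-k components.
evr = (s[:k] ** 2).sum() / (s ** 2).sum()
assert 0 < evr <= 1
```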
TUTORIAL PART 1 Unsupervised Learning Marc'Aurelio Ranzato Department of Computer Science Univ. of Toronto ranzato@cs.toronto.edu Co-organizers: Honglak Lee, Yoshua Bengio, Geoff Hinton, Yann LeCun, Andrew
FreezeOut: Accelerate Training by Progressively Freezing Layers Andrew Brock, Theodore Lim, & J.M. Ritchie School of Engineering and Physical Sciences Heriot-Watt University Edinburgh, UK {ajb5, t.lim,
Machine Learning with Quantum-Inspired Tensor Networks E.M. Stoudenmire and David J. Schwab Advances in Neural Information Processing 29 arxiv:1605.05775 RIKEN AICS - Mar 2017 Collaboration with David
Nonparametric regression using deep neural networks with ReLU activation function Johannes Schmidt-Hieber February 2018 Caltech 1 / 20 Many impressive results in applications... Lack of theoretical understanding...
Math 350: An exploration of HMMs through doodles. Joshua Little (407673) 19 December 2012 1 Background 1.1 Hidden Markov models. Markov chains (MCs) work well for modelling discrete-time processes, or
Numerical Linear Algebra Primer Ryan Tibshirani Convex Optimization 10-725 Consider Last time: proximal Newton method min x g(x) + h(x) where g, h convex, g twice differentiable, and h simple. Proximal
Approximate Principal Components Analysis of Large Data Sets Daniel J. McDonald Department of Statistics Indiana University mypage.iu.edu/~dajmcdon April 27, 2016 Approximation-Regularization for Analysis
Machine Learning with Tensor Networks E.M. Stoudenmire and David J. Schwab Advances in Neural Information Processing 29 arxiv:1605.05775 Beijing Jun 2017 Machine learning has physics in its DNA # " # #
When are Overcomplete Topic Models Identifiable? Uniqueness of Tensor Tucker Decompositions with Structured Sparsity arxiv:1308.2853v1 [cs.lg] 13 Aug 2013 Animashree Anandkumar, Daniel Hsu, Majid Janzamin
Computing least squares condition numbers on hybrid multicore/gpu systems M. Baboulin and J. Dongarra and R. Lacroix Abstract This paper presents an efficient computation for least squares conditioning
BlockMatrixComputations and the Singular Value Decomposition ATaleofTwoIdeas Charles F. Van Loan Department of Computer Science Cornell University Supported in part by the NSF contract CCR-9901988. Block
Linear Algebra (Review) Volker Tresp 2017 1 Vectors k is a scalar (a number) c is a column vector. Thus in two dimensions, c = ( c1 c 2 ) (Advanced: More precisely, a vector is defined in a vector space.
Unsupervised Machine Learning and Data Mining DS 5230 / DS 4420 - Fall 2018 Lecture 7 Jan-Willem van de Meent DIMENSIONALITY REDUCTION Borrowing from: Percy Liang (Stanford) Dimensionality Reduction Goal:
TETRIS: TilE-matching the TRemendous Irregular Sparsity Yu Ji 1,2,3 Ling Liang 3 Lei Deng 3 Youyang Zhang 1 Youhui Zhang 1,2 Yuan Xie 3 {jiy15,zhang-yy15}@mails.tsinghua.edu.cn,zyh02@tsinghua.edu.cn 1
Deep learning / Ian Goodfellow, Yoshua Bengio and Aaron Courville. - Cambridge, MA ; London, 2017 Table of contents Website Acknowledgments Notation xiii xv xix 1 Introduction 1 1.1 Who Should Read This Book?
Probabilistic Time Series Classification Y. Cem Sübakan Boğaziçi University 25.06.2013 Y. Cem Sübakan (Boğaziçi University) M.Sc. Thesis Defense 25.06.2013 1 / 54 Problem Statement The goal is to assign
Nonlinear Models Numerical Methods for Deep Learning Lars Ruthotto Departments of Mathematics and Computer Science, Emory University Intro 1 Course Overview Intro 2 Course Overview Lecture 1: Linear Models
Linear Regression Regression Goal: Learn a mapping from observations (features) to continuous labels given a training set (supervised learning) Example: Height, Gender, Weight Shoe Size Audio features
Quantisation Efficient implementation of convolutional neural networks Philip Leong Computer Engineering Lab The University of Sydney July 2018 / PAPAA Workshop Australia Outline 1 Introduction
Åke Björck Numerical Methods in Matrix Computations Springer Contents 1 Direct Methods for Linear Systems 1 1.1 Elements of Matrix Theory 1 1.1.1 Matrix Algebra 2 1.1.2 Vector Spaces 6 1.1.3 Submatrices
Learning Deep Architectures for AI - Yoshua Bengio Part II - Vijay Chakilam Limitations of Perceptron [figure: a single-layer perceptron over inputs x1, x2 with weights W, b; no weight plane separates output = 1 from output = 0] There is no value for W and b such that the model