Efficient inference of interactions from non-equilibrium data and application to multi-electrode neural recordings


Efficient inference of interactions from non-equilibrium data and application to multi-electrode neural recordings
ESR (1); Supervisor: Dr. Yasser Roudi (1, 2)
(1) Kavli Institute for Systems Neuroscience, Trondheim (NO); (2) NORDITA, Stockholm (SE)
NETADIS MID-TERM REVIEW, Institut Henri Poincaré, Paris, 23rd January 2014

Table of Contents
1. My background
2. Training experiences
3. My work so far
4. Future developments and expectations

My background
July 2012: Master's degree in theoretical physics (University of Trieste). Master's thesis: "Criticality of models inferred in Boltzmann learning", supervised by Dr. Matteo Marsili.

My present
September 2012: started as NETADIS ESR at the Kavli Institute for Systems Neuroscience, Trondheim, Norway.
October 2012: enrolled as PhD student in Neuroscience at NTNU, Trondheim, Norway.

Training experiences
2012
- October: Random Matrix Theory, by Pierpaolo Vivo (Kavli Institute, Trondheim)
- 11/26-12/07: Winter School on Quantitative Systems Biology (ICTP, Trieste)
2013
- 02/03-02/06: NETADIS First Scientific Meeting (Torino)
- 02/13-02/22: Spin Glasses, by Adriano Barra (NORDITA, Stockholm)
- 03/07-03/15: Stochastic Thermodynamics workshop (NORDITA, Stockholm)
- 04/03-04/05: SMED8005 Communication of Science (Kavli Institute, Trondheim)
- 05/20-06/14: Spring College on Physics of Complex Systems (ICTP, Trieste)
- 08/09-08/19: Large Deviations, by Prof. Kamiar Rahnama Rad (Kavli Institute, Trondheim)
- 09/08-09/22: NETADIS Summer School 2013 (Hillerød, Denmark)
- 10/23-12/20: Secondment at TUB, in Prof. Opper's group (Berlin)

My work so far
1. Path integral methods: one-loop correction to the Laplace approximation for systems out of equilibrium, in collaboration with Prof. K. Rahnama Rad (Columbia University, NY). Dynamical TAP equations (Plefka expansion):
$$m_i(t) = \tanh\Big[\sum_j J_{ij}\, m_j(t-1) \;-\; m_i(t)\sum_j J_{ij}^2\,\big(1 - m_j(t-1)^2\big)\Big]$$
2. Gaussian average method for the kinetic Ising model
3. Variational factorizable approximation for the kinetic Ising model
(under the supervision of Prof. M. Opper & Dr. Y. Roudi)
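To make the TAP step concrete, here is a minimal Python sketch (not from the slides): since $m_i(t)$ appears on both sides of the equation, the update is iterated to self-consistency starting from the naive mean-field value. The function name and the toy network are illustrative assumptions.

```python
import numpy as np

def tap_forward(J, m_prev, h=None, n_iter=100, tol=1e-10):
    """One time step of the dynamical TAP (Plefka) equations for the
    kinetic Ising model. m(t) appears on both sides of the update,
    so we iterate to a fixed point starting from the nMF value."""
    if h is None:
        h = np.zeros_like(m_prev)
    field = h + J @ m_prev                    # naive mean-field input
    onsager = (J ** 2) @ (1.0 - m_prev ** 2)  # Onsager reaction coefficient
    m = np.tanh(field)                        # nMF initialization
    for _ in range(n_iter):
        m_new = np.tanh(field - m * onsager)  # TAP-corrected update
        if np.max(np.abs(m_new - m)) < tol:
            return m_new
        m = m_new
    return m

# toy usage on a random asymmetric network
rng = np.random.default_rng(0)
N = 20
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
m_next = tap_forward(J, m_prev=rng.uniform(-0.5, 0.5, size=N))
```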

My work so far
Variational approximation: approximate the model distribution P(σ) within a parametric ensemble Q(σ, θ); the optimal approximate distribution Q(σ, θ*) (tractable!) is selected by KL-divergence minimization.
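Written out (assuming the usual variational direction, minimizing KL(Q‖P) over the parameters θ; the slide states the scheme in words only):

```latex
\theta^{*} \;=\; \operatorname*{arg\,min}_{\theta}\;
D_{\mathrm{KL}}\!\big(Q(\sigma,\theta)\,\big\|\,P(\sigma)\big)
\;=\; \operatorname*{arg\,min}_{\theta}\;
\sum_{\sigma} Q(\sigma,\theta)\,\ln\frac{Q(\sigma,\theta)}{P(\sigma)} .
```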

My work so far
Gaussian average method for the kinetic Ising model: the local field acting on each spin is averaged over a Gaussian whose statistics retain the correlations between $J_{ij}$ and $J_{ji}$.

Mezard's MF:
$$m_i(t) = \int \! Dx \;\tanh\Big[\sum_j J_{ij}\, m_j(t-1) + x\sqrt{\Delta_i(t-1)}\Big], \qquad \Delta_i(t) = \sum_j J_{ij}^2\,\big(1 - m_j(t)^2\big)$$

TAP:
$$m_i(t) = \tanh\Big[\sum_j J_{ij}\, m_j(t-1) - m_i(t)\sum_j J_{ij}^2\,\big(1 - m_j(t-1)^2\big)\Big]$$

[Figure: comparison of the approximations at asymmetry parameter k = 0.8]
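A minimal numerical sketch of one Gaussian-average update of this kind (Mezard-style mean field), assuming the one-dimensional Gaussian measure $Dx$ is evaluated by standard Gauss-Hermite quadrature; this is an illustrative reconstruction, not project code.

```python
import numpy as np

def gaussian_mf_forward(J, m_prev, h=None, n_quad=40):
    """One step of the Gaussian mean-field update for the kinetic Ising
    model: the local field is treated as Gaussian with mean J @ m(t-1)
    and variance Delta_i(t-1) = sum_j J_ij^2 (1 - m_j(t-1)^2)."""
    if h is None:
        h = np.zeros_like(m_prev)
    mean = h + J @ m_prev
    var = (J ** 2) @ (1.0 - m_prev ** 2)
    # Gauss-Hermite nodes/weights for weight exp(-x^2); rescale so that
    # E_{z~N(0,1)}[f(z)] = (1/sqrt(pi)) * sum_k w_k f(sqrt(2) * x_k)
    x, w = np.polynomial.hermite.hermgauss(n_quad)
    z = np.sqrt(2.0) * x
    g = np.tanh(mean[:, None] + np.sqrt(var)[:, None] * z[None, :])
    return (g @ w) / np.sqrt(np.pi)
```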

My work so far
Variational factorizable approximation for the kinetic Ising model.
For the equilibrium Ising model, the saddle-point approximation and the variational factorizable approximation both lead to the naive mean-field equations. For the kinetic Ising model they differ.

Saddle-point approximation (nMF):
$$m_i(t) = \tanh\Big[\sum_j J_{ij}\, m_j(t-1)\Big]$$

Variational factorizable approximation:
$$m_i(t) = \tanh\Big[\sum_k J_{ik}\, m_k(t-1) + \sum_k J_{ki}\, m_k(t+1) - \sum_{j\neq i}\tanh^{-1}\Big\langle \tanh\Big[\sum_{k\neq i} J_{jk}\,\sigma_k(t)\Big]\tanh\big(J_{ji}\big)\Big\rangle_Q\Big]$$
where $Q(\sigma)$ is the factorizable distribution.

Why poorer predictions? $D_f(m, H) \geq D_{nMF}(m, H)$.
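These forward equations feed the inverse problem of the title: inferring the couplings from recorded spins. As an illustration (a sketch under standard assumptions, not the project's code), the nMF reconstruction $J \approx A^{-1} D C^{-1}$ can be applied to synchronous Glauber data, with $D$ the one-step-lagged covariance, $C$ the equal-time covariance, and $A = \mathrm{diag}(1 - m_i^2)$:

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 30, 100_000

# ground truth: random asymmetric couplings, synchronous Glauber dynamics
J_true = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
s = np.empty((T, N))
s[0] = rng.choice([-1.0, 1.0], size=N)
for t in range(T - 1):
    p_up = 0.5 * (1.0 + np.tanh(J_true @ s[t]))   # P(sigma_i(t+1) = +1)
    s[t + 1] = np.where(rng.random(N) < p_up, 1.0, -1.0)

# naive mean-field reconstruction: J = A^{-1} D C^{-1}
m = s.mean(axis=0)
ds = s - m
C = ds[:-1].T @ ds[:-1] / (T - 1)                 # equal-time covariance
D = ds[1:].T @ ds[:-1] / (T - 1)                  # one-step-lagged covariance
J_nmf = np.diag(1.0 / (1.0 - m ** 2)) @ D @ np.linalg.inv(C)

print("RMS reconstruction error:", np.sqrt(np.mean((J_nmf - J_true) ** 2)))
```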

My work so far
[Figure: results for asymmetry parameter k = 0, k = 0.5, and k = 1]

Future developments and expectations
- February 2014: Dilute systems: the Bethe approximation for the kinetic Ising model, supervised by Prof. John Hertz (NORDITA, Copenhagen) & Dr. Y. Roudi
- March 2014: Deep learning in networks with hidden nodes, in collaboration with Benjamin Dunn
- Summer 2014: Learning the connectivity of the brain: performance of the approximations developed so far on data available at the Kavli Institute for Systems Neuroscience
- Dates to be decided: Secondment in London in Prof. Sollich's group


Thanks!