1 The Bayesian Brain Robert Jacobs Department of Brain & Cognitive Sciences University of Rochester May 11, 2017

2 Bayesian Brain How do neurons represent the states of the world? How do neurons represent likelihood functions? How do neurons use these representations to calculate posterior distributions?

3 Biology of a Single Neuron Think of an individual neuron as an input-output device. Neurons transmit electrical impulses (action potentials; aka spikes) away from their cell bodies along output structures (axons). When a spike reaches the end of an axon (axon terminal), it induces the release of neurotransmitter molecules that diffuse across a narrow synaptic cleft and bind to receptors on the input structures (dendrites) of the postsynaptic neuron

4 The effect of the transmitter released by a given input neuron may be inhibitory (reducing the probability that the postsynaptic neuron will fire spikes of its own) or excitatory (increasing the probability that the postsynaptic neuron will fire), depending on the transmitter released and on the receptor that binds it. Each neuron receives inputs from a large number of other (presynaptic) neurons. Each neuron sends outputs to a large number of other (postsynaptic) neurons

5 [figure]

6 [figure]

7 [figure]

8 Each input neuron fires a number of spikes over a relevant time interval. These result in neurotransmitter release onto our neuron of interest. This postsynaptic neuron integrates these inputs and produces an output spike train of its own. We will focus on the number of output spikes in a time interval, either in response to a stimulus or in response to the numbers of input spikes it receives from other neurons

9 Neural Coding Let s denote the sensory stimulus. Special neurons (sensory receptors) convert the stimulus into a pattern of spikes. For example, photoreceptors are neurons that convert a pattern of light falling on the back of the retina into a pattern of spikes. Each individual photoreceptor responds to the amount of light falling on a very small region of the retina

10 Neural Coding Important: Even when the stimulus s is held constant, the pattern of spiking activities in any set of neurons will be variable. Goal of neural coding: Characterize the (stochastic) mapping from stimulus s to neural responses r. Because of the stochastic nature of this mapping, it is characterized as a probability distribution, denoted p(r | s). We do not currently understand why this mapping is stochastic. That is, we do not understand the noise factors contributing to the stochastic nature of neural responses

11 Neural Tuning Curve The tuning curve of a neuron describes its mean response to stimulus s. For example, let the stimulus be an oriented Gabor: If the orientation is 90° (s = 90°), the neuron may emit (on average) 10 spikes (r = 10). If s = 100°, then r = 40. If s = 110°, then r = 80. If s = 120°, then r = 40. If s = 130°, then r = 10. On average, this neuron is most activated by a Gabor whose orientation is 110° (i.e., s_pref = 110°)

12 [figure]

13 [figure]

14 [figure]

15 [figure]

16 [figure]

17 Example: Bell-Shaped Tuning Curve In primary visual cortex (area V1), neurons are tuned to the orientation of a visual stimulus. The tuning curve is typically unimodal and symmetric around the preferred orientation. Von Mises (circular Gaussian) function: f_i(s) = g e^{κ(cos(s − s_i) − 1)} + b, where b is an offset (baseline level of activity), g is a gain, s_i is neuron i's preferred orientation, and κ is a concentration parameter (the higher κ, the more narrowly tuned the neuron)

18 Example: Monotonic Tuning Curve Simplest form is a rectified linear function: f_i(s) = [g s + b]_+, where g is positive for monotonically increasing curves and negative for monotonically decreasing curves. Problem: the function is unbounded (may need to place a ceiling on the function value)
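
To make these two families concrete, here is a minimal NumPy sketch (not from the slides; all parameter values are arbitrary illustrative choices):

    import numpy as np

    def von_mises_tuning(s, s_i, g=10.0, kappa=2.0, b=1.0):
        # f_i(s) = g * exp(kappa * (cos(s - s_i) - 1)) + b, with s in radians
        return g * np.exp(kappa * (np.cos(s - s_i) - 1.0)) + b

    def rectified_linear_tuning(s, g=0.5, b=-2.0, ceiling=50.0):
        # f_i(s) = [g*s + b]_+ with an added ceiling to keep the response bounded
        return np.minimum(np.maximum(g * s + b, 0.0), ceiling)

    print(von_mises_tuning(np.linspace(-np.pi, np.pi, 5), s_i=0.0))
    print(rectified_linear_tuning(np.linspace(0.0, 200.0, 5)))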

19 Neural Variability

20 Example: Poisson Variability Need to characterize variability in a neuron's spike count (a non-negative integer). Mean spike count is f_i(s) (not necessarily an integer). On any given trial, the actual spike count will vary around f_i(s). For every possible count r_i, we seek its probability: p(r_i | s) = (1/r_i!) e^{−f_i(s)} f_i(s)^{r_i}. Note: mean = variance = f_i(s)
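
A quick NumPy check of the mean = variance property (a sketch; the mean count 7.3 is an arbitrary choice):

    import numpy as np

    rng = np.random.default_rng(0)
    f_s = 7.3                                # hypothetical mean spike count f_i(s)
    counts = rng.poisson(f_s, size=100_000)  # simulated spike counts across trials
    print(counts.mean(), counts.var())       # both approximate f_i(s) = 7.3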

21 [figure]

22 Population Codes Neural populations are groups of neurons that are all selective for a particular stimulus feature (e.g., orientation). They typically have similarly-shaped tuning curves but with different preferred stimulus values. A population code is the vector of stimulus-evoked firing rates of the neurons in a neural population (denoted r = [r_1, ..., r_N]^T). Neural coding: we need to characterize p(r | s)

23 [figure]

24 Independent Poisson Variability in a Population Assume: For a given stimulus, neuron responses are independent: p(r | s) = Π_{i=1}^{N} p(r_i | s). If we also assume Poisson variability, then: p(r | s) = Π_{i=1}^{N} (1/r_i!) e^{−f_i(s)} f_i(s)^{r_i}
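
A minimal sketch of this product using SciPy's Poisson pmf; the mean responses and observed counts below are made-up illustrative values:

    import numpy as np
    from scipy.stats import poisson

    f = np.array([0.5, 2.0, 8.0, 2.0, 0.5])  # hypothetical mean responses f_i(s)
    r = np.array([1, 3, 6, 2, 0])            # hypothetical observed spike counts r_i
    print(np.prod(poisson.pmf(r, f)))        # p(r|s): product over independent neurons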

25 Problem (Problem 10.3 from draft of textbook by Ma, Körding, and Goldreich) We assume a population of 9 independent Poisson neurons with Gaussian tuning curves and preferred orientations from −40° to 40° in steps of 10°. The tuning curve parameters have values g = 10, b = 0, and σ_tc = 20. A stimulus s = 0° is presented to this population. What is the probability that all neurons stay silent?

26 Solution Gaussian tuning curve: f_i(s) = g (1/(√(2π) σ_tc)) exp[−(s − s_i)²/(2σ_tc²)] + b. Thus p(r = 0 | s = 0°) = Π_{i=1}^{9} e^{−f_i(s)} = e^{−f_1(s) − f_2(s) − ⋯ − f_9(s)}
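
A sketch of the numerical answer, assuming the normalized Gaussian tuning curve written above (parameter values taken from the problem statement):

    import numpy as np

    g, b, sigma_tc = 10.0, 0.0, 20.0
    s_pref = np.arange(-40.0, 50.0, 10.0)  # nine preferred orientations, -40 to 40
    s = 0.0
    # Gaussian tuning curve as written on this slide (with the normalization constant)
    f = g / (np.sqrt(2.0 * np.pi) * sigma_tc) \
        * np.exp(-(s - s_pref) ** 2 / (2.0 * sigma_tc ** 2)) + b
    print(np.exp(-f.sum()))                # probability that all nine neurons are silent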

27 Problem (modified from Problem 10.5 from draft of textbook by Ma, Körding, and Goldreich) In a population of 1000 independent neurons with Gaussian tuning curves (let g = 10, b = 0, and σ_tc = 20), examine the claim that Σ_i f_i(s) is more or less independent of s. Given certain assumptions, Σ_i f_i(s) is, in fact, independent of s. We will use this important fact later in this lecture.
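
A simulation sketch of this constant-sum claim; the preferred values below are an illustrative dense, even spacing (s must stay well inside the range they cover):

    import numpy as np

    g, sigma_tc = 10.0, 20.0
    s_pref = np.linspace(-500.0, 500.0, 1000)  # 1000 evenly spaced preferred values
    for s in (-30.0, 0.0, 17.0, 42.0):
        f = g * np.exp(-(s - s_pref) ** 2 / (2.0 * sigma_tc ** 2))
        print(s, round(f.sum(), 2))            # sums are nearly identical across s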

28 [figure]

29 [figure: sum of the neurons' mean responses plotted as a function of the stimulus]

30 Problem (modified from a problem from draft of textbook by Ma, Körding, and Goldreich) Define 1000 time points. At each time point, determine whether a spike is fired by generating a random number according to a Poisson distribution (use the poissrnd function in Matlab) that leads to a yes probability of 0.0032 (this corresponds to a mean of 3.2 spikes over all 1000 time points). Count the total number of spikes generated. Repeat for 10,000 trials. Plot a histogram of spike counts and compare to Figure 10.3a.
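
The problem suggests Matlab's poissrnd; an equivalent NumPy sketch (np.random.Generator.poisson in place of poissrnd):

    import numpy as np

    rng = np.random.default_rng(1)
    n_bins, n_trials = 1000, 10_000
    p_spike = 3.2 / n_bins                   # per-bin rate, 0.0032
    spikes = rng.poisson(p_spike, size=(n_trials, n_bins))  # Poisson draw per time point
    counts = spikes.sum(axis=1)              # total spike count on each trial
    print(counts.mean())                     # close to 3.2
    print(np.bincount(counts)[:10])          # histogram of counts: 0, 1, 2, ... spikes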

31 [figure]

32 Repeat for a mean of 9.5 spikes. Compare the resulting histogram to Figure 10.3b.

33 [figure]

34 A property of a Poisson process is that the time between two subsequent spikes (the interspike interval, denoted Δt) follows an exponential distribution: p(Δt) = exp(−Δt/λ)/λ, where λ is the mean interspike interval. Verify this by plotting the histogram of interspike intervals across all Poisson spike trains generated above.
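
A sketch of this check in NumPy (intervals longer than the 1000-bin trial cannot be observed, so the empirical mean falls slightly below 1/p_spike):

    import numpy as np

    rng = np.random.default_rng(2)
    p_spike = 3.2 / 1000.0
    spikes = rng.poisson(p_spike, size=(10_000, 1000)) > 0  # spike / no spike per bin

    isis = np.concatenate([np.diff(np.flatnonzero(trial))   # within-trial intervals
                           for trial in spikes if trial.sum() > 1])
    print(isis.mean())  # close to, but a bit below, 1/p_spike = 312.5 bins (censoring)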

35 [figure: histogram of spike counts, mean = 3.2 spikes]

36 [figure: histogram of spike counts, mean = 9.5 spikes]

37 Neural Decoding Neural decoding: Given neural responses r, what can we say about the external stimulus s? Goal: Infer the posterior distribution p(s | r) via Bayes' rule. Is this like mind reading? For example, given a person's neural responses, we want to infer what the person is looking at

38 Neural Likelihood Function for a Single Neuron Recall that the probability that a neuron fires r spikes in a given time interval (based on a Poisson noise distribution) is: p(r | s) = (1/r!) e^{−f(s)} f(s)^r. Suppose that 11 spikes are observed (r = 11). Given r, the neural likelihood of a hypothesized stimulus value s is the probability that r spikes were produced by that value of s. That is, the likelihood of s is: L(s | r) = p(r | s)

39 Example: Gaussian Tuning Curve Suppose that a neuron has a Gaussian tuning curve (with mean s_pref = 0, width σ_tc = 10, and gain g = 5): f(s) = g exp[−(s − s_pref)²/(2σ_tc²)] + b. For the purposes of this slide, the constant 1/(√(2π) σ_tc) (normally appearing in a Gaussian distribution) has been absorbed into the constant gain g
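
A sketch of this single-neuron likelihood on a grid of hypothesized stimulus values, using the tuning-curve parameters above and an example observed count of r = 4 (used on a later slide):

    import numpy as np
    from scipy.stats import poisson

    g, sigma_tc, s_pref = 5.0, 10.0, 0.0
    s_grid = np.linspace(-40.0, 40.0, 801)  # hypothesized stimulus values
    f = g * np.exp(-(s_grid - s_pref) ** 2 / (2.0 * sigma_tc ** 2))  # tuning, b = 0

    r = 4                                   # observed spike count
    L = poisson.pmf(r, f)                   # likelihood L(s|r) = p(r|s) at each s
    print(s_grid[np.argmax(L)])             # one of two symmetric peaks (where f(s) = r)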

40 [figure]

41 Intuition Behind Likelihood Suppose that the neuron fires 4 spikes. What can we conclude about stimulus s? r = 4 could have been produced by, for example, s = 10...or, with equal probability, it could have been produced by s = −10...or, with much lower probability, it could have been produced by s = 21

42 [figure]

43 Unlike the likelihood functions frequently encountered in previous classes, this likelihood function is not Gaussian. This likelihood function is not normalized. In general, likelihood functions do not need to be normalized. Recall that they are not probability distributions. L(s | r) (which equals p(r | s)) is a function of the unknown stimulus value s, not the known spike count r

44 Likelihood functions for several different values of r

45 Example: Britten et al. (1992) Monkeys viewed random-dot kinematograms:

46 For each neuron in area MT, consider two directions of motion: the direction that produced the maximum response in that neuron, and the opposite direction. Behavioral task: the monkey's task was to discriminate between these two motion directions. Q: Can we decode the neuron's responses in order to discriminate between these two directions?

47 [figure]

48 The monkey's percent correct is about the same as the percent correct of the single neuron (!!!) But, but, but...the monkey has lots of neurons. Shouldn't its performance be better than that of an individual neuron?

49 Likelihood Function for a Population of Neurons Assuming neurons are independent with Poisson noise: L(s | r) = p(r | s) = Π_{i=1}^{N} p(r_i | s) = Π_{i=1}^{N} (1/r_i!) e^{−f_i(s)} f_i(s)^{r_i}

50 Log likelihood function: log L(s | r) = Σ_{i=1}^{N} log[(1/r_i!) e^{−f_i(s)} f_i(s)^{r_i}] = −Σ_{i=1}^{N} log r_i! − Σ_{i=1}^{N} f_i(s) + Σ_{i=1}^{N} r_i log f_i(s)
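
A sketch of this computation for one simulated trial; the population size, gain, tuning width, and preferred stimuli are illustrative assumptions:

    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(3)
    g, sigma_tc = 10.0, 20.0
    s_pref = np.linspace(-100.0, 100.0, 201)  # hypothetical preferred stimuli
    s_true = 0.0
    r = rng.poisson(g * np.exp(-(s_true - s_pref) ** 2 / (2.0 * sigma_tc ** 2)))

    s_grid = np.linspace(-50.0, 50.0, 1001)   # hypothesized stimulus values
    f_grid = g * np.exp(-(s_grid[:, None] - s_pref[None, :]) ** 2
                        / (2.0 * sigma_tc ** 2))
    log_L = poisson.logpmf(r, f_grid).sum(axis=1)  # sum of per-neuron log likelihoods
    print(s_grid[np.argmax(log_L)])           # peak lands near the true stimulus s = 0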

51 In General... Likelihood values can be very small. These values are products of probabilities (numbers less than one). Likelihood functions can be very far from Gaussian. Some functions have multiple maxima, or a flat top, or are skewed (i.e., have long tails). If the number of neurons is large, the likelihood function is likely to have a dominant peak and may look (roughly) Gaussian. The likelihood function is different on every trial, even if the stimulus is kept the same. This is because the likelihood function is determined by the pattern of neural activity on a trial, and this pattern varies stochastically from trial to trial

52 Neural Likelihood Function and Sensory Integration At a conceptual level, the computation of a likelihood function based on a set of neurons with independent noise is similar to sensory integration (or cue combination) as discussed earlier in the semester. A neuron's spike count is analogous to a sensory or cue measurement. The likelihood is obtained by multiplying the likelihoods from the individual neurons, just as the likelihoods from individual cues are multiplied together in cue combination

53 Example: Cercal System of Crickets Crickets use their cercal system to sense the direction of incoming air currents. Four neurons represent this direction at low velocities.

54 Neural tuning curves are well-approximated by half-wave rectified cosine functions: f(s) = r_max [cos(s − s_pref)]_+. Change of notation: Let the spatial vector v point parallel to the wind velocity and have unit length (i.e., ‖v‖ = 1). Let c_i (of unit length) denote the preferred direction of neuron i

55 In this case: f_i(s) = r_max [v · c_i]_+. Neural decoding: The wind direction represented by the four neurons is: v_pop = Σ_{i=1}^{4} (r_i / r_max) c_i. Warning: In this example, neural decoding is performed in a heuristic (non-probabilistic) manner. (Nonetheless, it illustrates the concept of neural decoding based on a population of neurons.)
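
A sketch of this heuristic decoder; the four preferred directions below (45°, 135°, 225°, 315°) are assumptions for illustration:

    import numpy as np

    r_max = 40.0
    prefs = np.deg2rad([45.0, 135.0, 225.0, 315.0])       # assumed preferred directions
    c = np.stack([np.cos(prefs), np.sin(prefs)], axis=1)  # unit preferred-direction vectors

    s = np.deg2rad(30.0)                                  # wind direction to be encoded
    v = np.array([np.cos(s), np.sin(s)])                  # unit wind vector
    f = r_max * np.maximum(v @ c.T, 0.0)                  # f_i(s) = r_max [v . c_i]_+

    v_pop = ((f / r_max)[:, None] * c).sum(axis=0)        # population vector
    print(np.rad2deg(np.arctan2(v_pop[1], v_pop[0])))     # decoded direction, ~30 degrees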

56 [figure]

57 fMRI Example: Harrison and Tong (2009) Decoding can also be performed using fMRI data. Stimuli: Two orthogonal gratings (25° and 115°). Two experiments on visual working memory. Q: Can we decode fMRI signals in order to estimate the contents of a person's visual working memory?

58 Experiment 1: On each trial: 1. Display 1 grating (picked at random). 2. Display the other grating. 3. Display a cue (either 1 or 2) indicating which grating to remember. 4. ...-second delay period. 5. Display test grating. 6. Response: Is the test grating rotated clockwise or anticlockwise relative to the cued grating?

59 [figure]

60 [figure]

61 Experiment 2: On each trial: 1. Display sequence of letters at center of screen with low-contrast gratings flashing in the surround 2. Response: Identify if sequence contained a J or a K (grating is task-irrelevant)

62 [figure: decoding accuracy; Experiment 1: green curve, Experiment 2: red curve]

63 Yes, we can decode fMRI signals in order to estimate the contents of a person's visual working memory. Decoding works well for both attended and unattended stimuli. Lower-level areas (e.g., V1, V2) respond similarly regardless of whether a stimulus is attended. In contrast, higher-level areas (e.g., V3A-V4) are sensitive to whether a stimulus is attended. Warning: Decoding was performed using a non-probabilistic (linear support vector machine) method. (Nonetheless, this example illustrates the concept of neural decoding based on fMRI signals from populations of neurons.)

64 Toy Model Next, we make assumptions underlying a toy model that is unrealistic, but useful nonetheless. Goal: Show that, given these assumptions, the neural likelihood function is Gaussian. Assume neuron tuning curves are Gaussian (with b = 0), and are translated versions of each other. Assume preferred stimuli of the neurons are equally and densely spaced across the entire real line (there are thus infinitely many neurons)

65 Thus, neuron i's tuning function is: f_i(s) = g exp[−(s − s_pref,i)²/(2σ_tc²)]

66 [figure]

67 Log Likelihood Function Recall the log likelihood function: log L(s | r) = −Σ_{i=1}^{N} log r_i! − Σ_{i=1}^{N} f_i(s) + Σ_{i=1}^{N} r_i log f_i(s). On a given trial, the first term is a constant (i.e., it is independent of stimulus s). The second term is also (approximately) a constant (i.e., independent of s). This was demonstrated (via simulation) on an earlier slide. Known as the constant sum approximation

68 log L(s | r) = Σ_{i=1}^{N} r_i log(g exp[−(s − s_pref,i)²/(2σ_tc²)]) + constant = Σ_{i=1}^{N} r_i (log g − (s − s_pref,i)²/(2σ_tc²)) + constant = Σ_{i=1}^{N} r_i log g − (1/(2σ_tc²)) Σ_{i=1}^{N} r_i (s − s_pref,i)² + constant = −(1/(2σ_tc²)) Σ_{i=1}^{N} r_i (s − s_pref,i)² + constant. In the last step, Σ_{i=1}^{N} r_i log g is a constant (independent of s), and is thus absorbed into the additive constant

69 After re-arranging terms, it is easy to show: L(s | r) ∝ exp[−(s − μ_likelihood)²/(2σ²_likelihood)], with μ_likelihood = Σ_{i=1}^{N} r_i s_pref,i / Σ_{i=1}^{N} r_i and σ²_likelihood = σ_tc² / Σ_{i=1}^{N} r_i. The likelihood is an (unnormalized) Gaussian (!!!)

70 Maximum Likelihood Estimate In other words, the maximum likelihood estimate of s is: ŝ_MLE = Σ_{i=1}^{N} r_i s_pref,i / Σ_{i=1}^{N} r_i. The width of this estimate ("sensory uncertainty") is: σ_likelihood = σ_tc / √(Σ_{i=1}^{N} r_i)
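
A sketch computing the closed-form MLE and its width for one simulated trial (population parameters are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(4)
    g, sigma_tc = 10.0, 20.0
    s_pref = np.linspace(-100.0, 100.0, 201)
    s_true = 0.0
    r = rng.poisson(g * np.exp(-(s_true - s_pref) ** 2 / (2.0 * sigma_tc ** 2)))

    s_hat = (r * s_pref).sum() / r.sum()     # spike-count-weighted mean of preferred stimuli
    sigma_lik = sigma_tc / np.sqrt(r.sum())  # width of the likelihood on this trial
    print(s_hat, sigma_lik)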

71 The higher the total spike count in the population, the narrower the likelihood function (i.e., the lower the sensory uncertainty)

72 The ML estimate and its width vary from trial to trial (even when the stimulus value s is constant). What is the distribution of errors (over many trials) in the ML estimate?

73 What is the distribution of widths (over many trials) of the ML estimate?

74 Relationships between Behavioral and Neural Models

75 Think of the neural model as generating quantities that are used by the behavioral model. The neural quantity μ_likelihood is used as the measurement x in the behavioral model. The neural quantity σ²_likelihood is used as the variance σ² of the measurement x in the behavioral model

76 [figure]

77 Problem (modified from Problem 10.3 [in Chapter 10, not 9] from draft of textbook by Ma, Körding, and Goldreich) Simulate a population of 1000 independent Poisson neurons with Gaussian tuning curves (let g = 1 [also try g = 10], b = 0, and σ_tc = 20). Simulate the toy model of this chapter for 10,000 trials (all at s = 0) and create a scatter plot of the squared error of the maximum likelihood estimate versus the squared width (variance) of the likelihood function. What is the correlation coefficient?
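
A sketch of this simulation (the scatter plot is omitted; only the correlation coefficient is printed):

    import numpy as np

    rng = np.random.default_rng(5)
    g, sigma_tc, n_trials = 1.0, 20.0, 10_000    # also try g = 10.0
    s_pref = np.linspace(-100.0, 100.0, 1000)    # 1000 neurons
    s_true = 0.0
    f = g * np.exp(-(s_true - s_pref) ** 2 / (2.0 * sigma_tc ** 2))
    r = rng.poisson(f, size=(n_trials, f.size))  # population responses on every trial

    total = r.sum(axis=1)
    keep = total > 0                             # estimate is undefined on silent trials
    s_hat = (r[keep] * s_pref).sum(axis=1) / total[keep]
    sq_err = (s_hat - s_true) ** 2               # squared error of the ML estimate
    var_lik = sigma_tc ** 2 / total[keep]        # variance (squared width) of the likelihood
    print(np.corrcoef(sq_err, var_lik)[0, 1])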

78 [figure: scatter plot of the squared error of the ML estimate vs. the variance of the likelihood function, for one of the two gain settings]

79 [figure: scatter plot of the squared error of the ML estimate vs. the variance of the likelihood function, for the other gain setting]

80 Last Topic...Cue Combination Suppose that we want to combine visual and auditory information about the location of an event. One neural population encodes location based on the visual information. The activities of these neurons are denoted r_V. Another neural population encodes location based on the auditory information. The activities of these neurons are denoted r_A

81 Assumptions: Each modality-specific population has the same number of neurons. Neurons in each area have the same tuning curves f_i(s). Neural activities in each area are corrupted by Poisson noise. The two populations may have different gains: the mean activities of the visual neurons are g_V f_i(s) and the mean activities of the auditory neurons are g_A f_i(s). The population with the higher gain will produce more spikes, and thus should be more reliable

82 Assuming that the two populations are conditionally independent given the stimulus value s, then p(s | r_V, r_A) ∝ p(r_V, r_A | s) = p(r_V | s) p(r_A | s)

83 Recalling that neural responses follow a Poisson distribution, and ignoring all factors that do not depend on s: p(s | r_V, r_A) ∝ (Π_{i=1}^{N} e^{−g_V f_i(s)} f_i(s)^{r_V,i}) (Π_{i=1}^{N} e^{−g_A f_i(s)} f_i(s)^{r_A,i}) = exp[Σ_{i=1}^{N} (−(g_V + g_A) f_i(s) + (r_V,i + r_A,i) log f_i(s))] = Π_{i=1}^{N} e^{−(g_V + g_A) f_i(s)} f_i(s)^{r_V,i + r_A,i}

84 Q: What is this last equation? Suppose a new population of neurons sums the activities of corresponding pairs of neurons in the visual and auditory populations: r_VA = r_V + r_A. The new population activities r_VA also obey independent Poisson variability (because the sum of two Poisson random variables is also a Poisson random variable). This new population encodes p(s | r_V, r_A) (!!!)
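
A sketch verifying this claim numerically: the posterior computed from the two populations matches the posterior encoded by the summed population (the gains and tuning parameters below are illustrative assumptions):

    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(6)
    g_V, g_A, sigma_tc = 5.0, 15.0, 20.0      # assumed gains; auditory is more reliable
    s_pref = np.linspace(-100.0, 100.0, 201)
    s_true = 0.0
    f = np.exp(-(s_true - s_pref) ** 2 / (2.0 * sigma_tc ** 2))  # shared tuning f_i(s)
    r_V = rng.poisson(g_V * f)                # visual population activity
    r_A = rng.poisson(g_A * f)                # auditory population activity
    r_VA = r_V + r_A                          # summed population (still Poisson)

    s_grid = np.linspace(-50.0, 50.0, 1001)
    f_grid = np.exp(-(s_grid[:, None] - s_pref[None, :]) ** 2 / (2.0 * sigma_tc ** 2))
    lp_pair = (poisson.logpmf(r_V, g_V * f_grid)
               + poisson.logpmf(r_A, g_A * f_grid)).sum(axis=1)
    lp_sum = poisson.logpmf(r_VA, (g_V + g_A) * f_grid).sum(axis=1)

    p_pair = np.exp(lp_pair - lp_pair.max()); p_pair /= p_pair.sum()
    p_sum = np.exp(lp_sum - lp_sum.max()); p_sum /= p_sum.sum()
    print(np.abs(p_pair - p_sum).max())       # ~0: the summed code carries the same posterior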

85 Fetsch, C. R., DeAngelis, G. C., & Angelaki, D. E. (2013). Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons. Nature Reviews Neuroscience, 14.

86 [figure]

87 [figure]

88 [figure]

89 [figure]

90 [figure]

91 [figure]
