Previously Monte Carlo Integration

Previously: simulation and sampling, Monte Carlo simulations, the inverse CDF method, rejection sampling.
Today: sampling (cont.), Bayesian inference via sampling, eigenvalues and eigenvectors, Markov processes, the PageRank algorithm.

Previously: Monte Carlo Integration
Compute $\int_0^1 f(x)\,dx$:
1. Draw $u_1, \ldots, u_n$ from $U(0,1)$.
2. Approximate the integral as $\hat{I} = \frac{1}{n}\bigl(f(u_1) + \cdots + f(u_n)\bigr)$.
Now for an arbitrary integral over an interval $(a,b)$, with $u_1, \ldots, u_n$ drawn from $U(a,b)$:
$\int_a^b f(x)\,dx \approx \hat{I} = \frac{b-a}{n}\bigl(f(u_1) + \cdots + f(u_n)\bigr)$.
This is interpreted as the expected value of a random variable: $\hat{I} = E[(b-a)\,f(X)]$.
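A minimal NumPy sketch of this estimator (the test function $f(x) = x^2$ and the interval are illustrative choices, not from the slides):

```python
import numpy as np

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Estimate integral of f over (a, b): I_hat = (b - a)/n * sum f(u_i), u_i ~ U(a, b)."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(a, b, size=n)
    return (b - a) * np.mean(f(u))

# Example: the integral of x**2 over (0, 1) is 1/3.
print(mc_integrate(lambda x: x**2, 0.0, 1.0))
```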

How to sample from $f(x)$?

Inversion method: where $F$ is the CDF of $f(x)$, set $X = F^{-1}(U)$ with $U \sim U(0,1)$.

Rejection sampling
1. Generate $x$ from the proposal $g(x)$.
2. Draw $u$ from $\mathrm{Unif}(0,1)$.
3. Accept $x$ if $u < f(x) / (M\,g(x))$.
4. The accepted samples follow $f(x)$.
Problem: many samples $x$ can get rejected if $g(x)$ is too different from $f(x)$.
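A short sketch of the rejection loop above, assuming the bound $M$ with $f(x) \le M\,g(x)$ is known; the unnormalized Beta-shaped target and the uniform proposal are illustrative choices, not from the slides:

```python
import numpy as np

def rejection_sample(f, g_sample, g_pdf, M, n, seed=0):
    """Rejection sampling: propose x from g, accept if u < f(x) / (M * g(x))."""
    rng = np.random.default_rng(seed)
    samples = []
    while len(samples) < n:
        x = g_sample(rng)                 # step 1: x ~ g
        u = rng.uniform()                 # step 2: u ~ Unif(0, 1)
        if u < f(x) / (M * g_pdf(x)):     # step 3: acceptance test
            samples.append(x)             # step 4: accepted samples follow f
    return np.array(samples)

# Example: target proportional to x * (1 - x)**4 (a Beta(2, 5) shape), uniform proposal on (0, 1).
f = lambda x: x * (1 - x) ** 4
g_sample = lambda rng: rng.uniform()
g_pdf = lambda x: 1.0
M = 0.09                                  # satisfies f(x) <= M * g(x) on (0, 1)
xs = rejection_sample(f, g_sample, g_pdf, M, 5000)
print(xs.mean())                          # roughly 2/7, the Beta(2, 5) mean
```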

Importance sampling
Suppose you have $g(x)$ and $f(x)$, but do not know the scale $M$.
Sample $x_1, \ldots, x_n$ from $g(x)$.
Calculate a weight for each sample: $w_i = f(x_i) / g(x_i)$.
The mass for each sample is $q_i = w_i / \sum_{i=1}^{n} w_i$.
The weights are biggest when the distributions agree.
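A sketch of the weighting step, with an illustrative unnormalized target and a standard-normal proposal (both are assumptions for the example, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

f = lambda x: np.exp(-0.5 * (x - 1.0) ** 2)           # unnormalized target, N(1, 1) shape
g_pdf = lambda x: np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)  # proposal density, N(0, 1)

x = rng.standard_normal(10_000)     # x_1, ..., x_n ~ g
w = f(x) / g_pdf(x)                 # w_i = f(x_i) / g(x_i)
q = w / w.sum()                     # normalized mass q_i

# Self-normalized estimate of E_f[x]; should be close to 1 for this target.
print(np.sum(q * x))
```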

Importance sampling in 1D (Image: Frank Dellaert).

Bayes calculations
Recall $p(x \mid z) \propto l(x; z)\, p(x)$.
We could do rejection sampling from the posterior, but there is no guarantee that $p(x) \ge p(x \mid z)$.
Idea: sample from the prior $p(x)$, and give each sample an importance weight
$w_i = p(z \mid x_i)$, with normalized weights $q_i = w_i / \sum_{i=1}^{n} w_i$.
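A sketch of this idea on a hypothetical 1D example; the Gaussian prior and Gaussian likelihood are chosen purely for illustration. Prior samples are reweighted by the likelihood to approximate a posterior expectation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: prior p(x) = N(0, 1), likelihood p(z | x) = N(z; x, 0.5**2).
z = 1.2
x = rng.standard_normal(20_000)              # samples from the prior p(x)
w = np.exp(-0.5 * ((z - x) / 0.5) ** 2)      # w_i = p(z | x_i), up to a constant
q = w / w.sum()                              # normalized weights q_i

posterior_mean = np.sum(q * x)               # approximates E[x | z]
print(posterior_mean)
```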

Bayesian inference via sampling
Robot example: probability of the robot's position in 2D.
Different representations of the probability: histograms, mixture of Gaussians.
These do not scale up to higher dimensions.

Monte Carlo expected value
Prior: sample from a mixture of Gaussians.
Compute the expected value of the bearing (angle) to the origin. (Image: Frank Dellaert)
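A sketch of this computation with a made-up two-component Gaussian mixture as the prior over position; the mixture parameters are arbitrary assumptions, not the ones used in the lecture figures:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2D prior: equal-weight mixture of two Gaussians over the robot's position.
n = 10_000
component = rng.integers(0, 2, size=n)
means = np.array([[2.0, 1.0], [-1.0, 3.0]])
positions = means[component] + 0.5 * rng.standard_normal((n, 2))

# Bearing (angle) from each sampled position to the origin.
bearings = np.arctan2(-positions[:, 1], -positions[:, 0])

# Monte Carlo estimate of the expected bearing (circular mean to handle wrap-around).
mean_bearing = np.arctan2(np.sin(bearings).mean(), np.cos(bearings).mean())
print(mean_bearing)
```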

Rejection sampling example (Image: Frank Dellaert).

Importance sampling example (Image: Frank Dellaert).

Linear algebra digression
Previously: rank of a matrix, determinant of a matrix, matrix inverse, matrix pseudo-inverse, solving systems of equations, solving linear least-squares problems.

Determinants
If $\det(A) \neq 0$, the matrix is invertible.
If $\det(A) = 0$, the matrix is not invertible: $A$ is rank deficient.
The focus so far was on solving matrix inversion problems. Now we look at other properties of matrices, useful when $A$ represents a transformation $y = Ax$, or when $A$ simply represents data.
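A quick NumPy check of these facts; the example matrices are arbitrary:

```python
import numpy as np

A = np.array([[2.0, 1.0], [4.0, 2.0]])            # second row = 2 * first row
print(np.linalg.det(A))                            # ~0  -> not invertible
print(np.linalg.matrix_rank(A))                    # 1   -> rank deficient

B = np.array([[2.0, 1.0], [0.0, 3.0]])
print(np.linalg.det(B))                            # 6   -> invertible
print(np.linalg.solve(B, np.array([1.0, 2.0])))    # solves B x = y
```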

Eigenvalues and Eigenvectors
Motivated by the solution of differential equations; defined for square matrices.
For a scalar ODE the solution is an exponential, and its behavior varies with the coefficient $a$:
if $a > 0$ it is unstable, if $a = 0$ neutrally stable, if $a < 0$ stable.
For a system with $u(t) = [v(t)\ w(t)]^T$, we look for solutions of the same exponential type, $u(t) = e^{\lambda t} x$.
Substituting back into the equation gives
$A x = \lambda x$,   (1)
with eigenvalue $\lambda$ and eigenvector $x$.
To solve the equation: $x$ is in the null space of $A - \lambda I$, and $\lambda$ is chosen such that $A - \lambda I$ has a nontrivial null space.

Computation of eigenvalues and eigenvectors (for dimension 2, 3)
1. Compute the determinant $\det(A - \lambda I)$.
2. Find the roots (eigenvalues) of this polynomial, i.e. the values of $\lambda$ for which the determinant is 0.
3. For each eigenvalue, solve equation (1).
For larger matrices there are alternative ways of computation.
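For a small matrix the three steps can be checked directly. The sketch below uses an arbitrary 2x2 example (not from the slides) and compares the characteristic-polynomial roots with NumPy's general eigen-solver:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Steps 1-2: for a 2x2 matrix, det(A - lambda I) = lambda^2 - trace(A)*lambda + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
print(np.roots(coeffs))                           # roots of the characteristic polynomial: 5 and 2

# Step 3 (and the route used for larger matrices): a numerical eigen-solver.
eigvals, eigvecs = np.linalg.eig(A)
for lam, x in zip(eigvals, eigvecs.T):
    print(lam, np.allclose(A @ x, lam * x))       # verifies A x = lambda x
```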

Eigenvalues and Eigenvectors
For the previous example we get special solutions to the ODE, one for each eigenvalue/eigenvector pair.
Their linear combination is also a solution (due to the linearity of the system).
In the context of differential equations this has a special meaning: any solution can be expressed as a linear combination of them, and the individual solutions correspond to modes.

Only special vectors are eigenvectors: vectors whose direction is not changed by the transformation $A$ (only their scale). They correspond to the normal modes of the system and act independently.
Whatever $A$ does to an arbitrary vector is fully determined by its eigenvalues and eigenvectors.

Eigenvalues and Eigenvectors: Diagonalization
Given a square matrix $A$ and its eigenvalues and eigenvectors, the matrix can be diagonalized:
$A S = S \Lambda$,
where $S$ is the matrix of eigenvectors (as columns) and $\Lambda$ is the diagonal matrix of eigenvalues.
If some of the eigenvalues are the same, the eigenvectors may not be independent.

Diagonalization
If there are no zero eigenvalues, the matrix is invertible.
If there are no repeated eigenvalues, the matrix is diagonalizable.
If all the eigenvalues are different, then the eigenvectors are linearly independent.

For symmetric matrices
If $A$ is symmetric (e.g. a covariance matrix, or a matrix $B = A^T A$), the matrix of eigenvectors is orthonormal, so $A = Q \Lambda Q^T$ with $\Lambda$ the diagonal matrix of eigenvalues.
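A sketch verifying both statements numerically on an arbitrary 2x2 example: the general diagonalization $A S = S \Lambda$, and the orthonormal eigenvector matrix for the symmetric matrix $B = A^T A$:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# General case: A S = S Lambda, i.e. A = S Lambda S^{-1}.
eigvals, S = np.linalg.eig(A)
Lam = np.diag(eigvals)
print(np.allclose(A @ S, S @ Lam))                    # True
print(np.allclose(A, S @ Lam @ np.linalg.inv(S)))     # True

# Symmetric case: B = A^T A has an orthonormal eigenvector matrix Q.
B = A.T @ A
w, Q = np.linalg.eigh(B)
print(np.allclose(Q.T @ Q, np.eye(2)))                # Q is orthonormal
print(np.allclose(B, Q @ np.diag(w) @ Q.T))           # B = Q Lambda Q^T
```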

Dimensionality Reduction (next)
Many dimensions are often interdependent (correlated). We can:
- reduce the dimensionality of problems;
- transform interdependent coordinates into significant and independent ones.