Least squares. Václav Hlaváč. Czech Technical University in Prague

Least squares. Václav Hlaváč, Czech Technical University in Prague, hlavac@fel.cvut.cz, http://cmp.felk.cvut.cz/~hlavac. Courtesy: Fred Pighin and J.P. Lewis, SIGGRAPH 2007 Course.

Outline: linear regression; geometry of least squares; discussion of the Gauss-Markov theorem.

One-dimensional regression. Find the line that represents the best linear relationship.

One-dimensional regression. Problem: the data does not go through a line; each point deviates by an error $e_i$. Find the line that minimizes the sum $e(x) = \sum_i (b_i - a_i x)^2$; we are looking for the $\hat{x}$ that minimizes it.
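A minimal numerical sketch of this one-parameter fit (the data values below are made up for illustration): setting $de/dx = 0$ gives the closed form $\hat{x} = (a \cdot b)/(a \cdot a)$, which NumPy's general least-squares solver reproduces.

```python
import numpy as np

# One-parameter model b_i ≈ a_i x: minimizing e(x) = sum_i (b_i - a_i x)^2
# and setting de/dx = -2 sum_i a_i (b_i - a_i x) = 0 gives
# x_hat = (a . b) / (a . a).
a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([2.1, 3.9, 6.2, 7.8])  # roughly b = 2a with noise

x_hat = a.dot(b) / a.dot(a)

# Cross-check against the general solver applied to the n x 1 system.
x_ref, *_ = np.linalg.lstsq(a.reshape(-1, 1), b, rcond=None)
```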

Least squares example. There are 3 mountains u, y, z whose heights from one site have been measured as 2474 m, 3882 m, and 4834 m. But from u, y looks 1422 m taller and z looks 2354 m taller, and from y, z looks 950 m taller. Set up the overdetermined system $A x \sim b$:

A = [ 1  0  0
      0  1  0
      0  0  1
     -1  1  0
     -1  0  1
      0 -1  1 ],   x = (u, y, z)^T,   b = (2474, 3882, 4834, 1422, 2354, 950)^T.

We want to minimize $\|A x - b\|^2$.
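The mountain example can be solved numerically; a sketch using NumPy's least-squares solver (the choice of solver is ours, not the slides'):

```python
import numpy as np

# Six measurements of the three heights u, y, z.
A = np.array([
    [ 1,  0,  0],   # u measured directly
    [ 0,  1,  0],   # y measured directly
    [ 0,  0,  1],   # z measured directly
    [-1,  1,  0],   # y - u
    [-1,  0,  1],   # z - u
    [ 0, -1,  1],   # z - y
], dtype=float)
b = np.array([2474.0, 3882.0, 4834.0, 1422.0, 2354.0, 950.0])

# Least-squares estimate of x = (u, y, z).
x_hat, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)  # → [2472. 3886. 4832.]
```

Note how each estimate lands close to, but not exactly on, the direct measurement: the redundant difference measurements pull the solution toward overall consistency.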

Approaches to solve $A x \approx b$:
- Normal equations: quick and dirty.
- QR: standard in libraries; uses an orthogonal decomposition.
- SVD: a decomposition which also gives an indication of how linearly independent the columns are.
- Conjugate gradient: no decompositions; good for large sparse problems.
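A sketch contrasting the first three approaches on an arbitrary random instance (all three agree on a well-conditioned problem; the normal equations square the condition number, which is why QR and SVD are preferred in libraries):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))
b = rng.normal(size=20)

# Normal equations: solve (A^T A) x = A^T b. Quick and dirty.
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

# QR: A = QR with Q orthogonal, then solve R x = Q^T b.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# SVD: np.linalg.lstsq uses an SVD-based LAPACK routine.
x_svd, *_ = np.linalg.lstsq(A, b, rcond=None)
```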

Matrix notation. Using the notations $a = (a_1, \ldots, a_n)^\top$ and $b = (b_1, \ldots, b_n)^\top$, we can rewrite the error function using linear algebra as:
$e(x) = \sum_i (b_i - a_i x)^2 = \|b - a x\|^2 = (b - a x)^\top (b - a x)$.

Multidimensional linear regression. Using a model with $m$ parameters:
$b_j \approx a_{j,1} x_1 + \cdots + a_{j,m} x_m = \sum_{i=1}^m a_{j,i} x_i$.

Multidimensional linear regression. Using a model with $m$ parameters and $n$ measurements:
$e(x) = \sum_{j=1}^n \Big(b_j - \sum_{i=1}^m a_{j,i} x_i\Big)^2 = \|b - A x\|^2$.
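A quick numerical sanity check (on arbitrary random data) that the elementwise sum and the matrix form of the error are the same quantity:

```python
import numpy as np

rng = np.random.default_rng(42)
n, m = 6, 3
A = rng.normal(size=(n, m))
b = rng.normal(size=n)
x = rng.normal(size=m)

# Elementwise sum over the n measurements, as on the slide ...
e_sum = sum((b[j] - A[j] @ x) ** 2 for j in range(n))

# ... equals the matrix form ||b - A x||^2.
e_mat = np.linalg.norm(b - A @ x) ** 2
```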

Matrix representation, with $n$ measurements and $m$ parameters:
$A x = \begin{pmatrix} a_{1,1} & \cdots & a_{1,m} \\ \vdots & & \vdots \\ a_{n,1} & \cdots & a_{n,m} \end{pmatrix} \begin{pmatrix} x_1 \\ \vdots \\ x_m \end{pmatrix} = \begin{pmatrix} a_{1,1} x_1 + \cdots + a_{1,m} x_m \\ \vdots \\ a_{n,1} x_1 + \cdots + a_{n,m} x_m \end{pmatrix}$.

Minimizing $e(x)$. $e(x)$ is flat at $x_{\min}$: $\nabla e(x_{\min}) = 0$. $x_{\min}$ minimizes $e(x)$ if $e(x)$ does not go down around $x_{\min}$: the Hessian $H_e(x_{\min})$ is positive semi-definite.

Positive semi-definite. $A$ is positive semi-definite if $x^\top A x \ge 0$ for all $x$. (Illustrated in 1-D and in 2-D.)

Minimizing $e(x)$. Since $e$ is quadratic and $\nabla e(\hat{x}) = 0$, the expansion around $\hat{x}$ is exact: $e(x) = e(\hat{x}) + \frac{1}{2}(x - \hat{x})^\top H_e(\hat{x})\,(x - \hat{x})$. For $e(x) = \|A x - b\|^2$ the Hessian is $H_e = 2 A^\top A$.

Minimizing $e(x) = \|A x - b\|^2$. Setting $\nabla e(x) = 2 A^\top A x - 2 A^\top b$ to zero yields $A^\top A \hat{x} = A^\top b$, the normal equation. $\hat{x}$ minimizes $e(x)$ if $2 A^\top A$ is positive semi-definite, which is always true.
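The two optimality conditions can be verified numerically on an arbitrary random instance: the gradient vanishes at the solution of the normal equation, and the Hessian $2 A^\top A$ has no negative eigenvalues (since $x^\top A^\top A\, x = \|A x\|^2 \ge 0$):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.normal(size=(10, 3))
b = rng.normal(size=10)

# Solve the normal equation A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# First-order condition: the gradient 2 A^T (A x - b) vanishes at x_hat.
grad = 2 * A.T @ (A @ x_hat - b)

# Second-order condition: the Hessian 2 A^T A is positive semi-definite.
eigvals = np.linalg.eigvalsh(2 * A.T @ A)
```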

Geometric interpretation. $b$ is a vector in $\mathbb{R}^n$. The columns of $A$ define a vector space $\mathrm{range}(A)$, and $A x = x_1 a_1 + \cdots + x_m a_m$ is an arbitrary vector in $\mathrm{range}(A)$.

Geometric interpretation. $A \hat{x}$ is the orthogonal projection of $b$ onto $\mathrm{range}(A)$: $A^\top (b - A \hat{x}) = 0 \;\Rightarrow\; A^\top A \hat{x} = A^\top b$.
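The projection property is easy to check numerically on arbitrary data: the residual of the least-squares solution is orthogonal to every column of $A$.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(8, 2))
b = rng.normal(size=8)

x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

# The residual b - A x_hat is orthogonal to range(A),
# so A x_hat is the orthogonal projection of b onto range(A).
residual = b - A @ x_hat
print(A.T @ residual)  # numerically zero
```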

The normal equation: $A^\top A \hat{x} = A^\top b$.
Existence: the normal equation always has a solution.
Uniqueness: the solution is unique if the columns of $A$ are linearly independent.

Under-constrained problem. Causes: poorly selected data; one or more of the parameters are redundant.

Under-constrained problem: poorly selected data, or one or more of the parameters are redundant. Add constraints: among all $x$ satisfying $A^\top A x = A^\top b$, pick the one with minimum $\|x\|^2$.
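One standard way to realize the minimum-norm constraint is the Moore-Penrose pseudoinverse; a sketch on a made-up under-constrained system (the use of `np.linalg.pinv` here is our choice of tool, not prescribed by the slides):

```python
import numpy as np

# Under-constrained system: 2 equations, 3 unknowns.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
b = np.array([1.0, 2.0])

# The pseudoinverse picks, among all least-squares solutions,
# the one with minimum norm ||x||.
x_min = np.linalg.pinv(A) @ b

# Any other solution x_min + v with A v = 0 is at least as long,
# because x_min lies in the row space, orthogonal to the null space.
v = np.cross(A[0], A[1])          # a vector in the null space of A
assert np.allclose(A @ v, 0)
x_other = x_min + 0.5 * v
```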

How good is the least-squares? Optimality: the Gauss-Markov theorem. Let $a$ and $b$ be two sets of random variables and define $e_i = b_i - \sum_{j=1}^m a_{i,j} x_j$, $i = 1, \ldots, n$. If
A1: $a_{i,j}$ are not random variables,
A2: $E(e_i) = 0$ for all $i$,
A3: $\mathrm{var}(e_i) = \sigma^2$ for all $i$,
A4: $\mathrm{cov}(e_i, e_j) = 0$ for all $i \neq j$,
then $\hat{x} = \arg\min_x \|e\|^2$ is the best unbiased linear estimator.
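The unbiasedness part of the theorem can be illustrated by a small Monte Carlo sketch (design matrix, true parameters, and noise scale are all made up): with a fixed design and zero-mean, equal-variance, uncorrelated errors, the least-squares estimates average out to the true parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 2))       # fixed, non-random design (assumption A1)
x_true = np.array([1.5, -0.7])

# Repeatedly draw errors satisfying A2-A4 (zero mean, equal variance,
# uncorrelated) and record the least-squares estimate each time.
estimates = []
for _ in range(2000):
    e = rng.normal(scale=0.5, size=50)
    b = A @ x_true + e
    x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
    estimates.append(x_hat)

mean_est = np.mean(estimates, axis=0)
print(mean_est)  # close to x_true: the estimator is unbiased
```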

(Figure slides: data with no errors in $a$ vs. errors in $a$; homogeneous vs. non-homogeneous errors; data with no outliers vs. outliers.)