CSE 554 Lecture 6: Deformation I
Title slide: CSE 554 Lecture 6: Deformation I, Fall 20. CSE554 Deformation I, Slide 1.

Review: Alignment
Registering a source to a target by rotation and translation.
Methods (rigid-body transformations):
- Aligning principal directions (PCA)
- Aligning corresponding points (SVD)
- Iterative improvement (ICP), which combines PCA and SVD
Figures: input, after PCA, after ICP (source and target).
Non-rigid Registration
Rigid alignment cannot account for shape variance; non-rigid deformation can give a better fit.
Figures: source and target, after rigid alignment, after non-rigid deformation.

Non-rigid Registration
A minimization problem with two objectives:
- Fitting term: minimize the distance between the deformed source and the target.
- Distortion term: minimize the distortion to the source shape.
Intrinsic vs. Extrinsic
- Intrinsic methods deform only the points on the source curve/surface. Application: boundary curve or surface matching.
- Extrinsic methods deform all points on and interior to the source curve/surface. Application: image or volume matching.

Laplacian-based Deformation
An intrinsic method: simple to implement, produces reasonable results, and preserves local shape features. Widely used in graphics applications for interactive deformation.
Reference: Laplacian surface editing, by Sorkine et al., 2004 (cited more than 300 times).
Setup
Input:
- Source with n points: p_1, ..., p_n
- Target with m points (m <= n): q_1, ..., q_m, corresponding to the first m source points
Output:
- Deformed locations of the source points: p'_1, ..., p'_n
Figure: an example with 3 target points, two of which are stationary (red).

Overview
Find the deformed locations p'_i that minimize E = E_f + E_d, where
- E_f is the fitting term: it measures how close the deformed source is to the target.
- E_d is the distortion term: it measures how much the source shape is changed.
Fitting Term
The sum of squared distances between corresponding points, over all pairs of source and target points:

E_f = \sum_{i=1}^{m} \|p'_i - q_i\|^2

Distortion Term
Q: How do we measure shape? A: By the "bumpiness" at each vertex.
The Laplacian is the vector from the centroid of a vertex's neighbors to the vertex. (Recall that in fairing, we reduced this vector to smooth out bumps.) It is a linear operator over point locations:

L(p_i) = p_i - \frac{1}{|N_i|} \sum_{j \in N_i} p_j

where N_i = {i1, i2, ...} are the indices of the neighboring vertices of p_i.
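As a small concrete sketch (illustrative numpy code, assuming a closed polyline where each vertex's neighbors are its two adjacent vertices; not the lecture's implementation), the Laplacian vectors can be computed as:

```python
import numpy as np

# Uniform Laplacian at each vertex of a closed polyline: the vector
# from the centroid of the vertex's neighbors (here, the two adjacent
# vertices) to the vertex itself.
def polyline_laplacians(points):
    n = len(points)
    deltas = np.empty_like(points, dtype=float)
    for i in range(n):
        neighbors = points[[(i - 1) % n, (i + 1) % n]]
        deltas[i] = points[i] - neighbors.mean(axis=0)
    return deltas

# A unit square: each vertex sticks out from the centroid of its two
# neighbors, so every Laplacian points away from the square's center.
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
deltas = polyline_laplacians(square)
```

On the square, the Laplacians sum to zero, which is a handy sanity check for any closed neighborhood structure.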
Distortion Term
Minimize the change in Laplacians during deformation, over all source points:

E_d = \sum_{i=1}^{n} \|L(p'_i) - \delta_i\|^2

where \delta_i is the Laplacian at p_i before deformation.

Putting Together
Find the deformed locations p'_i that minimize

E = E_f + E_d = \sum_{i=1}^{m} \|p'_i - q_i\|^2 + \sum_{i=1}^{n} \|L(p'_i) - \delta_i\|^2

This is a quadratic function of the variables (p'_{ix}, p'_{iy}, p'_{iz}); the q_i and \delta_i are constants, and L[] is a linear operator.
Quadratic Minimization
The general form of a quadratic minimization:

\min \sum_{i=1}^{k} (a_i^T x - b_i)^2

- There are s variables: x = (x_1, ..., x_s)^T.
- Each of a_1, ..., a_k is a length-s column vector (linear coefficients).
- Each of b_1, ..., b_k is a scalar (constant coefficients).
- k should be greater than s.

Re-writing our minimization in the general form:

E = E_f + E_d = \sum_{i=1}^{m} \|p'_i - q_i\|^2 + \sum_{i=1}^{n} \|L(p'_i) - \delta_i\|^2

In 2D there are 2n variables, x = (p'_{1x}, ..., p'_{nx}, p'_{1y}, ..., p'_{ny})^T; in 3D there are 3n variables. We will next re-write each quadratic term in 2D as (a_i^T x - b_i)^2; this extends easily to 3D.
Quadratic Minimization
The a_i and b_i in the fitting term:

E_f = \sum_{i=1}^{m} \|p'_i - q_i\|^2 = \sum_{i=1}^{m} (p'_{ix} - q_{ix})^2 + \sum_{i=1}^{m} (p'_{iy} - q_{iy})^2 = \sum_{i=1}^{2m} (a_i^T x - b_i)^2

There are 2m quadratic terms.
- First set of m terms: for i = 1, ..., m, b_i = q_{ix} and a_i is all zero except its i-th entry, which is 1.
- Second set of m terms: for i = 1, ..., m, b_{i+m} = q_{iy} and a_{i+m} is all zero except its (i+n)-th entry, which is 1.

Example with 3 vertices and 2 fitting constraints (n = 3, m = 2), with x = (p'_{1x}, p'_{2x}, p'_{3x}, p'_{1y}, p'_{2y}, p'_{3y})^T:

a_1^T = (1, 0, 0, 0, 0, 0), b_1 = q_{1x}
a_2^T = (0, 1, 0, 0, 0, 0), b_2 = q_{2x}
a_3^T = (0, 0, 0, 1, 0, 0), b_3 = q_{1y}
a_4^T = (0, 0, 0, 0, 1, 0), b_4 = q_{2y}
Quadratic Minimization
The a_i and b_i in the distortion term, where L(p'_i) = p'_i - \frac{1}{|N_i|} \sum_{j \in N_i} p'_j:

E_d = \sum_{i=1}^{n} \|L(p'_i) - \delta_i\|^2 = \sum_{i=1}^{n} (L(p'_i)_x - \delta_{ix})^2 + \sum_{i=1}^{n} (L(p'_i)_y - \delta_{iy})^2 = \sum_{i=1}^{2n} (a_i^T x - b_i)^2

There are 2n quadratic terms.
- First set of n terms: for i = 1, ..., n, a_i is all zero except the i-th entry, which is 1, and the j-th entries, which are -1/|N_i| for all j in N_i; b_i = \delta_{ix}.
- Second set of n terms: for i = 1, ..., n, a_{i+n} is all zero except the (i+n)-th entry, which is 1, and the (j+n)-th entries, which are -1/|N_i| for all j in N_i; b_{i+n} = \delta_{iy}.

Example with 3 vertices (n = 3): six quadratic terms a_1, ..., a_6 with b_1 = \delta_{1x}, b_2 = \delta_{2x}, b_3 = \delta_{3x}, b_4 = \delta_{1y}, b_5 = \delta_{2y}, b_6 = \delta_{3y}.
Quadratic Minimization
To solve

\min \sum_{i=1}^{k} (a_i^T x - b_i)^2

re-write it in matrix form:

\min \|A x - B\|^2

where A = \begin{pmatrix} a_1^T \\ \vdots \\ a_k^T \end{pmatrix} is a k-by-s matrix and B = \begin{pmatrix} b_1 \\ \vdots \\ b_k \end{pmatrix} is a length-k vector.

The minimizer is where the partial derivatives are all zero:

\frac{\partial \|A x - B\|^2}{\partial x} = 2 A^T A x - 2 A^T B = 0

so A^T A x = A^T B, and we have

x = (A^T A)^{-1} A^T B
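A minimal end-to-end sketch of the construction so far (a hypothetical toy example, not the lecture's code): a closed square with one fitting constraint (m = 1) pulling vertex 0 to a target q, assembled into A and B and solved by least squares. Because both terms can be driven to zero here by a pure translation, the solve reproduces a translated square:

```python
import numpy as np

# Variables are ordered x = (p'_1x..p'_nx, p'_1y..p'_ny).
n, m = 4, 1
P = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
q = np.array([[2.0, 3.0]])               # target location for vertex 0

rows, rhs = [], []

# Fitting rows: one per coordinate of each constrained vertex.
for i in range(m):
    for c in range(2):                    # c = 0 for x, 1 for y
        a = np.zeros(2 * n)
        a[i + c * n] = 1.0
        rows.append(a)
        rhs.append(q[i, c])

# Distortion rows: L(p'_i) should match the original Laplacian delta_i.
for i in range(n):
    nbrs = [(i - 1) % n, (i + 1) % n]
    delta = P[i] - P[nbrs].mean(axis=0)
    for c in range(2):
        a = np.zeros(2 * n)
        a[i + c * n] = 1.0
        for j in nbrs:
            a[j + c * n] = -1.0 / len(nbrs)
        rows.append(a)
        rhs.append(delta[c])

A, B = np.array(rows), np.array(rhs)
x = np.linalg.lstsq(A, B, rcond=None)[0]  # solves A^T A x = A^T B
P_def = np.stack([x[:n], x[n:]], axis=1)
```

`np.linalg.lstsq` solves the same normal equations as x = (A^T A)^{-1} A^T B but is numerically better behaved than forming the inverse explicitly.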
Summary

x = (A^T A)^{-1} A^T B

Results
Figure: a small deformation (source and deformed shapes).
Results
Figure: a larger deformation.

Results
Figure: rotation.
Results
Figure: stretching.

Results
Figure: shrinking.
Discussion
Limitations: local features are skewed, and they don't scale with the model.
Reason: the Laplacian changes with rotation or scale. Two bumps that differ by a rotation or a scale have different Laplacians, which will be penalized by our distortion term.

A Better Distortion Term
To avoid penalizing rotation and scaling of local features, transform the original Laplacian vectors before comparing them to the deformed Laplacians:

E_d = \sum_{i=1}^{n} \|L(p'_i) - T_i \delta_i\|^2

T_i is a matrix that describes how the local shape around p_i is deformed, including translation, rotation and isotropic scaling (collectively called similarity transformations). The entries of T_i are linear forms of the variables p'_i, so that the minimization problem is still quadratic.
Some Questions
- How do we represent similarity transformations as matrices?
- How do we compute T_i so that it is linear in the deformed points?
We will focus on the derivations for the 2D case; the 3D results will be briefly presented at the end.

Similarity Transforms (2D)
Homogeneous coordinates:
- A 2D point: (x, y, 1)
- A 2D vector: (x, y, 0)
- A 3D point: (x, y, z, 1)
- A 3D vector: (x, y, z, 0)
Similarity Transforms (2D)
Translation by a vector v:
- Cartesian coordinates: vector addition, p' = p + v.
- Homogeneous coordinates: matrix product,

\begin{pmatrix} p'_x \\ p'_y \\ 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 & v_x \\ 0 & 1 & v_y \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} p_x \\ p_y \\ 1 \end{pmatrix}

Isotropic scaling by a scalar s:
- Cartesian coordinates: vector scaling, p' = s p.
- Homogeneous coordinates: matrix product,

\begin{pmatrix} p'_x \\ p'_y \\ 1 \end{pmatrix} = \begin{pmatrix} s & 0 & 0 \\ 0 & s & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} p_x \\ p_y \\ 1 \end{pmatrix}
Similarity Transforms (2D)
Rotation by an angle \alpha:
- Cartesian coordinates: matrix product,

\begin{pmatrix} p'_x \\ p'_y \end{pmatrix} = \begin{pmatrix} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{pmatrix} \begin{pmatrix} p_x \\ p_y \end{pmatrix}

- Homogeneous coordinates: matrix product,

\begin{pmatrix} p'_x \\ p'_y \\ 1 \end{pmatrix} = \begin{pmatrix} \cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} p_x \\ p_y \\ 1 \end{pmatrix}

Summary of the elementary similarity transformations; to combine transformations, take the product of these matrices:

Trs(v) = \begin{pmatrix} 1 & 0 & v_x \\ 0 & 1 & v_y \\ 0 & 0 & 1 \end{pmatrix} (translation by vector v)

Scl(s) = \begin{pmatrix} s & 0 & 0 \\ 0 & s & 0 \\ 0 & 0 & 1 \end{pmatrix} (scaling by scalar s)

Rot(\alpha) = \begin{pmatrix} \cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{pmatrix} (rotation by angle \alpha)
Similarity Transforms (2D)
General similarity transformations:

T = \begin{pmatrix} a & -w & t_x \\ w & a & t_y \\ 0 & 0 & 1 \end{pmatrix}

The product of any set of elementary matrices can be written this way, and any choice of (a, w, t_x, t_y), with a and w not both zero, can be written as a sequence of rotation, isotropic scaling and translation:

\begin{pmatrix} a & -w & t_x \\ w & a & t_y \\ 0 & 0 & 1 \end{pmatrix} = Trs(t_x, t_y) \cdot Scl(\sqrt{a^2 + w^2}) \cdot Rot(\arctan(w / a))

Computing T_i (2D)
Suppose we knew the deformed locations p'_i. We could compute T_i as the similarity transform that best fits the neighborhood of p_i to that of p'_i:

\min \|T_i p_i - p'_i\|^2 + \sum_{j \in N_i} \|T_i p_j - p'_j\|^2

This is a quadratic minimization problem for the entries of T_i, i.e., a, w, t_x, t_y.
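The decomposition above is easy to check numerically (an illustrative numpy sketch; `trs`, `scl` and `rot` are helper names, not from the lecture):

```python
import numpy as np

# Verify: [[a, -w, tx], [w, a, ty], [0, 0, 1]] equals
# Trs(tx, ty) @ Scl(sqrt(a^2 + w^2)) @ Rot(arctan(w / a)).
def trs(tx, ty):
    return np.array([[1, 0, tx], [0, 1, ty], [0, 0, 1.0]])

def scl(s):
    return np.array([[s, 0, 0], [0, s, 0], [0, 0, 1.0]])

def rot(alpha):
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1.0]])

a, w, tx, ty = 0.8, 0.6, 2.0, -1.0       # arbitrary, (a, w) != (0, 0)
T = np.array([[a, -w, tx], [w, a, ty], [0, 0, 1.0]])
composed = trs(tx, ty) @ scl(np.hypot(a, w)) @ rot(np.arctan2(w, a))
```

Here `np.arctan2(w, a)` is used rather than a plain arctan so that the quadrant of the rotation angle is handled correctly.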
Computing T_i (2D)
The matrix form of the minimization is:

\min \left\| C \begin{pmatrix} a \\ w \\ t_x \\ t_y \end{pmatrix} - \begin{pmatrix} p'_{ix} \\ p'_{iy} \\ p'_{i1x} \\ p'_{i1y} \\ \vdots \end{pmatrix} \right\|^2

where

C = \begin{pmatrix} p_{ix} & -p_{iy} & 1 & 0 \\ p_{iy} & p_{ix} & 0 & 1 \\ p_{i1x} & -p_{i1y} & 1 & 0 \\ p_{i1y} & p_{i1x} & 0 & 1 \\ \vdots & \vdots & \vdots & \vdots \end{pmatrix}

is a (2|N_i| + 2)-by-4 matrix, with one such pair of rows for p_i and for each of its neighbors, and N_i = {i1, i2, ...} are the indices of the neighboring vertices of p_i.

By quadratic minimization:

\begin{pmatrix} a \\ w \\ t_x \\ t_y \end{pmatrix} = (C^T C)^{-1} C^T \begin{pmatrix} p'_{ix} \\ p'_{iy} \\ p'_{i1x} \\ p'_{i1y} \\ \vdots \end{pmatrix}

These are linear expressions of the variables (p'_{ix}, p'_{iy}).
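The normal-equation solve for T_i can be sketched as follows (a hypothetical `fit_similarity_2d` helper, illustrative only). Here the deformed neighborhood is produced by a known 90-degree rotation plus a translation of (1, 2), which the fit recovers exactly:

```python
import numpy as np

# Best-fit similarity (a, w, tx, ty) mapping a neighborhood of
# points to its deformed copy, via the normal equations on C.
def fit_similarity_2d(pts, pts_def):
    C, rhs = [], []
    for (px, py), (qx, qy) in zip(pts, pts_def):
        C.append([px, -py, 1.0, 0.0]); rhs.append(qx)
        C.append([py,  px, 0.0, 1.0]); rhs.append(qy)
    C, rhs = np.array(C), np.array(rhs)
    return np.linalg.solve(C.T @ C, C.T @ rhs)  # (a, w, tx, ty)

# Rotate by 90 degrees (a = 0, w = 1), then translate by (1, 2):
# (x, y) -> (1 - y, 2 + x).
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
pts_def = np.stack([1.0 - pts[:, 1], 2.0 + pts[:, 0]], axis=1)
a, w, tx, ty = fit_similarity_2d(pts, pts_def)
```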
Distortion Term (2D)
Each distortion term \|L(p'_i) - T_i \delta_i\|^2 has two parts.
The transformed Laplacian:

T_i \delta_i = D \begin{pmatrix} a \\ w \\ t_x \\ t_y \end{pmatrix} = D (C^T C)^{-1} C^T \begin{pmatrix} p'_{ix} \\ p'_{iy} \\ p'_{i1x} \\ p'_{i1y} \\ \vdots \end{pmatrix}, where D = \begin{pmatrix} \delta_{ix} & -\delta_{iy} & 0 & 0 \\ \delta_{iy} & \delta_{ix} & 0 & 0 \end{pmatrix}

The Laplacian of the deformed locations:

L(p'_i) = L \begin{pmatrix} p'_{ix} \\ p'_{iy} \\ p'_{i1x} \\ p'_{i1y} \\ \vdots \end{pmatrix}, where L = \begin{pmatrix} 1 & 0 & -\frac{1}{|N_i|} & 0 & \cdots \\ 0 & 1 & 0 & -\frac{1}{|N_i|} & \cdots \end{pmatrix} is a 2-by-(2|N_i| + 2) matrix.

Putting it together:

E_d = \sum_{i=1}^{n} \|L(p'_i) - T_i \delta_i\|^2 = \sum_{i=1}^{n} \left( H_1 \begin{pmatrix} p'_{ix} \\ p'_{iy} \\ \vdots \end{pmatrix} \right)^2 + \left( H_2 \begin{pmatrix} p'_{ix} \\ p'_{iy} \\ \vdots \end{pmatrix} \right)^2

where H = L - D (C^T C)^{-1} C^T and H_1, H_2 are its rows. These form 2n quadratic terms (a_i^T x - b_i)^2 for x = (p'_{1x}, ..., p'_{nx}, p'_{1y}, ..., p'_{ny})^T; all b_i are zero, and each a_i can be extracted from H.
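To see why this fixes the rotation problem: if a neighborhood is deformed by a pure similarity transform, the fitted T_i is exact and H v' vanishes, so no distortion is charged, whereas the old term L v' - delta does not vanish. A small numpy check (illustrative: one vertex with two neighbors, deformed by rotation plus translation):

```python
import numpy as np

P = np.array([[0.0, 0.0], [1.0, 0.0], [-1.0, 1.0]])  # p_i, then neighbors
a0, w0, t = 0.8, 0.6, np.array([1.0, 2.0])           # similarity, scale 1
M = np.array([[a0, -w0], [w0, a0]])
Pd = P @ M.T + t                                      # deformed neighborhood
v = Pd.reshape(-1)                                    # (p'_ix, p'_iy, ...)

delta = P[0] - P[1:].mean(axis=0)                     # original Laplacian

# C as on the previous slide: two rows per neighborhood point.
C = np.vstack([[[px, -py, 1.0, 0.0], [py, px, 0.0, 1.0]] for px, py in P])
# D applies the fitted similarity to delta; L takes the deformed Laplacian.
D = np.array([[delta[0], -delta[1], 0.0, 0.0],
              [delta[1],  delta[0], 0.0, 0.0]])
L = np.array([[1.0, 0.0, -0.5, 0.0, -0.5, 0.0],
              [0.0, 1.0,  0.0, -0.5, 0.0, -0.5]])

H = L - D @ np.linalg.inv(C.T @ C) @ C.T              # 2 x 6
residual = H @ v                                      # ~0 for a similarity
```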
Results (2D)
Figures comparing the old and new distortion terms on four examples.
Registration
Use nearest neighbors as the corresponding target locations, assuming the source is already close to the target, as in the iterative closest point (ICP) algorithm:
1. For each point on the source, assign its closest point on the target as its corresponding point, then compute the Laplacian-based deformation. A threshold on the closest distance can be used to throw away unlikely correspondences.
2. Repeat step (1) until a termination criterion is met, e.g., a maximum number of iterations or a minimum RMSD improvement.

Result
Figures: after rigid alignment, after 1 iteration of Laplacian deformation, after 7 iterations, and an overlay of all curves.
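The registration loop above can be sketched as follows (illustrative numpy code under the same closed-polyline assumption as earlier; `deform` and `register` are hypothetical names, not the lecture's implementation). On a toy example where the target is a translated copy of the source, nearest neighbors find the right correspondences and the very first solve already lands on the target:

```python
import numpy as np

def deform(P, corr):
    """One Laplacian-based solve; corr is a list of (index, target) pairs."""
    n = len(P)
    rows, rhs = [], []
    for i, q in corr:                          # fitting rows
        for c in range(2):
            a = np.zeros(2 * n)
            a[i + c * n] = 1.0
            rows.append(a)
            rhs.append(q[c])
    for i in range(n):                         # distortion rows
        nbrs = [(i - 1) % n, (i + 1) % n]
        delta = P[i] - P[nbrs].mean(axis=0)
        for c in range(2):
            a = np.zeros(2 * n)
            a[i + c * n] = 1.0
            for j in nbrs:
                a[j + c * n] = -0.5
            rows.append(a)
            rhs.append(delta[c])
    x = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0]
    return np.stack([x[:n], x[n:]], axis=1)

def register(P, Q, iters=3):
    for _ in range(iters):
        # Step 1: closest target point for each source point (a distance
        # threshold could be added here to drop unlikely correspondences).
        d = np.linalg.norm(P[:, None] - Q[None, :], axis=2)
        corr = [(i, Q[d[i].argmin()]) for i in range(len(P))]
        # Step 2: deform, then repeat.
        P = deform(P, corr)
    return P

# Toy run: a square registered onto a translated copy of itself.
src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
tgt = src + np.array([0.2, 0.1])
out = register(src, tgt)
```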
Result
Weighting the distortion term: E = E_f + w E_d. Figures for large, medium and small w.

Similarity Transforms (3D)
Elementary transformation matrices; to perform a sequence of transformations, take the product of these matrices:

Trs(v) = \begin{pmatrix} 1 & 0 & 0 & v_x \\ 0 & 1 & 0 & v_y \\ 0 & 0 & 1 & v_z \\ 0 & 0 & 0 & 1 \end{pmatrix} (translation by vector v)

Scl(s) = \begin{pmatrix} s & 0 & 0 & 0 \\ 0 & s & 0 & 0 \\ 0 & 0 & s & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} (scaling by scalar s)

RotX(\alpha) = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha & 0 \\ 0 & \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} (rotation by angle \alpha around the X axis)
Similarity Transforms (3D)
General similarity transformations in 3D:

T = \begin{pmatrix} s & -h_3 & h_2 & t_x \\ h_3 & s & -h_1 & t_y \\ -h_2 & h_1 & s & t_z \\ 0 & 0 & 0 & 1 \end{pmatrix}

This approximates the product of a set of elementary matrices, up to a small rotation angle; it may introduce skewing for large rotations.

Computing T_i (3D)
Assuming known deformed locations, by quadratic minimization:

\begin{pmatrix} s \\ h_1 \\ h_2 \\ h_3 \\ t_x \\ t_y \\ t_z \end{pmatrix} = (C^T C)^{-1} C^T \begin{pmatrix} p'_{ix} \\ p'_{iy} \\ p'_{iz} \\ p'_{i1x} \\ \vdots \end{pmatrix}

where C stacks, for p_i and for each of its neighbors, the three rows

\begin{pmatrix} p_x & 0 & p_z & -p_y & 1 & 0 & 0 \\ p_y & -p_z & 0 & p_x & 0 & 1 & 0 \\ p_z & p_y & -p_x & 0 & 0 & 0 & 1 \end{pmatrix}

C is a (3|N_i| + 3)-by-7 matrix, and the result is a vector of linear expressions of the deformed points p'_i.
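A quick check of the small-angle claim (illustrative): with s = 1, h_1 = t and h_2 = h_3 = 0, the linear part of the 3D similarity matches RotX(t) to first order in t, with an error on the order of t^2:

```python
import numpy as np

t = 0.01
rot_x = np.array([[1.0, 0.0, 0.0],
                  [0.0, np.cos(t), -np.sin(t)],
                  [0.0, np.sin(t),  np.cos(t)]])
# Linearized form: s = 1, h1 = t, h2 = h3 = 0.
approx = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, -t],
                   [0.0, t,   1.0]])
err = np.abs(rot_x - approx).max()       # about t^2 / 2
```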
Distortion Term (3D)
Constructing the transformed Laplacian:

T_i \delta_i = D \begin{pmatrix} s \\ h_1 \\ h_2 \\ h_3 \\ t_x \\ t_y \\ t_z \end{pmatrix} = D (C^T C)^{-1} C^T \begin{pmatrix} p'_{ix} \\ p'_{iy} \\ p'_{iz} \\ \vdots \end{pmatrix}

where

D = \begin{pmatrix} \delta_{ix} & 0 & \delta_{iz} & -\delta_{iy} & 0 & 0 & 0 \\ \delta_{iy} & -\delta_{iz} & 0 & \delta_{ix} & 0 & 0 & 0 \\ \delta_{iz} & \delta_{iy} & -\delta_{ix} & 0 & 0 & 0 & 0 \end{pmatrix}

Results (3D)
Figures: moving targets and stationary targets.
Results (3D)
Additional figures.
Directional Field Xiao-Ming Fu Outlines Introduction Discretization Representation Objectives and Constraints Outlines Introduction Discretization Representation Objectives and Constraints Definition Spatially-varying
More informationEffective Resistance and Schur Complements
Spectral Graph Theory Lecture 8 Effective Resistance and Schur Complements Daniel A Spielman September 28, 2015 Disclaimer These notes are not necessarily an accurate representation of what happened in
More informationLINEAR SYSTEMS, MATRICES, AND VECTORS
ELEMENTARY LINEAR ALGEBRA WORKBOOK CREATED BY SHANNON MARTIN MYERS LINEAR SYSTEMS, MATRICES, AND VECTORS Now that I ve been teaching Linear Algebra for a few years, I thought it would be great to integrate
More informationCSCI5654 (Linear Programming, Fall 2013) Lectures Lectures 10,11 Slide# 1
CSCI5654 (Linear Programming, Fall 2013) Lectures 10-12 Lectures 10,11 Slide# 1 Today s Lecture 1. Introduction to norms: L 1,L 2,L. 2. Casting absolute value and max operators. 3. Norm minimization problems.
More informationFFTs in Graphics and Vision. Homogenous Polynomials and Irreducible Representations
FFTs in Graphics and Vision Homogenous Polynomials and Irreducible Representations 1 Outline The 2π Term in Assignment 1 Homogenous Polynomials Representations of Functions on the Unit-Circle Sub-Representations
More informationCS168: The Modern Algorithmic Toolbox Lecture #8: PCA and the Power Iteration Method
CS168: The Modern Algorithmic Toolbox Lecture #8: PCA and the Power Iteration Method Tim Roughgarden & Gregory Valiant April 15, 015 This lecture began with an extended recap of Lecture 7. Recall that
More informationMultiple View Geometry in Computer Vision
Multiple View Geometry in Computer Vision Prasanna Sahoo Department of Mathematics University of Louisville 1 Scene Planes & Homographies Lecture 19 March 24, 2005 2 In our last lecture, we examined various
More informationA matrix over a field F is a rectangular array of elements from F. The symbol
Chapter MATRICES Matrix arithmetic A matrix over a field F is a rectangular array of elements from F The symbol M m n (F ) denotes the collection of all m n matrices over F Matrices will usually be denoted
More information12. Perturbed Matrices
MAT334 : Applied Linear Algebra Mike Newman, winter 208 2. Perturbed Matrices motivation We want to solve a system Ax = b in a context where A and b are not known exactly. There might be experimental errors,
More informationCS-184: Computer Graphics
CS-184: Computer Graphics Lecture #25: Rigid Body Simulations Tobias Pfaff 537 Soda (Visual Computing Lab) tpfaff@berkeley.edu Reminder Final project presentations next week! Game Physics Types of Materials
More informationExamples and MatLab. Vector and Matrix Material. Matrix Addition R = A + B. Matrix Equality A = B. Matrix Multiplication R = A * B.
Vector and Matrix Material Examples and MatLab Matrix = Rectangular array of numbers, complex numbers or functions If r rows & c columns, r*c elements, r*c is order of matrix. A is n * m matrix Square
More informationLecture: Algorithms for LP, SOCP and SDP
1/53 Lecture: Algorithms for LP, SOCP and SDP Zaiwen Wen Beijing International Center For Mathematical Research Peking University http://bicmr.pku.edu.cn/~wenzw/bigdata2018.html wenzw@pku.edu.cn Acknowledgement:
More informationCS 6820 Fall 2014 Lectures, October 3-20, 2014
Analysis of Algorithms Linear Programming Notes CS 6820 Fall 2014 Lectures, October 3-20, 2014 1 Linear programming The linear programming (LP) problem is the following optimization problem. We are given
More informationLinear Diffusion. E9 242 STIP- R. Venkatesh Babu IISc
Linear Diffusion Derivation of Heat equation Consider a 2D hot plate with Initial temperature profile I 0 (x, y) Uniform (isotropic) conduction coefficient c Unit thickness (along z) Problem: What is temperature
More informationLecture 15: Random Projections
Lecture 15: Random Projections Introduction to Learning and Analysis of Big Data Kontorovich and Sabato (BGU) Lecture 15 1 / 11 Review of PCA Unsupervised learning technique Performs dimensionality reduction
More informationDS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra.
DS-GA 1002 Lecture notes 0 Fall 2016 Linear Algebra These notes provide a review of basic concepts in linear algebra. 1 Vector spaces You are no doubt familiar with vectors in R 2 or R 3, i.e. [ ] 1.1
More informationLecture 1 Introduction
L. Vandenberghe EE236A (Fall 2013-14) Lecture 1 Introduction course overview linear optimization examples history approximate syllabus basic definitions linear optimization in vector and matrix notation
More informationCSE 473/573 Computer Vision and Image Processing (CVIP)
CSE 473/573 Computer Vision and Image Processing (CVIP) Ifeoma Nwogu inwogu@buffalo.edu Lecture 11 Local Features 1 Schedule Last class We started local features Today More on local features Readings for
More information03 - Basic Linear Algebra and 2D Transformations
03 - Basic Linear Algebra and 2D Transformations (invited lecture by Dr. Marcel Campen) Overview In this box, you will find references to Eigen We will briefly overview the basic linear algebra concepts
More information9.1 Linear Programs in canonical form
9.1 Linear Programs in canonical form LP in standard form: max (LP) s.t. where b i R, i = 1,..., m z = j c jx j j a ijx j b i i = 1,..., m x j 0 j = 1,..., n But the Simplex method works only on systems
More information1 Functions and Graphs
1 Functions and Graphs 1.1 Functions Cartesian Coordinate System A Cartesian or rectangular coordinate system is formed by the intersection of a horizontal real number line, usually called the x axis,
More informationP3.C8.COMPLEX NUMBERS
Recall: Within the real number system, we can solve equation of the form and b 2 4ac 0. ax 2 + bx + c =0, where a, b, c R What is R? They are real numbers on the number line e.g: 2, 4, π, 3.167, 2 3 Therefore,
More information