Preliminaries: Algebra / Calculus (Surface Fitting, Moving Least Squares)

Gradients
If F is a function assigning a real value to a d-dimensional point, the gradient of F is the vector of partial derivatives:
$\nabla F(p) = \left(\frac{\partial F}{\partial p_1}, \ldots, \frac{\partial F}{\partial p_d}\right)$

Extrema
If F is a function assigning a real value to a d-dimensional point, then p is an extremum of F only if the gradient of F at p is zero:
$\nabla F(p) = 0$

Dot Products
If F has the form $F(p) = \langle p, q \rangle$ for some fixed q, then:
$\nabla F(p) = q$
If F has the form $F(p) = \langle Mp, q \rangle$ for some fixed q, then:
$\nabla F(p) = M^t q$
If F has the form $F(p) = \langle p, p \rangle$, then:
$\nabla F(p) = 2p$
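These gradient identities can be checked numerically with central finite differences. The sketch below is an illustration only; the test vectors and the matrix are arbitrary choices, not from the slides:

```python
import numpy as np

def numeric_grad(F, p, h=1e-6):
    """Central-difference approximation of the gradient of F at p."""
    g = np.zeros_like(p)
    for i in range(len(p)):
        e = np.zeros_like(p)
        e[i] = h
        g[i] = (F(p + e) - F(p - e)) / (2 * h)
    return g

q = np.array([1.0, -2.0, 0.5])           # a fixed vector
p = np.array([0.3, 0.7, -1.1])           # the point of evaluation
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])          # an arbitrary matrix

# grad <p, q> = q
assert np.allclose(numeric_grad(lambda p: p @ q, p), q, atol=1e-5)
# grad <Mp, q> = M^t q
assert np.allclose(numeric_grad(lambda p: (M @ p) @ q, p), M.T @ q, atol=1e-5)
# grad <p, p> = 2p
assert np.allclose(numeric_grad(lambda p: p @ p, p), 2 * p, atol=1e-5)
```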
Dot Products
If F has the form $F(p) = \langle Mp, p \rangle$, then:
$\nabla F(p) = (M + M^t)\,p$

Lagrangians
The extrema of F subject to the constraint $G(p) = c$ can be found by solving:
$\nabla F(p) = \lambda\, \nabla G(p)$
i.e. at p, the function F should only be changing in a direction that is perpendicular to the constraint.

Lagrangians and Symmetric Matrices
If M is a symmetric matrix, then p is an extremum of:
$F(p) = \langle Mp, p \rangle$
subject to the constraint $\langle p, p \rangle = 1$ if and only if p is an eigenvector of M:
$Mp = \lambda p$

McLain
Challenge: Given a discrete sampling of a function, complete the function to all of space.

Constant Fit
For each point in space we can find the closest sample point and use its value. The reconstruction is piecewise constant.
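The eigenvector fact follows directly from the Lagrange condition, and can be verified numerically. A minimal sketch, assuming an arbitrary symmetric matrix:

```python
import numpy as np

# Extrema of F(p) = <Mp, p> on the unit sphere are eigenvectors of symmetric M.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])               # symmetric by construction
vals, vecs = np.linalg.eigh(M)
p = vecs[:, 0]                           # a unit eigenvector of M
grad_F = (M + M.T) @ p                   # gradient of <Mp, p> (= 2Mp here)
grad_G = 2 * p                           # gradient of the constraint <p, p> = 1
# Lagrange condition grad F = lambda * grad G holds, with lambda = the eigenvalue.
assert np.allclose(grad_F, vals[0] * grad_G)
```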
Challenge
Given a set of 2D points $\{p_i\}$ with associated real values $f_i$, define a function $\Phi$, defined over all of 2D space, that fits (approximates) the sample data.

Weighted Averaging
Given a set of 2D points $\{p_i = (x_i, y_i)\}$ with associated real values $f_i$, for any point $p = (x, y)$ set:
$\Phi(p) = \frac{\sum_i w(\|p - p_i\|)\, f_i}{\sum_i w(\|p - p_i\|)}$

Properties of the weight function $w(r)$:
- It should drop off with distance.
- Drops off too slowly: blurring.
- Drops off too sharply: numerical instability.

Key Idea [McLain]
This type of fitting can be viewed as function optimization.

Generalized Framework
Let L be a space of functions. At each point p, find the function $F_p \in L$ minimizing the weighted error:
$\sum_i w(\|p - p_i\|)\,\big(F_p(p_i) - f_i\big)^2$
and set:
$\Phi(p) = F_p(p)$

In the case that L is the space of constant (zeroth-order) polynomials, this reduces to solving for the real value $F_p$ that minimizes:
$\sum_i w(\|p - p_i\|)\,(F_p - f_i)^2$
Thus:
$F_p = \frac{\sum_i w(\|p - p_i\|)\, f_i}{\sum_i w(\|p - p_i\|)}$
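The constant-fit (weighted-averaging) formula takes only a few lines. The Gaussian weight and the sample data below are illustrative assumptions, not from the slides:

```python
import numpy as np

# Sample data: 2D points p_i with values f_i (made-up illustration data).
points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([0.0, 1.0, 1.0, 2.0])

def weight(r, h=0.5):
    """A Gaussian weight that drops off with distance (one common choice)."""
    return np.exp(-(r / h) ** 2)

def phi(p):
    """Constant MLS fit: the weighted average of the sample values."""
    w = weight(np.linalg.norm(points - p, axis=1))
    return np.dot(w, values) / np.sum(w)

# At the center all four samples are equidistant, so Phi is their plain average.
print(phi(np.array([0.5, 0.5])))  # → 1.0
```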
Example
We can let L be the six-dimensional space of second-order polynomials:
$L = \{\, c_{00} + c_{10}x + c_{01}y + c_{11}xy + c_{20}x^2 + c_{02}y^2 \,\}$
Then we would like to solve for the polynomial $F_p(x, y)$ minimizing:
$\sum_i w(\|p - p_i\|)\,\big(F_p(p_i) - f_i\big)^2$

Solving for $F_p$
Let C be the vector of coefficients:
$C = (c_{00}, c_{10}, c_{01}, c_{11}, c_{20}, c_{02})$
And let $V_i$ be the vector:
$V_i = (1,\, x_i,\, y_i,\, x_i y_i,\, x_i^2,\, y_i^2)$
This lets us write:
$F_p(p_i) = \langle C, V_i \rangle$
To solve for $F_p$ we need to find the vector $C_p$ minimizing:
$\sum_i w_i \big(\langle C_p, V_i \rangle - f_i\big)^2, \quad w_i = w(\|p - p_i\|)$
Setting the derivative with respect to $C_p$ equal to zero gives:
$\sum_i w_i \big(\langle C_p, V_i \rangle - f_i\big)\, V_i = 0$
Using the fact that $\langle u, v \rangle\, u = (u\,u^t)\, v$ gives:
$\Big(\sum_i w_i\, V_i V_i^t\Big)\, C_p = \sum_i w_i\, f_i\, V_i$
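The normal equations above can be sketched numerically. This is a minimal illustration, assuming a Gaussian weight and made-up sample data (neither is specified by the slides):

```python
import numpy as np

def basis(x, y):
    """V = (1, x, y, xy, x^2, y^2): the 6-D space of second-order polynomials."""
    return np.array([1.0, x, y, x * y, x * x, y * y])

def mls_quadratic(p, points, values, h=0.5):
    """Solve (sum_i w_i V_i V_i^t) C_p = sum_i w_i f_i V_i, then Phi(p) = <C_p, V(p)>."""
    w = np.exp(-(np.linalg.norm(points - p, axis=1) / h) ** 2)
    V = np.array([basis(x, y) for x, y in points])
    A = (V * w[:, None]).T @ V                    # sum_i w_i V_i V_i^t
    b = (V * (w * values)[:, None]).sum(axis=0)   # sum_i w_i f_i V_i
    C = np.linalg.solve(A, b)
    return C @ basis(*p)

# Samples of f(x, y) = x^2 + y; f lies in L, so the fit reproduces it exactly.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(30, 2))
vals = pts[:, 0] ** 2 + pts[:, 1]
print(mls_quadratic(np.array([0.2, 0.3]), pts, vals))  # ≈ 0.2^2 + 0.3 = 0.34
```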
Solving for $F_p$
Given the optimal coefficients of the polynomial, we can now set the value at the point p:
$\Phi(p) = F_p(p) = \langle C_p, V(p) \rangle$
Using higher-order functions gives us more freedom to better fit the data.

MLS vs. Splines
MLS:
- A continuous set of functions glued together with the weight function.
- Continuity is defined by the continuity of the weight function.
- No structure is imposed on the distribution of the sample points.
Splines:
- A discrete set of functions designed to glue together at the end points.
- Continuity is defined by the order of the polynomial at the joints.
- Sample points are distributed with a regular grid topology.

What is a manifold?
A smooth 2-manifold embedded in 3D is a surface smoothly stitched together from the graphs of functions.

MLS Approach [Levin]
As in McLain, define a continuous set of graphs and stitch them together using the weighting function.
MLS Approach [Levin]
Given a set of points $\{r_i\}$ and a weighting function, for any point r, find a graph that locally fits the data points:
1. For each point, define the locally best fitting plane.
2. Using the height of the points from the plane as the sample value, apply MLS to complete the function.

Plane Fitting (Basic Approach)
Solve for the plane with normal $a_r$ (with $\|a_r\| = 1$) containing the point $q_r$ that minimizes:
$\sum_i w_i\, \langle a_r,\, r_i - q_r \rangle^2, \quad w_i = w(\|r_i - r\|)$

Solving for $q_r$: Taking the derivative with respect to $q_r$ and setting the result equal to zero gives:
$\sum_i w_i\, \langle a_r,\, r_i - q_r \rangle = 0$
which is satisfied by the weighted average:
$q_r = \frac{\sum_i w_i\, r_i}{\sum_i w_i}$

Solving for $a_r$: Using the fact that $\langle u, v \rangle\, u = (u\,u^t)\, v$ gives:
$\Big(\sum_i w_i\, (r_i - q_r)(r_i - q_r)^t\Big)\, a_r = \lambda\, a_r$
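The plane-fitting step reduces to a weighted centroid plus an eigenvector computation. A minimal sketch, assuming a Gaussian weight and made-up planar sample points:

```python
import numpy as np

def fit_plane(r, points, h=0.5):
    """Weighted plane fit: q_r is the weighted centroid; a_r is the eigenvector
    of the weighted covariance with the smallest eigenvalue (the minimizer)."""
    w = np.exp(-(np.linalg.norm(points - r, axis=1) / h) ** 2)
    q = (w[:, None] * points).sum(axis=0) / w.sum()
    d = points - q
    M = (d * w[:, None]).T @ d            # sum_i w_i (r_i - q)(r_i - q)^t
    vals, vecs = np.linalg.eigh(M)        # eigenvalues in ascending order
    return vecs[:, 0], q                  # normal a_r, point q_r

# Noise-free samples of the plane z = 0: the fit should recover normal (0, 0, 1).
rng = np.random.default_rng(1)
xy = rng.uniform(-1, 1, size=(50, 2))
pts = np.column_stack([xy, np.zeros(50)])
a, q = fit_plane(np.array([0.0, 0.0, 0.1]), pts)
print(np.abs(a))  # ≈ [0, 0, 1]
```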
Plane Fitting (Basic Approach)
Solving for $a_r$: Since the matrix
$\sum_i w_i\, (r_i - q_r)(r_i - q_r)^t$
is symmetric, $a_r$ must be an eigenvector (the minimizer is the eigenvector with the smallest eigenvalue).

Function Fitting (Basic Approach)
Fix an orthonormal basis $\{v_r, w_r\}$ perpendicular to $a_r$, and express each $r_i$ as the 2D sample:
$\big(\langle r_i - q_r,\, v_r \rangle,\; \langle r_i - q_r,\, w_r \rangle\big)$
with value:
$\langle r_i - q_r,\, a_r \rangle$
Now use the McLain approach to fit a quadratic polynomial $P_r(x, y)$ as the graph.

Projection (Basic Approach)
Given a set of points $\{r_i\}$ and a weighting function, for any point r:
1. Find the fitting plane $(a_r, q_r)$ at r.
2. Find the fitting polynomial $P_r(x, y)$ at r.
3. Express r as: $r = q_r + x\, v_r + y\, w_r + h\, a_r$
4. Set the projection of r to be: $\pi(r) = q_r + x\, v_r + y\, w_r + P_r(x, y)\, a_r$

Advantage: This gives a way of sending points to the surface without explicitly defining where the surface is.
Limitations: The projection operator $\pi$ is not actually a projection: $\pi(\pi(r)) \neq \pi(r)$. If r is close to the surface, $\pi(r)$ will not necessarily be mapped onto the approximating surface.

MLS Approach [Levin]
The projection operator is not actually a projection because:
1. The fitting plane computed for r is not the same as the fitting plane computed for $\pi(r)$.
2. The graph fitted to r is not the same as the graph fitted to $\pi(r)$, since the weighting coefficients change: $w(\|r_i - r\|) \neq w(\|r_i - \pi(r)\|)$.
Observation: The projection $\pi(r)$ only moves r along the normal direction of the plane.
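Putting the pieces together, one projection step of the basic approach can be sketched as follows. The Gaussian weight and the planar test data are assumptions of this sketch; the smallest-eigenvalue eigenvector is taken as the plane normal and the other two as the in-plane basis:

```python
import numpy as np

def project(r, points, h=0.5):
    """One MLS projection step (basic approach): fit a plane at r, fit a
    quadratic height field over it, move r to the graph of that quadratic."""
    w = np.exp(-(np.linalg.norm(points - r, axis=1) / h) ** 2)
    q = (w[:, None] * points).sum(axis=0) / w.sum()      # q_r
    d = points - q
    M = (d * w[:, None]).T @ d
    _, vecs = np.linalg.eigh(M)
    a, v1, v2 = vecs[:, 0], vecs[:, 1], vecs[:, 2]       # normal + in-plane basis
    # 2D coordinates and heights of the samples over the plane
    x, y, hgt = d @ v1, d @ v2, d @ a
    V = np.column_stack([np.ones_like(x), x, y, x * y, x * x, y * y])
    A = (V * w[:, None]).T @ V
    b = (V * (w * hgt)[:, None]).sum(axis=0)
    C = np.linalg.solve(A, b)                            # quadratic coefficients
    rx, ry = (r - q) @ v1, (r - q) @ v2
    Vp = np.array([1.0, rx, ry, rx * ry, rx * rx, ry * ry])
    return q + rx * v1 + ry * v2 + (C @ Vp) * a          # pi(r)

# Samples of the plane z = 0: projecting a nearby point should land on z = 0.
rng = np.random.default_rng(2)
xy = rng.uniform(-1, 1, size=(60, 2))
pts = np.column_stack([xy, np.zeros(60)])
proj = project(np.array([0.1, 0.2, 0.05]), pts)
print(proj)  # ≈ [0.1, 0.2, 0.0]
```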
MLS Approach [Levin]
Problem 1: The fitting plane computed for r is not the same as the fitting plane computed for $\pi(r)$.
Observation: The projection $\pi(r)$ only moves r along the normal direction of the plane.
Solution: Modify the definition of the best fitting plane so that it locally only depends on the line from r in the direction of $a_r$.

Problem 2: The graph fitted to r is not the same as the graph fitted to $\pi(r)$.
Solution: Modify the weighting for the best fitting function so that it only depends on $q_r$ and not on r:
$w_i = w(\|r_i - q_r\|)$

MLS Approach (Modified)
Advantages: The projection operator $\pi$ is a projection for points sufficiently close to the surface: $\pi(\pi(r)) = \pi(r)$. If r is close to the surface, $\pi(r)$ will be mapped onto the approximating surface.
Disadvantages: By changing the fitting function, it is now necessary to optimize the fitting plane over the weighting functions, which can be very difficult.