Lecture XIV: Trajectory Formation through Optimization

Contents:
Optimization Criterion: Minimum Distance, Minimum Time, Minimum Acceleration Change, Minimum Torque Change, Minimum End Point Variance
Using Motor Redundancies Efficiently

Lecture XIV: MLSC - Dr. Sethu Vijayakumar

Trajectory Planning Phases

Trajectory generation: involves computation of the best trajectory for the object.
Force distribution: involves determining the force distribution between the different actuators, a.k.a. resolving actuator redundancy.

Some approaches solve the trajectory generation and force distribution problems separately, in two phases. It has been argued that solving the two issues simultaneously, as a global optimization problem, is superior in many cases.

Kinematics refers to the geometrical and time-based properties of motion; the variables of interest are positions (e.g. joint angles or hand Cartesian coordinates) and their corresponding velocities, accelerations and higher derivatives.
Dynamics refers to the forces required to produce motion, and is therefore intimately linked to the properties of the object, such as its mass, inertia and stiffness.

Some Simple Cost Functions

Shortest Distance. Refer: F. C. Park and R. W. Brockett, Kinematic dexterity of robotic mechanisms, Int. Journal of Robotics Research, 13(1), pp. 1-15, 1994.
Minimum Acceleration. Refer: L. Noakes, G. Heinzinger, and B. Paden, Cubic splines on curved surfaces, IMA Journal of Mathematical Control & Information, 6, pp. 465-473, 1989.
Minimum Time (Bang-Bang Control). Refer: Z. Shiller, Time-energy optimal control of articulated systems with geometric path constraints, in Proc. of 1994 Intl. Conference on Robotics and Automation, pp. 2680-2685, San Diego, CA, 1994.

[Figure: position x, velocity ẋ and acceleration ẍ profiles plotted against time t]

Minimum Jerk Trajectory Planning

Proposed by Flash & Hogan (1985).
Optimization criterion: minimize the jerk in the trajectory,

C_J = (1/2) ∫_0^T [ (d^3x/dt^3)^2 + (d^3y/dt^3)^2 ] dt

for movement in the x-y plane. The minimum-jerk solution can be written as

x(t) = x0 + (x0 − xf)(15 t̂^4 − 6 t̂^5 − 10 t̂^3)
y(t) = y0 + (y0 − yf)(15 t̂^4 − 6 t̂^5 − 10 t̂^3)

where t̂ = t / t_f and (x0, y0) are the initial coordinates at t = 0.

Depends only on the kinematics of the task, and is independent of the physical structure or dynamics of the plant.
Predicts bell-shaped velocity profiles.
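As a quick check of the formulas above, the polynomial can be evaluated directly. The sketch below (plain Python; the helper names `min_jerk_pos` and `min_jerk_vel` and the unit reach from 0 to 1 are our illustrative assumptions, not the lecture's) samples one coordinate of a reach with normalized time t_f = 1 and confirms the endpoint constraints and the bell-shaped velocity profile; the y coordinate behaves identically.

```python
# Sketch of the Flash & Hogan minimum-jerk polynomial, assuming a
# normalized movement time t_f = 1 and an illustrative reach from 0 to 1.

def min_jerk_pos(x0, xf, t_hat):
    """Position at normalized time t_hat = t / t_f."""
    return x0 + (x0 - xf) * (15 * t_hat**4 - 6 * t_hat**5 - 10 * t_hat**3)

def min_jerk_vel(x0, xf, t_hat):
    """d(position)/d(t_hat): zero at both ends, peak at the midpoint."""
    return (x0 - xf) * (60 * t_hat**3 - 30 * t_hat**4 - 30 * t_hat**2)

ts = [i / 100 for i in range(101)]
xs = [min_jerk_pos(0.0, 1.0, t) for t in ts]
vs = [min_jerk_vel(0.0, 1.0, t) for t in ts]

print(xs[0], xs[-1])   # endpoints are hit exactly: 0.0 1.0
print(vs[0], vs[-1])   # zero velocity at start and end: 0.0 0.0
print(max(vs))         # bell-shaped profile peaks at mid-movement: 1.875
```

The peak normalized speed of 15/8 at t̂ = 0.5 is a fixed property of the quintic, which is why minimum-jerk reaches of any amplitude or duration share the same bell shape after rescaling.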

Minimum Torque Change Planning

Proposed by Uno, Kawato & Suzuki (1989).
Optimization criterion: minimize the change of torque,

C_T = (1/2) ∫_0^T [ (dτ1/dt)^2 + (dτ2/dt)^2 ] dt

for a two-joint arm or robot.

The Min. Jerk and Min. Torque Change cost functions are closely related, since acceleration is proportional to torque at zero speed.
No analytical solution is possible for the Min. Torque Change criterion, but an iterative solution is possible.
Like Min. Jerk, it predicts bell-shaped velocity profiles. But it also predicts that the form of the trajectory should vary across the arm's workspace.
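The remark that only an iterative solution exists can be made concrete with a toy gradient-descent sketch. Below we assume a one-joint plant with unit inertia at zero speed, so torque equals acceleration and the discretized torque-change cost reduces to a sum of squared third differences of the joint angle; the discretization, step size, and variable names are our illustrative choices, not the method of Uno, Kawato & Suzuki.

```python
# Toy iterative minimization of a discretized torque-change cost.
# Assumption: unit inertia and zero speed, so torque = acceleration and
# the change of torque over one step is a third difference of theta.

def third_diffs(theta):
    return [theta[k + 3] - 3 * theta[k + 2] + 3 * theta[k + 1] - theta[k]
            for k in range(len(theta) - 3)]

def cost(theta):
    return sum(r * r for r in third_diffs(theta))

N = 21
# Straight-line guess with clamped ends (zero start/end velocity).
theta = [min(max((k - 1) / (N - 3), 0.0), 1.0) for k in range(N)]
free = range(2, N - 2)          # endpoints and end velocities stay fixed

coef = [-1.0, 3.0, -3.0, 1.0]   # d(third diff r_k) / d(theta_{k+i})
c0 = cost(theta)
for _ in range(500):            # plain gradient descent on the free knots
    rs = third_diffs(theta)
    grad = [0.0] * N
    for k, r in enumerate(rs):
        for i, c in enumerate(coef):
            grad[k + i] += 2.0 * r * c
    for j in free:
        theta[j] -= 0.005 * grad[j]
c1 = cost(theta)

print(c0 > c1 > 0.0)            # iterations smooth the clamped kinks: True
```

For a real two-joint arm the torques come from nonlinear rigid-body dynamics, so the gradient is no longer this simple linear expression; that is what makes the full criterion analytically intractable while remaining amenable to exactly this kind of iteration.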

Min. Jerk vs Min. Torque Change

One way of resolving how humans plan their movements is to set up an experiment which can distinguish between the kinematic and dynamic plans.

Min. Jerk vs Min. Torque Change II

No perturbations at the start & end. Most studies suggest that trajectories are planned in visually-based kinematic coordinates.

Minimum Endpoint Variance Planning

Proposed by Harris & Wolpert (1998). Also called TOPS: Trajectory Optimization in the Presence of Signal-dependent noise.

Basic theory: a single physiological assumption, namely that neural signals are corrupted by noise whose variance increases with the size of the control signal. In the presence of such noise, the shape of the trajectory is selected to minimize the variance of the final end-point position.

This is biologically more plausible, since we do have access to end-point errors, as opposed to the complex optimization processes (min. jerk and min. torque change integrated over the entire movement) that the other optimization criteria suggest the brain has to solve.

Refer: Harris & Wolpert, Signal-dependent noise determines motor planning, Nature, vol. 394, pp. 780-784, 1998.
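The signal-dependent noise assumption can be illustrated on a toy linear plant. In the sketch below a point mass is driven by a bang-bang acceleration profile whose samples carry noise with standard deviation proportional to |u|; the double-integrator plant, the constant `alpha`, and the function names are our illustrative assumptions, not the Harris & Wolpert model. The endpoint variance is computed analytically, and halving the movement time doubles it, giving a speed-accuracy trade-off.

```python
# Toy endpoint-variance calculation under signal-dependent noise
# (sd of each control sample proportional to |u|); illustrative plant.

def endpoint_std(us, dt, alpha=0.1):
    """Std of the final position of a unit point mass driven by us.

    A noisy velocity kick at step k (variance (alpha*u_k)^2 * dt) shifts
    the final position by the remaining time dt*(n-1-k), so the
    independent contributions add in variance.
    """
    n = len(us)
    var = 0.0
    for k, u in enumerate(us):
        kick_var = (alpha * u) ** 2 * dt
        leverage = dt * (n - 1 - k)
        var += kick_var * leverage ** 2
    return var ** 0.5

def bang_bang(displacement, duration, n=200):
    """Accelerate for half the time, brake for the other half."""
    amp = 4.0 * displacement / duration ** 2
    return [amp if k < n // 2 else -amp for k in range(n)], duration / n

slow, dt_slow = bang_bang(1.0, duration=1.0)    # same 1 m reach, two speeds
fast, dt_fast = bang_bang(1.0, duration=0.5)

s_slow = endpoint_std(slow, dt_slow)
s_fast = endpoint_std(fast, dt_fast)
print(s_fast > s_slow)                   # faster reach scatters more: True
print(round((s_fast / s_slow) ** 2, 3))  # variance doubles when time halves: 2.0
```

In the full TOPS account the optimal trajectory is the one whose control profile minimizes this accumulated variance subject to the task constraints; here we only show why larger control signals are penalized at all.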

Testing the Internal Model Learning Hypothesis

Learning of Internal Models

Using the force manipulandum, one can create an interesting experiment in which the dynamics of the arm movement are modified in only the x-y plane. Humans are very adept at learning these changed dynamic fields, and adapt on a relatively short time scale. When the effects are removed, after-effects of the learning can be felt for some time before re-adaptation, providing evidence that we learn internal models and use them in a predictive, feedforward fashion.

Multiple Model Hypothesis

Recursive Bayes Estimation

If we explicitly index the individual experiences by n, i.e. D_n = {x_1, …, x_n}, then using Bayes' rule,

P(y | x) = P(x | y) P(y) / P(x),

we get the recursive update

P(y | D_n) = P(x_n | y) P(y | D_{n−1}) / P(x_n | D_{n−1}),

where P(x_n | D_{n−1}) = ∫ P(x_n | y) P(y | D_{n−1}) dy and P(y | D_0) = P(y).
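A minimal discrete sketch of the recursive update: the posterior after n−1 experiences serves as the prior for experience n, and the result matches the batch posterior computed from all of D_n at once. The two-coin hypothesis space and the numbers are illustrative assumptions, not from the lecture.

```python
# Recursive Bayes estimation over a discrete hypothesis y:
# P(y | D_n) is proportional to P(x_n | y) P(y | D_{n-1}), with P(y | D_0) = P(y).

def normalize(p):
    s = sum(p.values())
    return {y: v / s for y, v in p.items()}

prior = {"fair": 0.5, "biased": 0.5}                  # P(y | D_0)
likelihood = {"fair":   {"H": 0.5, "T": 0.5},         # P(x | y)
              "biased": {"H": 0.9, "T": 0.1}}

def recursive_posterior(prior, data):
    post = dict(prior)
    for x in data:                        # fold in one experience at a time
        post = normalize({y: likelihood[y][x] * post[y] for y in post})
    return post

def batch_posterior(prior, data):
    post = {y: prior[y] for y in prior}
    for x in data:                        # multiply all likelihoods, then normalize
        for y in post:
            post[y] *= likelihood[y][x]
    return normalize(post)

data = ["H", "H", "T", "H", "H"]
rec = recursive_posterior(prior, data)
bat = batch_posterior(prior, data)
print(rec["biased"] > rec["fair"])                    # data favour the biased coin: True
print(abs(rec["fair"] - bat["fair"]) < 1e-12)         # recursive == batch: True
```

The equivalence of the two routes is the point of the slide: an agent never needs to store D_n, only the running posterior, which is what makes the update attractive as a model of incremental learning.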

Multiple Model Hypothesis II

Multiple Paired Forward-Inverse Models (MPFIM).

Resolving Motor Redundancies: Inverse Kinematics

x = f_{eye+head}(θ) : R^7 → R^4

x = [x_R, x_L, y_R, y_L]^T  (retinal displacements)
θ = [Head1, Head2, Head3, EyeLP, EyeLT, EyeRP, EyeRT]^T  (eye & head displacements)

Resolving Kinematics with RMRC

Resolved Motion Rate Control, with locality in joint positions:

Δx = J(θ) Δθ

Integrate over small increments, where linearity holds, to get the complete trajectory.

Based on the collected training data, the forward kinematics in velocity space was almost completely linear, irrespective of joint position:

ẋ = J θ̇

Hence, in this case, we can use pseudo-inversion with a constant Jacobian!

Pseudoinverse & Null Space Manipulation

Given ẋ = J θ̇, what is θ̇?

θ̇ = J# ẋ + k (I − J# J) θ̇_null

where J# = J^T (J J^T)^(−1) is the pseudoinverse, and the null-space velocity is obtained by gradient descent on an optimization criterion in the null space:

θ̇_null,i = ∂g/∂θ_i,   with optimization criterion g = Σ_i w_i (θ_current,i − θ_default,i)^2

where w_i is an inertial weighting for each joint.
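The pseudoinverse and null-space projection can be checked numerically. The sketch below hand-rolls the 2x2 inverse of J J^T for an illustrative 2-task, 3-joint Jacobian, takes the gain k as 1, and uses a made-up secondary joint velocity z; all numbers and helper names are our assumptions, not values from the lecture.

```python
# Resolved motion rate control with a null-space term for a redundant
# 3-joint, 2-task system: theta_dot = J# x_dot + (I - J# J) z, with
# J# = J^T (J J^T)^-1. Jacobian entries and z are illustrative.

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def mat_vec(A, v):
    return [sum(a * b for a, b in zip(row, v)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def pseudoinverse(J):
    """Right pseudoinverse J# = J^T (J J^T)^-1 for a wide 2x3 Jacobian."""
    (a, b), (c, d) = mat_mul(J, transpose(J))          # J J^T is 2x2
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    return mat_mul(transpose(J), inv)                  # 3x2

J = [[1.0, 0.5, 0.2],
     [0.0, 1.0, 0.4]]
x_dot = [0.3, -0.1]                                    # task-space velocity

J_sharp = pseudoinverse(J)
particular = mat_vec(J_sharp, x_dot)                   # minimum-norm solution

# The projector I - J# J admits a secondary goal, e.g. gradient descent
# toward default joint angles, without moving the task variable.
z = [0.5, -0.2, 0.1]                                   # secondary joint velocity
JsJ = mat_mul(J_sharp, J)                              # 3x3
null = [z[i] - sum(JsJ[i][j] * z[j] for j in range(3)) for i in range(3)]
theta_dot = [p + n for p, n in zip(particular, null)]

print(mat_vec(J, theta_dot))   # reproduces x_dot: null term is invisible in task space
```

Because J J# = I for a full-row-rank J, the projected term satisfies J (I − J# J) z = 0, so the secondary criterion is pursued entirely inside the null space while the task-space velocity ẋ is reproduced exactly.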