Rapid Discrimination of Crystal Polymorphic Forms Using Nonlinear Optical Stokes Ellipsometric Microscopy

Paul D. Schmitt, Emma L. DeWalt, Ximeng Y. Dow, and Garth J. Simpson*

*Corresponding Author: Department of Chemistry, Purdue University, West Lafayette, IN.

Supporting Information: Example single-color Fourier coefficient images (Figure SI-1), images of in-plane crystal orientation from the OrientationJ plug-in (NIH ImageJ) (Figure SI-2), polarized laser transmittance images (Figure SI-3), and an analytical/graphical explanation of the process of linear discriminant analysis.

Figure SI-1: Raw 32-bit SHG image (upper left, only 1 of 10 polarizations shown) and 5 Fourier coefficient images (converted to voltages) A–E, shown on red, green, blue, cyan, and magenta color scales (respectively). Merging these 5 independent images into one 5-color image enables single-image visualization of the data set. Together with knowledge of the scale of each color's lookup table, this single 5-color image contains the entirety of the polarization-dependent information recovered from the NOSE measurement.
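As a rough illustration of how such a five-channel merge can be constructed (a minimal sketch, not the authors' processing code; the array contents, LUT vectors, and normalization are illustrative assumptions), the snippet below composites five grayscale coefficient images into a single RGB image by assigning each image its own additive color lookup vector:

```python
import numpy as np

# Hypothetical stand-ins for the five Fourier-coefficient images (H x W, in volts).
h, w = 128, 128
rng = np.random.default_rng(0)
coeffs = [rng.random((h, w)) for _ in range(5)]

# RGB vectors approximating the red, green, blue, cyan, and magenta LUTs.
luts = np.array([
    [1.0, 0.0, 0.0],  # red
    [0.0, 1.0, 0.0],  # green
    [0.0, 0.0, 1.0],  # blue
    [0.0, 1.0, 1.0],  # cyan
    [1.0, 0.0, 1.0],  # magenta
])

rgb = np.zeros((h, w, 3))
scales = []
for img, lut in zip(coeffs, luts):
    scale = float(img.max()) or 1.0  # record each channel's scale factor
    scales.append(scale)
    rgb += (img / scale)[..., None] * lut

rgb = np.clip(rgb, 0.0, 1.0)  # additive five-channel composite
```

Because the five colors overlap in RGB space, the merge is additive and clipped for display; retaining each channel's scale factor alongside the composite is what preserves the quantitative information noted in the caption.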

Figure SI-2: Output of orientation analysis via OrientationJ for a single FOV for each form of D-mannitol. The hue represents the in-plane rotation angle for each pixel according to the look-up table at right, with pixel intensity retained from the original images. Images are shown on different brightness scales for ease of visualization. The calculated values of phi are averaged across each crystal and then fed into an algorithm for recovery of the additional orientation angles (theta, psi), the overall SHG intensity, and the unique, non-zero, local-frame tensor elements.

Figure SI-3: Example polarized laser transmittance images for the orthorhombic (left) and monoclinic (right) forms of D-mannitol. Collection of the polarized laser transmittance is simultaneous with the acquisition of SHG, and allows recovery of information describing sample birefringence. The simultaneous acquisition of these two imaging modalities (laser transmittance and SHG) also ensures perfect registry between the two images.
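Averaging the per-pixel phi values across a crystal requires some care, since in-plane orientations are periodic (phi and phi + 180° describe the same axis). A minimal sketch of one reasonable approach — a circular mean over a labeled crystal mask, with hypothetical array names; the paper does not specify its averaging procedure — is given below:

```python
import numpy as np

def mean_crystal_phi(phi, mask):
    """Circular mean of per-pixel in-plane angles phi (radians) over one
    crystal's boolean pixel mask. Orientations are axial (phi and phi + pi
    are equivalent), so angles are doubled before averaging and halved after.
    """
    doubled = 2.0 * phi[mask]
    mean_doubled = np.arctan2(np.sin(doubled).mean(), np.cos(doubled).mean())
    return 0.5 * mean_doubled

# Hypothetical usage: a per-pixel angle map and one crystal's mask.
rng = np.random.default_rng(1)
phi_map = np.deg2rad(rng.uniform(0.0, 180.0, size=(64, 64)))
crystal_mask = np.zeros((64, 64), dtype=bool)
crystal_mask[10:30, 10:40] = True
print(np.rad2deg(mean_crystal_phi(phi_map, crystal_mask)))
```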

Description of Linear Discriminant Analysis (LDA): LDA was used to identify a single projected dimension that best resolved the sample response (polynomial coefficients, local-frame tensor elements, etc.) from each class (monoclinic and orthorhombic, A and B respectively). Data are treated as vectors in a multi-dimensional space, where the number of vector elements (e.g., 10 for the recovered polynomial coefficients) dictates the dimensionality. In LDA, the classes undergo optimal separation through maximizing the value of the Fisher linear discriminant, J, given in Equation SI-1 (refs 1, 2).

J(w) = (m_A − m_B)² / (s_A² + s_B²) (Equation SI-1)

Here, m_A and m_B are the scalar projections of the mean data vectors for each class onto the projected dimension (w), and s_A² and s_B² are the variances of each class after projection onto the same axis. The axis w was found through solving the eigenvector-eigenvalue equation shown below:

S_W⁻¹ S_B w = J w (Equation SI-2)

where the matrices S_B and S_W represent the between- and within-class variance (respectively), and are given by the equations below:

S_B = (μ_A − μ_B)(μ_A − μ_B)ᵀ (Equation SI-3)

S_W = S_W,A + S_W,B ; S_W,A = [1/(N_A − 1)] Σ_{i=1}^{N_A} (x_i − μ_A)(x_i − μ_A)ᵀ (Equation SI-4)

Here x_i indicates the i-th vector from the relevant class (N_A total vectors in class A), μ_A is the average vector from class A, and the analogous expression for S_W,B is not shown explicitly. A graphical representation of LDA for two classes of 2-dimensional data is shown below.

Figure SI-4: Left: Raw data to undergo analysis by LDA. Five 2-dimensional vectors (arranged together into matrices) make up each class of data, A and B. Middle: 2-dimensional plot of the data, along with the Fisher discriminant found from solving Equations SI-2 through SI-4. Right: The same data post-projection onto the Fisher discriminant. The data are plotted against themselves for ease in visualizing the separation.
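A minimal numerical sketch of this workflow is given below: it builds two synthetic 2-dimensional classes, forms the scatter matrices of Equations SI-3 and SI-4, and solves the eigenproblem of Equation SI-2 for the Fisher axis w. The class sizes, means, and spreads are illustrative assumptions, not the data shown in Figure SI-4.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two synthetic 2-D classes standing in for the class A (monoclinic) and
# class B (orthorhombic) responses; means and spreads are arbitrary choices.
A = rng.normal(loc=[1.0, 2.0], scale=0.5, size=(5, 2))
B = rng.normal(loc=[3.0, 0.5], scale=0.5, size=(5, 2))

mu_A, mu_B = A.mean(axis=0), B.mean(axis=0)

# Between-class scatter (Equation SI-3).
d = (mu_A - mu_B).reshape(-1, 1)
S_B = d @ d.T

# Within-class scatter (Equation SI-4), summed over both classes.
def scatter(X, mu):
    Xc = X - mu
    return Xc.T @ Xc / (len(X) - 1)

S_W = scatter(A, mu_A) + scatter(B, mu_B)

# Equation SI-2: the leading eigenvector of S_W^{-1} S_B is the Fisher axis w.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])

# Project both classes onto w; the 1-D projections should separate cleanly.
print("class A projections:", np.round(A @ w, 2))
print("class B projections:", np.round(B @ w, 2))
```

For two classes the same axis can be obtained directly as w ∝ S_W⁻¹(μ_A − μ_B), since S_B has rank one; the eigenvector route above follows Equation SI-2 as written.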

Cited References:

(1) Fisher, R. A. Ann. Hum. Genet. 1936, 7 (2), 179–188.
(2) McLachlan, G. J. Discriminant Analysis and Statistical Pattern Recognition; John Wiley & Sons: Hoboken, New Jersey, 2004.
