Supporting information for: Norovirus capsid proteins self-assemble through biphasic kinetics via long-lived stave-like intermediates


Supporting information for: Norovirus capsid proteins self-assemble through biphasic kinetics via long-lived stave-like intermediates

Guillaume Tresset, Clémence Le Cœur, Jean-François Bryche, Mouna Tatou, Mehdi Zeghal, Annie Charpilienne, Didier Poncet, Doru Constantin, and Stéphane Bressanelli

Laboratoire de Physique des Solides, Université Paris-Sud, CNRS, Orsay, France; Institut de Chimie et des Matériaux Paris-Est, Université Paris-Est, CNRS, Thiais, France; Laboratoire de Virologie Moléculaire et Structurale, CNRS, INRA, Gif-sur-Yvette, France

*To whom correspondence should be addressed.

NB2-VP1 State in Dissociation Buffer

The scattered intensity at q = 0 is given by

I_0 ≡ I(q = 0) = K(v, b) c M

where K(v, b) is a constant depending upon the specific volume v and the scattering length density b of the protein,

c is the mass concentration and M the molecular weight. K is computed from K = (v b)² / N_A, with N_A being Avogadro's number. The specific volume was determined by CRYSOL^S1 for NV-VP1 and reads v = 0.76 cm³/g. The contrast is b = r_el (ρ_protein − ρ_solvent), where r_el = 2.818 × 10⁻⁶ nm is the Thomson scattering length, and ρ_solvent = 334 nm⁻³ and ρ_protein = 426 nm⁻³ (after CRYSOL for NV-VP1) are the electron densities of the water solvent and of the protein, respectively. Note that the weak contrast b with respect to water is typical for proteins. With a concentration c = … g/cm³ measured by spectrophotometry at 280 nm (extinction coefficient ε_percent = 8.1) and I_0 = … cm⁻¹ (Figure S1) estimated from the GNOM reconstruction,^S2 we arrive at M = … kDa.

Figure S1: SAXS pattern of NB2-VP1 dimers in dissociation buffer. A reconstruction performed with the program GNOM (red line) is compared to the experimental data (black symbols). The dimer concentration was 28 µM. The inset shows the pair distribution function p(r) computed by GNOM along with the calculated radius of gyration R_g and the maximum dimension D_max.
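The route from the forward intensity to the molecular weight can be sketched numerically. The snippet below is a minimal illustration of the relation I(0) = K(v, b) c M; the concentration and intensity values are placeholders (the measured values did not survive the document conversion), while v, r_el and the electron densities are the ones quoted in the text.

```python
# Sketch of the molecular-weight determination from forward scattering,
# I(0) = K(v, b) * c * M with K = (v * b)**2 / N_A.
# c and I0 below are illustrative placeholders, not the measured values.

N_A = 6.022e23                      # Avogadro's number (1/mol)
v = 0.76                            # specific volume (cm^3/g), CRYSOL value
r_el = 2.818e-13                    # Thomson scattering length (cm)
rho_protein = 426e21                # protein electron density (1/cm^3, i.e. 426 nm^-3)
rho_solvent = 334e21                # water electron density (1/cm^3, i.e. 334 nm^-3)

b = r_el * (rho_protein - rho_solvent)   # contrast (cm^-2)
K = (v * b) ** 2 / N_A                   # cm^2 mol / g^2

c = 3.2e-3                          # hypothetical mass concentration (g/cm^3)
I0 = 0.24                           # hypothetical forward intensity (cm^-1)

M = I0 / (K * c)                    # molecular weight (g/mol)
print(round(M / 1000), "kDa")
```

With these placeholder inputs the result lands near the mass expected for a VP1 dimer, which is only a consistency check on the units (cm⁻¹ = cm²·mol/g² × g/cm³ × g/mol), not a reproduction of the published figure.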

Singular Value Decomposition (SVD) of the Matrix of Intensities

The M × N (M > N) matrix of intensities I can be expressed as the product

I = U Σ Vᵀ   (1)

where U is an M × N matrix and V is an N × N matrix, the columns of each comprising orthonormal sets of vectors (i.e. UᵀU = VᵀV = I_N, the N × N identity matrix), and Σ is an N × N diagonal matrix with non-negative diagonal elements s_i, called the singular values of I. If I is a matrix of spatio-temporal intensities in which the spectra are arranged in columns, then the columns of U are themselves spectra of the same nature, called the normalized basis spectra of I. It should be noted that these columns have no direct significance; an N-dimensional rotation is required to draw a meaningful interpretation. The singular values s_i are sorted in decreasing order, so that the matrix of intensities can be approximated at varying levels by a product of truncated matrices Ũ, Σ̃ and Ṽ. Indeed, if we retain the first r columns of U and V to build Ũ and Ṽ respectively, and we set Σ̃ as a diagonal matrix whose diagonal elements are the r largest singular values s_1 ≥ … ≥ s_r, the matrix Ĩ = Ũ Σ̃ Ṽᵀ provides the best least-squares approximation of rank r of the matrix I. Given an estimate of the measurement uncertainties, it is then possible to use the singular values to determine the rank sufficient to describe the intensities to within the experimental uncertainties via the relation

‖I − Ĩ‖² = Σ_{i=r+1}^{N} s_i² ≈ µνσ²   (2)

with σ² being the variance of the data. A good choice for the parameters µ and ν is the number of degrees of freedom for the representation which remain after the selection of r eigenvectors, i.e. µ = M − r and ν = N − r.^S3 As mentioned earlier, the columns of U and V do not hold any physical meaning and remain purely mathematical quantities. Even with a large associated singular value, the signal-to-noise ratio of a given vector can be rather low. The content of relevant information is better quantified by the
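The truncation criterion of equation (2) can be illustrated on synthetic data of known rank. In the sketch below the matrix sizes, rank and noise level are arbitrary choices, not those of the experiment; the loop simply returns the smallest rank whose residual falls below the noise budget (M − r)(N − r)σ².

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic M x N matrix of "spectra": a rank-3 signal plus Gaussian noise,
# mimicking a spatio-temporal intensity matrix with spectra in columns.
M, N, true_rank, sigma = 200, 20, 3, 0.01
signal = rng.standard_normal((M, true_rank)) @ rng.standard_normal((true_rank, N))
I = signal + sigma * rng.standard_normal((M, N))

U, s, Vt = np.linalg.svd(I, full_matrices=False)

# Smallest rank r whose truncation residual sum_{i>r} s_i^2 drops below
# the expected noise level (M - r)(N - r) sigma^2, as in equation (2).
for r in range(1, N + 1):
    residual = float(np.sum(s[r:] ** 2))
    if residual <= (M - r) * (N - r) * sigma ** 2:
        break
print(r)
```

On this synthetic example the criterion typically recovers a rank equal (or very close) to the planted rank of 3, mirroring the rank-3 conclusion drawn from Figure S2.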

autocorrelations defined by^S3

C(U_i) = Σ_{j=1}^{M−1} U_{j,i} U_{j+1,i},   C(V_i) = Σ_{j=1}^{N−1} V_{j,i} V_{j+1,i}   (3)

An autocorrelation value close to one indicates a high signal-to-noise ratio for the given vector (good information content), while a noisy vector (poor information content) will have rapid row-to-row variations and consequently an autocorrelation much less than one, possibly negative. Figure S2 illustrates an SVD analysis performed on the matrix of intensities I. The noise variance was taken as the mean value of the variance over q. Clearly, a rank of 3 was sufficient to reliably describe the data, both in terms of noise variance and autocorrelation of the eigenvectors.

Figure S2: SVD analysis of the matrix of intensities. The graph shows the residual of the truncated matrix of rank r with respect to the initial matrix, as given by equation (2) (blue dots). The residual is compared to the experimental noise variance weighted by the number of remaining degrees of freedom (red line). The figures on the right side are the five largest singular values and the autocorrelations of the associated eigenvectors.
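The behavior of the autocorrelation measure in equation (3) can be sketched with two synthetic stand-ins for basis spectra: a smooth vector (high information content) and pure noise.

```python
import numpy as np

def autocorrelation(col):
    # Row-to-row autocorrelation of a normalized singular vector,
    # C = sum_j col[j] * col[j+1], as in equation (3).
    return float(np.sum(col[:-1] * col[1:]))

rng = np.random.default_rng(1)
m = 500

# A smooth, slowly varying vector versus a purely noisy one,
# both normalized as columns of U would be.
smooth = np.sin(np.linspace(0.0, np.pi, m))
smooth /= np.linalg.norm(smooth)
noise = rng.standard_normal(m)
noise /= np.linalg.norm(noise)

print(autocorrelation(smooth))   # close to one for a smooth vector
print(autocorrelation(noise))    # scattered around zero for noise
```

The smooth vector yields an autocorrelation essentially equal to one, while the noisy vector gives a value near zero, which is the signature used in Figure S2 to separate meaningful eigenvectors from noise.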

Nucleation-Elongation Kinetic Model

A classical nucleation-elongation model was tested on our experimental data. The kinetic scheme is depicted in Figure S3. It consisted of a nucleation step of order α assembling free dimers into a nucleus. Afterwards, each of these nuclei behaved as a site for a polymerization-like reaction in which the remaining free dimers were captured to complete the capsid. Stoichiometric considerations impose that the elongation step be made of 90 − α successive additions of free dimers.

Figure S3: Nucleation-elongation kinetic scheme. D and C denote dimers and capsids respectively, while k_+ and k_− are the forward and backward reaction rates. I_α is an intermediate made up of α dimers.

Global fitting was carried out with varying values of α, and the form factors of dimers and capsids were fixed. Table S1 summarizes the results: for a wide range of α values, the figures of merit χ² and R were above 5 and 20%, respectively. The fits were therefore deemed not acceptable and this kinetic model was ruled out.

Table S1: Results of global fitting for a nucleation-elongation kinetic model.

α | χ² | R (%) | ρ_M | ρ_V

Figure S4: Schematic representation of the intermediate species in the hypothesis of an assembly around the fivefold axes. The number α of dimers constituting each intermediate is indicated above its structure. The dimers assembled around the fivefold axes are colored in blue and the dimers connecting the pentamers of dimers are colored in red.

Figure S5: Comparison between the extracted form factor of intermediates and models constructed from the crystal structure of NV-VLPs (PDB reference 1IHM). Extracted data (symbols) were smoothed by a GNOM reconstruction (blue). The form factors of the 22-mer (red), 20-mer (green) and 10-mer (black) models were calculated with the program CRYSOL. The corresponding structures are depicted on the right with the conventions used in Figure S4. As can be seen, the form factor of the intermediates is close to that of the 22-mer model both in terms of radius of gyration (108 Å for NB2 and 96 Å for NV) and shape. The radii of gyration for the 20-mer and 10-mer models are 77 Å and 60 Å, respectively.
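The radii of gyration quoted above come from CRYSOL applied to atomic models. As an illustration of the underlying definition only, a minimal unweighted R_g computation is sketched below; the ring of points is a hypothetical test geometry, not a capsid model.

```python
import math

def radius_of_gyration(coords):
    # Unweighted R_g: root-mean-square distance of the points
    # from their center of mass.
    n = len(coords)
    cx = sum(p[0] for p in coords) / n
    cy = sum(p[1] for p in coords) / n
    cz = sum(p[2] for p in coords) / n
    return math.sqrt(sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2
                         for p in coords) / n)

# Sanity check: points evenly spaced on a ring of radius R have R_g = R.
R, n = 100.0, 360
ring = [(R * math.cos(2 * math.pi * k / n), R * math.sin(2 * math.pi * k / n), 0.0)
        for k in range(n)]
print(radius_of_gyration(ring))
```

For real scattering models the electron-density weighting and hydration layer handled by CRYSOL matter, so this plain geometric R_g is only a first-order check on a bead model.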

References

(S1) Svergun, D. I.; Barberato, C.; Koch, M. H. J. J. Appl. Crystallogr. 1995, 28, 768–773.
(S2) Svergun, D. I. J. Appl. Crystallogr. 1992, 25, 495–503.
(S3) Henry, E. R.; Hofrichter, J. Methods Enzymol. 1992, 210, 129–192.


More information

ID14-EH3. Adam Round

ID14-EH3. Adam Round Bio-SAXS @ ID14-EH3 Adam Round Contents What can be obtained from Bio-SAXS Measurable parameters Modelling strategies How to collect data at Bio-SAXS Procedure Data collection tests Data Verification and

More information

Linear Algebra. Session 12

Linear Algebra. Session 12 Linear Algebra. Session 12 Dr. Marco A Roque Sol 08/01/2017 Example 12.1 Find the constant function that is the least squares fit to the following data x 0 1 2 3 f(x) 1 0 1 2 Solution c = 1 c = 0 f (x)

More information

Lecture: Face Recognition and Feature Reduction

Lecture: Face Recognition and Feature Reduction Lecture: Face Recognition and Feature Reduction Juan Carlos Niebles and Ranjay Krishna Stanford Vision and Learning Lab 1 Recap - Curse of dimensionality Assume 5000 points uniformly distributed in the

More information

Fast-pulsing NMR techniques for the detection of weak interactions: successful natural abundance probe of hydrogen bonds in peptides

Fast-pulsing NMR techniques for the detection of weak interactions: successful natural abundance probe of hydrogen bonds in peptides Fast-pulsing NMR techniques for the detection of weak interactions: successful natural abundance probe of hydrogen bonds in peptides A. Altmayer-Henzien, a V. Declerck, a D. J. Aitken, a E. Lescop, b D.

More information

Stat 159/259: Linear Algebra Notes

Stat 159/259: Linear Algebra Notes Stat 159/259: Linear Algebra Notes Jarrod Millman November 16, 2015 Abstract These notes assume you ve taken a semester of undergraduate linear algebra. In particular, I assume you are familiar with the

More information

Linear Least Squares. Using SVD Decomposition.

Linear Least Squares. Using SVD Decomposition. Linear Least Squares. Using SVD Decomposition. Dmitriy Leykekhman Spring 2011 Goals SVD-decomposition. Solving LLS with SVD-decomposition. D. Leykekhman Linear Least Squares 1 SVD Decomposition. For any

More information

Conceptual Questions for Review

Conceptual Questions for Review Conceptual Questions for Review Chapter 1 1.1 Which vectors are linear combinations of v = (3, 1) and w = (4, 3)? 1.2 Compare the dot product of v = (3, 1) and w = (4, 3) to the product of their lengths.

More information

14 Singular Value Decomposition

14 Singular Value Decomposition 14 Singular Value Decomposition For any high-dimensional data analysis, one s first thought should often be: can I use an SVD? The singular value decomposition is an invaluable analysis tool for dealing

More information

Machine Learning (Spring 2012) Principal Component Analysis

Machine Learning (Spring 2012) Principal Component Analysis 1-71 Machine Learning (Spring 1) Principal Component Analysis Yang Xu This note is partly based on Chapter 1.1 in Chris Bishop s book on PRML and the lecture slides on PCA written by Carlos Guestrin in

More information

COMP 558 lecture 18 Nov. 15, 2010

COMP 558 lecture 18 Nov. 15, 2010 Least squares We have seen several least squares problems thus far, and we will see more in the upcoming lectures. For this reason it is good to have a more general picture of these problems and how to

More information

Lecture Notes 2: Matrices

Lecture Notes 2: Matrices Optimization-based data analysis Fall 2017 Lecture Notes 2: Matrices Matrices are rectangular arrays of numbers, which are extremely useful for data analysis. They can be interpreted as vectors in a vector

More information

Preliminary Examination, Numerical Analysis, August 2016

Preliminary Examination, Numerical Analysis, August 2016 Preliminary Examination, Numerical Analysis, August 2016 Instructions: This exam is closed books and notes. The time allowed is three hours and you need to work on any three out of questions 1-4 and any

More information

COMPUTATIONAL ISSUES RELATING TO INVERSION OF PRACTICAL DATA: WHERE IS THE UNCERTAINTY? CAN WE SOLVE Ax = b?

COMPUTATIONAL ISSUES RELATING TO INVERSION OF PRACTICAL DATA: WHERE IS THE UNCERTAINTY? CAN WE SOLVE Ax = b? COMPUTATIONAL ISSUES RELATING TO INVERSION OF PRACTICAL DATA: WHERE IS THE UNCERTAINTY? CAN WE SOLVE Ax = b? Rosemary Renaut http://math.asu.edu/ rosie BRIDGING THE GAP? OCT 2, 2012 Discussion Yuen: Solve

More information

Singular Value Decomposition

Singular Value Decomposition Chapter 5 Singular Value Decomposition We now reach an important Chapter in this course concerned with the Singular Value Decomposition of a matrix A. SVD, as it is commonly referred to, is one of the

More information

MAXIMUM ENTROPIES COPULAS

MAXIMUM ENTROPIES COPULAS MAXIMUM ENTROPIES COPULAS Doriano-Boris Pougaza & Ali Mohammad-Djafari Groupe Problèmes Inverses Laboratoire des Signaux et Systèmes (UMR 8506 CNRS - SUPELEC - UNIV PARIS SUD) Supélec, Plateau de Moulon,

More information

sodium ion battery synthesized by a soft chemistry route

sodium ion battery synthesized by a soft chemistry route Supporting Information Title: γ-na 0.96 V 2 O 5 : a new competitive cathode material for sodium ion battery synthesized by a soft chemistry route Nicolas Emery 1,, Rita Baddour-Hadjean 1, Dauren Batyrbekuly

More information

GI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis. Massimiliano Pontil

GI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis. Massimiliano Pontil GI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis Massimiliano Pontil 1 Today s plan SVD and principal component analysis (PCA) Connection

More information

Chemometrics. Matti Hotokka Physical chemistry Åbo Akademi University

Chemometrics. Matti Hotokka Physical chemistry Åbo Akademi University Chemometrics Matti Hotokka Physical chemistry Åbo Akademi University Linear regression Experiment Consider spectrophotometry as an example Beer-Lamberts law: A = cå Experiment Make three known references

More information

Data Mining and Analysis: Fundamental Concepts and Algorithms

Data Mining and Analysis: Fundamental Concepts and Algorithms Data Mining and Analysis: Fundamental Concepts and Algorithms dataminingbook.info Mohammed J. Zaki 1 Wagner Meira Jr. 2 1 Department of Computer Science Rensselaer Polytechnic Institute, Troy, NY, USA

More information

Fundamentals of Matrices

Fundamentals of Matrices Maschinelles Lernen II Fundamentals of Matrices Christoph Sawade/Niels Landwehr/Blaine Nelson Tobias Scheffer Matrix Examples Recap: Data Linear Model: f i x = w i T x Let X = x x n be the data matrix

More information

Bare minimum on matrix algebra. Psychology 588: Covariance structure and factor models

Bare minimum on matrix algebra. Psychology 588: Covariance structure and factor models Bare minimum on matrix algebra Psychology 588: Covariance structure and factor models Matrix multiplication 2 Consider three notations for linear combinations y11 y1 m x11 x 1p b11 b 1m y y x x b b n1

More information

Multivariate Statistical Analysis

Multivariate Statistical Analysis Multivariate Statistical Analysis Fall 2011 C. L. Williams, Ph.D. Lecture 4 for Applied Multivariate Analysis Outline 1 Eigen values and eigen vectors Characteristic equation Some properties of eigendecompositions

More information

arxiv: v5 [math.na] 16 Nov 2017

arxiv: v5 [math.na] 16 Nov 2017 RANDOM PERTURBATION OF LOW RANK MATRICES: IMPROVING CLASSICAL BOUNDS arxiv:3.657v5 [math.na] 6 Nov 07 SEAN O ROURKE, VAN VU, AND KE WANG Abstract. Matrix perturbation inequalities, such as Weyl s theorem

More information