Medical Visualization - Tensor Visualization. J.-Prof. Dr. Kai Lawonn
1 Medical Visualization - Tensor Visualization J.-Prof. Dr. Kai Lawonn
2 Lecture is partially based on the lecture by Prof. Thomas Schultz 2
3 What is a Tensor? A tensor is a multilinear transformation that maps vectors to a scalar: T : R^n × ... × R^n → R (r arguments). T is a tensor of rank r. 3
4 What is a linear map? Let T be a linear map. Then, the following two conditions are satisfied: T(u + v) = T(u) + T(v) and T(c·v) = c·T(v). Multilinear means linearity is fulfilled for every argument. 4
5-10 Examples: Let the rank be r=1 and n=2. [Two example maps were shown on the slides.] The first map is a tensor because it is linear; the second map is not linear and therefore it is not a tensor!
11 Tensor: Let the rank be r=1. The tensor can be written as T(v) = Σ_i T_i v_i, with coefficients T_i = T(e_i). 11
12 Examples: Let the rank be r=1 and n=3. 12
13-18 Examples: Let the rank be r=2 and n=2. [Two example maps were shown on the slides.] The first map is a tensor; the second fails linearity (check the case c = 0) and is therefore NOT a tensor!
19-20 Tensor: Let the rank be r=2. The tensor can be written as T(u, v) = Σ_{i,j} T_ij u_i v_j = u^T T v. 20
21 Geometric Intuition of Linear Maps: A linear (non-singular) transform A always takes (hyper-)spheres to (hyper-)ellipses. (From: Thomas Schultz) 21
22 Geometric Intuition of Linear Maps: Thus, one good way to understand what A does is to find which vectors are mapped to the main axes of the ellipsoid. (From: Thomas Schultz) 22
23 Geometric Intuition of Linear Maps: If A is symmetric: A = V S V^T, with V orthogonal (rows and columns are orthogonal unit vectors; transpose equals inverse: V^T = V^{-1}). The eigenvectors of A are the axes of the ellipse. (From: Thomas Schultz) 23
24 Symmetric matrix: Eigendecomposition. In this case A is just a scaling transformation. The eigendecomposition of A tells us which orthogonal axes it scales and by how much: A v_i = s_i v_i. 24
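As a minimal numpy sketch (the matrix A is made up for illustration), the eigendecomposition of a symmetric matrix can be checked directly:

```python
import numpy as np

# A symmetric example matrix; it scales space along orthogonal eigenvector axes.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# eigh is specialized for symmetric matrices; eigenvalues come back sorted ascending.
s, V = np.linalg.eigh(A)

# Each eigenvector is mapped to a scaled copy of itself: A v_i = s_i v_i.
for i in range(2):
    assert np.allclose(A @ V[:, i], s[i] * V[:, i])

# V is orthogonal (V^T = V^-1), and A = V diag(s) V^T.
assert np.allclose(V @ np.diag(s) @ V.T, A)
```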
25 General Linear Transformations: SVD (Singular Value Decomposition). In general, A will also contain rotations, not just scales: A = U S V^T. 25
26 General Linear Transformations: SVD. With U and V orthonormal: AV = US, i.e., A v_i = s_i u_i with s_i ≥ 0. 26
27 SVD more formally: SVD exists for any matrix. Formal definition: For square matrices A ∈ R^{n×n}, there exist orthogonal matrices U, V ∈ R^{n×n} and a diagonal matrix S, such that all the diagonal values s_i of S are nonnegative and A = U S V^T. 27
28 SVD more formally: The diagonal values of S (s_1, ..., s_n) are called the singular values. They are usually sorted: s_1 ≥ s_2 ≥ ... ≥ s_n. The columns of U (u_1, ..., u_n) are called the left singular vectors; they are the axes of the ellipsoid. The columns of V (v_1, ..., v_n) are called the right singular vectors; they are the pre-images of the axes of the ellipsoid. A = U S V^T. 28
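A short sketch (the example matrix is made up) illustrating these statements with numpy's SVD — singular values sorted descending, and right singular vectors mapping to scaled left singular vectors:

```python
import numpy as np

# Example: a rotation by 90 degrees followed by anisotropic scaling.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
A = np.diag([2.0, 0.5]) @ R

U, S, Vt = np.linalg.svd(A)

# Singular values are nonnegative and sorted descending: s_1 >= s_2 >= 0.
assert S[0] >= S[1] >= 0

# Right singular vectors map to scaled left singular vectors: A v_i = s_i u_i.
for i in range(2):
    assert np.allclose(A @ Vt[i], S[i] * U[:, i])
```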
29 Reduced SVD: For rectangular matrices, we have two forms of SVD. In the reduced SVD, A = U S V^T with the columns of U merely orthonormal; this is the cheaper form for computation and storage. 29
30 Full SVD: We can complete U to a full orthogonal matrix and pad S with zeros accordingly: A = U S V^T. 30
31 Spectra and Diagonalization: If A is symmetric, the eigenvectors are orthogonal (and there is always an eigenbasis): A = U Λ U^T, with A u_i = λ_i u_i. 31
32 Properties of Tensors: (Square) tensors can be symmetric or unsymmetric (represented by a symmetric / unsymmetric matrix); definite or indefinite (positive semi-definite: v^T M v ≥ 0 for all v, easily seen from the non-negative eigenvalues of M; strictly positive definite: v^T M v > 0 for all v ≠ 0); second-order (matrix representation) or higher-order (multi-way array, outside our scope). 32
33 Summary: Introduction to Tensors Tensors are linear maps between vector spaces Can be represented as matrices Decomposition in terms of eigenvalues and eigenvectors (if symmetric) or SVD (in general) Visualization by their effect on the unit sphere (tensor ellipsoid) Applications include geometry, flow fields, material science, diffusion tensor imaging 33
34 Diffusion Tensor MRI 34
35 Introduction to Diffusion MRI: Goal: Investigate the microstructure of biological tissue using Magnetic Resonance Imaging (MRI). Challenge: Voxel size (about 2 mm × 1 mm × 1 mm) is far too large to resolve the structures of interest; axons (nerve fibers) have diameters on the order of μm. 35
36 Introduction to Diffusion MRI: Approach: Use water molecules as a contrast agent. Exploits their spontaneous heat motion at the desired spatial scale. Analogy: Observe diffusion of ink on paper (Kleenex vs. newspaper). (Images from Gordon Kindlmann) 36
37 Introduction to Diffusion MRI Water molecules at any temperature above absolute zero undergo Brownian motion or molecular diffusion In free water, this motion is completely random, and water molecules move with equal probability in all directions (isotropic diffusion) In the presence of constraining structures, such as the axons connecting neurons together, water molecules move more often in the same direction than they do across these structures (anisotropic diffusion) 37
38 Introduction to Diffusion Weighting Standard MRI Diffusion Weighted MRI
39 Stejskal-Tanner Equation: S = S_0 · exp(−b·d), where S_0 is the unweighted signal, b the diffusion weighting, and d the apparent diffusion coefficient.
40 Diffusion Tensor Imaging: When at least six directions are acquired, d becomes a 3×3 symmetric diffusion tensor D. Showing the tensor field as a matrix of scalar fields displays all information, but rarely permits an effective interpretation. 40
41 Vertical Gradient Horizontal Gradient
42 Vertical Gradient
43 Diffusion Tensor Imaging: The diffusion tensor D is a 3×3 symmetric matrix.
44 Estimating Diffusion Tensors: Stejskal-Tanner equation: S_i = S_0 · exp(−b g_i^T D g_i). Taking the logarithm and solving for D produces a system of linear equations (one per measurement i): g_i^T D g_i = ln(S_0 / S_i) / b. D is a symmetric 3×3 matrix, i.e., six free variables. Six measurements: exact solution. More measurements: least-squares solution. 44
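A sketch of the least-squares fit described above, on synthetic noiseless data (the ground-truth tensor, b-value, and gradient set here are made up for illustration; a real pipeline would use the scanner's gradient table and handle noise):

```python
import numpy as np

# Hypothetical ground-truth diffusion tensor (symmetric, positive definite), mm^2/s.
D_true = 1e-3 * np.array([[1.5, 0.2, 0.0],
                          [0.2, 0.8, 0.1],
                          [0.0, 0.1, 0.5]])

b, S0 = 1000.0, 1.0

# Seven gradient directions (more than six -> least-squares solution).
g = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]], float)
g /= np.linalg.norm(g, axis=1, keepdims=True)

# Simulated noiseless measurements via the Stejskal-Tanner equation.
S = S0 * np.exp(-b * np.einsum('ij,jk,ik->i', g, D_true, g))

# One linear equation per measurement: ln(S0 / S_i) / b = g_i^T D g_i.
# Design matrix over the six free entries (Dxx, Dyy, Dzz, Dxy, Dxz, Dyz).
G = np.column_stack([g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
                     2 * g[:, 0] * g[:, 1],
                     2 * g[:, 0] * g[:, 2],
                     2 * g[:, 1] * g[:, 2]])
d, *_ = np.linalg.lstsq(G, np.log(S0 / S) / b, rcond=None)

# Reassemble the symmetric tensor from its six free entries.
D_est = np.array([[d[0], d[3], d[4]],
                  [d[3], d[1], d[5]],
                  [d[4], d[5], d[2]]])
```

On noiseless data the least-squares solution recovers the tensor exactly; with noise it minimizes the squared residual of the log-signals.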
45 Eigenvector Decomposition: The diffusion tensor D decomposes into: 3 eigenvalues λ1 ≥ λ2 ≥ λ3 (if D is positive definite: all λ > 0; note: diffusivity is a non-negative physical quantity, but due to measurement noise we might obtain S_0 < S(d), which can lead to a D with negative λ's) and 3 orthogonal eigenvectors e1 / e2 / e3, where e1 indicates the single main fiber orientation. The axes of the tensor ellipsoid are aligned with the eigenvectors and scaled by the eigenvalues.
46 Summary: Diffusion Tensor MRI: Diffusion Tensor MRI takes spatially resolved measurements of molecular heat motion. Diffusion leads to MR signal attenuation, often assumed to follow an exponential law, parameterized by an apparent diffusion coefficient. The diffusion tensor captures directionally dependent (anisotropic) diffusivity. Often applied to the (human) brain. Free diffusion is symmetric (v and −v equally likely); diffusion MRI currently cannot reliably detect non-symmetric diffusion. 46
47 Tensor Field Visualization (Tensor Glyphs) 47
48 Overview of Tensor Field Visualization: Glyph-Based (Ellipsoid Glyphs, Westin's Glyph, Superquadric Glyphs); Derived Scalar Fields (Slices, DVR, Isosurfaces); Derived Vector Fields (Streamlines, LIC, Topology). 48
49 Overview of Tensor Field Visualization: Derived Scalar Fields (Slices, DVR, Isosurfaces) and Derived Vector Fields (Streamlines, LIC, Topology) reduce the information content and can be used to derive dense visualizations. 49
50 Overview of Tensor Field Visualization: Glyph-Based approaches (Ellipsoid Glyphs, Westin's Glyph, Superquadric Glyphs) convey the full information in a tensor, but only at discrete points in space (i.e., sparse visualization). 50
51 Visualizing Individual Coefficients Simplest idea: Show tensor field as a matrix of scalar fields Shows all information Rarely permits an effective interpretation 51
52 Tensor Ellipsoid: A glyph is a geometric object whose shape, size, orientation, and color convey the data. Common example for tensor data: the ellipsoid, with axes aligned with the eigenvectors and scaled with the eigenvalues. Implicit equation: x^T D^{−2} x = 1.
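A minimal sketch of the tensor ellipsoid (the tensor D is an example): map unit-sphere samples through D and check that every surface point satisfies the implicit equation x^T D^{−2} x = 1.

```python
import numpy as np

# Example symmetric positive-definite tensor (diagonal for simplicity).
D = np.diag([2.0, 1.0, 0.5])

# Sample the unit sphere...
theta = np.linspace(0.0, np.pi, 16)
phi = np.linspace(0.0, 2.0 * np.pi, 32)
t, p = np.meshgrid(theta, phi)
sphere = np.stack([np.sin(t) * np.cos(p),
                   np.sin(t) * np.sin(p),
                   np.cos(t)], axis=-1)

# ...and map each point u through D; the image {D u : ||u|| = 1} is the glyph surface.
ellipsoid = sphere @ D.T

# Every mapped point x satisfies x^T D^-2 x = 1.
Dinv2 = np.linalg.inv(D @ D)
vals = np.einsum('...i,ij,...j->...', ellipsoid, Dinv2, ellipsoid)
assert np.allclose(vals, 1.0)
```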
53 Examples 53
54 A Classic Use of Glyphs: Glyphs are useful to quickly assess data quality, identify artifacts, and inspect the data immediately, without complex visualization algorithms. (Images courtesy of Gordon Kindlmann: flipped z coordinate vs. corrected sign)
55 Principles for Tensor Glyph Design: Faithful and expressive visualization requires: Preservation of symmetry: the glyph should have the same symmetries as the tensor. Continuity: small changes in the tensor should lead to small changes in the glyph. Disambiguity: different tensors should map to visually distinguishable glyphs. 55
56 Westin Measures: Three rotationally invariant metrics allow us to differentiate the three possible types of anisotropy (Westin et al., 1997): c_l = (λ1 − λ2) / (λ1 + λ2 + λ3), c_p = 2(λ2 − λ3) / (λ1 + λ2 + λ3), c_s = 3λ3 / (λ1 + λ2 + λ3). 57
57 Westin Measures: These metrics add up to one and can thus be used to define a barycentric space of diffusion tensor shapes, with corners c_l = 1, c_p = 1, and c_s = 1. 58
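A small sketch of the Westin measures as defined above, checking their barycentric property on made-up eigenvalues:

```python
import numpy as np

def westin_measures(evals):
    """Westin's c_l, c_p, c_s from the eigenvalues of a diffusion tensor."""
    l1, l2, l3 = sorted(evals, reverse=True)   # lambda1 >= lambda2 >= lambda3
    tr = l1 + l2 + l3
    cl = (l1 - l2) / tr          # linear anisotropy
    cp = 2.0 * (l2 - l3) / tr    # planar anisotropy
    cs = 3.0 * l3 / tr           # isotropy (spherical)
    return cl, cp, cs

# The three measures are barycentric coordinates: they always sum to one.
cl, cp, cs = westin_measures([1.8, 0.4, 0.3])
assert np.isclose(cl + cp + cs, 1.0)
```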
58 Box and Cylinder Glyphs Box Glyphs Cylinder Glyphs 59
59 Superquadric Tensor Glyphs Ellipsoids suffer from visual ambiguities: Superquadric Glyphs greatly reduce them: Images taken from Kindlmann [2004]
60 Superquadric Shape Space: Superquadrics are given in terms of powers such as cos^α θ, where exponentiation is defined to be sign preserving: cos^α θ := sign(cos θ) · |cos θ|^α. 61
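A sketch of sign-preserving exponentiation together with one common z-axis-aligned superquadric parameterization (the parameterization details here are an assumption for illustration; the slide's exact formula is not reproduced in the transcript):

```python
import numpy as np

def spow(x, a):
    """Sign-preserving exponentiation: sign(x) * |x|^a."""
    return np.sign(x) * np.abs(x) ** a

def superquadric(theta, phi, alpha, beta):
    """Point on a superquadric surface (z-axis-aligned parameterization)."""
    return np.array([spow(np.cos(phi), alpha) * spow(np.sin(theta), beta),
                     spow(np.sin(phi), alpha) * spow(np.sin(theta), beta),
                     spow(np.cos(theta), beta)])

# Without the sign-preserving definition, cos(phi)^alpha would be undefined
# (or complex) for negative cos(phi) and non-integer alpha.
assert spow(-2.0, 2.0) == -4.0

# For alpha = beta = 1 the superquadric reduces to the unit sphere.
p = superquadric(0.7, 1.2, 1.0, 1.0)
assert np.isclose(np.linalg.norm(p), 1.0)
```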
61 The Idea Behind Superquadric Glyphs Ellipsoids are transformations of the sphere Superquadrics smoothly interpolate between sphere, cylinder, and box Ellipsoids Superquadrics
62 Superquadric Parameters from Tensor Shape: If c_l < c_p, we obtain the basic superquadric tensor glyph shape as follows: α = (1 − c_l)^γ, β = (1 − c_p)^γ. 63
63 Superquadric Glyphs: Sharpness: Superquadric glyphs combine advantages of boxes, cylinders, and ellipsoids. Edges give strong visual cues for orientation; round shapes are used when the orientation is not clearly defined. The sharpness parameter γ decides how quickly sharp features develop (e.g., γ = 1.5 vs. γ = 6). 64
64 Superquadric Parameters when c_l ≥ c_p: If c_l ≥ c_p, we use an alternative family of superquadrics that is defined to be symmetric around the x axis: α = (1 − c_p)^γ, β = (1 − c_l)^γ. 65
65 Superquadric Glyphs: Final Equation To obtain final glyph, superquadric geometry is scaled by eigenvalues and rotated so that principal axes align with eigenvectors: scaling rotation normalized eigenvalues basis shape 66
66 Coloring For Indefinite Tensors: We color each point x on the glyph by sign(x^T D x): positive → red, negative → blue. This satisfies preservation of symmetry, continuity, and disambiguity. 67
67 The Lune of Tensor Shape 68
71 Glyphs on Regular Grid: For comparison, glyphs placed on a regular grid. (Image taken from Kindlmann et al. [2006]) 76
72 Glyph Packing: Glyph packing distributes the glyphs evenly across the image. (Image taken from Kindlmann et al. [2006]) 77
73 Glyphs: Benefits and Drawbacks: Unlike scalar- or vector-based visualizations, glyphs convey the full information present in the tensor, but only show the field at discrete points in space. Glyphs work well for 2D slices, but easily create an illegible mass in 3D (due to occlusion). In 3D, culling of tensors is necessary, e.g., based on an anisotropy threshold (in the example: c_l + c_p ≥ 0.5). 78
74 Visualizing Derived Scalar Fields 79
75 Scalar Invariants: An invariant is a scalar quantity that is a function of the tensor and does not change under rotation / changes of the frame of reference. Eigenvalues / singular values are invariant, and so are all measures that can be expressed as functions of them. Many tensor visualization methods focus on symmetric tensors; the eigenvalues parameterize the shape of symmetric tensors. 80
76 Frobenius Norm: Defined like the l2 vector norm: ||T||_F = sqrt(Σ_{i,j} T_ij²). For symmetric T, this equals the l2 norm of the eigenvalues: sqrt(λ1² + λ2² + λ3²).
77 Matrix Trace / Mean Diffusivity Matrix Trace Sum of diagonal elements: tr(t) = T xx +T yy +T zz For symmetric T, same as sum of eigenvalues: λ 1 +λ 2 +λ 3 Mean Diffusivity (MD) Average eigenvalue: tr(t)/3 Interpretation in DT-MRI: Average diffusivity (over all directions) 82
78 Fractional Anisotropy (FA) Fractional Anisotropy Quantifies the degree of anisotropy Correlates with fiber density / integrity Also correlates with orientation dispersion Based on variance of eigenvalues Normalized to [0,1] (if positive definite) 83
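The two scalar measures above can be sketched directly from the eigenvalues, using the standard variance-based FA formula (normalization constant sqrt(3/2) so that FA lies in [0, 1]):

```python
import numpy as np

def mean_diffusivity(evals):
    """MD: average eigenvalue, tr(D) / 3."""
    return float(np.mean(evals))

def fractional_anisotropy(evals):
    """FA: normalized standard deviation of the eigenvalues, in [0, 1]."""
    l = np.asarray(evals, dtype=float)
    md = l.mean()
    return float(np.sqrt(1.5 * np.sum((l - md) ** 2) / np.sum(l ** 2)))

# Isotropic diffusion -> FA = 0; perfectly linear diffusion -> FA = 1.
assert fractional_anisotropy([1.0, 1.0, 1.0]) == 0.0
assert np.isclose(fractional_anisotropy([1.0, 0.0, 0.0]), 1.0)
```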
79 Images from Gordon Kindlmann Eigenvalue Space Illustration: Isosurfaces of Frobenius norm, MD, and FA in 3D space spanned by eigenvalues Frobenius Norm MD FA 84
80 Image from Gordon Kindlmann FA vs. Tensor Mode FA and mode are not independent Maximum FA=1 only achieved with mode=1 86
81 Westin Measures: Westin's c_l, c_p, c_s quantify the extent to which the tensor ellipsoid is linear / planar / spherical: c_l = (λ1 − λ2) / (λ1 + λ2 + λ3), c_p = 2(λ2 − λ3) / (λ1 + λ2 + λ3), c_s = 3λ3 / (λ1 + λ2 + λ3). Used in the design of superquadric tensor glyphs. 87
82 Volume Rendering Tensor Fields Define color / opacity based on scalar measures such as Westin s shape Shading based on gradient of opacity Approximation: Pre-compute derived scalar field on a grid 88
83 Isosurfaces in Tensor Fields Isosurface of c l Shows outline of white matter core Segmented to highlight different fiber bundles [Schultz et al. 2007] 89
84 Summary: Derived Scalar Measures Scalar Invariants are independent of the chosen frame of reference Define the shape of the tensor Can be expressed in terms of eigenvalues Measure overall norm, amount of anisotropy (e.g., FA), type of anisotropy (e.g., mode or Westin s measures) Can be used to apply standard volume visualization to tensors (e.g., volume rendering, isosurfaces) 90
85 Visualizing Derived Vector Fields 91
86 Visualizing Eigenvector Fields: Most methods for vector field visualization can be applied to eigenvector fields. Keep in mind: Unlike proper vectors, eigenvectors (i.e., v such that Mv = λv) do not have an intrinsic norm or orientation; any scaled version αv (α ≠ 0) is also an eigenvector, in particular −v. Keep in mind: When two (or more) eigenvalues are equal, v can be rotated freely in their two- (or higher) dimensional eigenspace. 92
87 Color Coded Eigenvector Maps: Standard color coding maps XYZ to RGB. Usually applied to the principal eigenvector: R ∝ |v_1x|, G ∝ |v_1y|, B ∝ |v_1z|, usually modulated by FA or c_l. Note: FA can be large even for planar anisotropy! 93
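A minimal sketch of this color coding (the function name and weight parameter are illustrative); absolute component values handle the sign ambiguity of the eigenvector, so v and −v get the same color:

```python
import numpy as np

def direction_to_rgb(e1, weight=1.0):
    """Map a unit principal eigenvector to an RGB color, modulated by a weight
    (e.g., FA or c_l). Absolute values make v and -v map to the same color."""
    return weight * np.abs(np.asarray(e1, dtype=float))

# Left-right fibers (x axis) -> red; inferior-superior fibers (z axis) -> blue.
assert np.allclose(direction_to_rgb([1.0, 0.0, 0.0]), [1.0, 0.0, 0.0])
assert np.allclose(direction_to_rgb([0.0, 0.0, -1.0], weight=0.5), [0.0, 0.0, 0.5])
```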
89 Seed Points on Mid-Sagittal Plane Deterministic Tractography
90 Fiber Tracking / Tractography: In the context of diffusion MRI, streamline integration in the principal eigenvector field is called fiber tracking or tractography. Need to orient eigenvectors: flip the sign if the dot product with the previous vector is < 0. Need for additional stopping criteria: Anisotropy: stop when FA or c_l are low (better to use c_l). Curvature: stop when the dot product with the previous vector is small. White matter mask: stop when leaving the white matter (e.g., found from traditional MRI). 96
91 Fiber Tracking as Solution of an ODE: Deterministic tractography can be viewed as solving the ordinary differential equation ẋ(t) = v(x(t)). The vector field v is derived from the tensor field, e.g., the major eigenvector of the diffusion tensor; choose the sign of v so that we are tracking forwards. Basser et al. [2000]: Euler integration with stepsize s: x_{i+1} = x_i + s·v(x_i). More exact: higher-order schemes (Runge-Kutta). Stop on high curvature or low anisotropy. Use interpolation to obtain a continuous tensor field: at each step, interpolate the tensor and compute its principal eigenvector.
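A minimal Euler-integration sketch of the scheme above (the constant tensor field, step size, and step count are made up; a real implementation would interpolate tensors from a grid and add the stopping criteria):

```python
import numpy as np

def principal_eigenvector(D):
    """Unit eigenvector for the largest eigenvalue of a symmetric tensor."""
    _, evecs = np.linalg.eigh(D)   # eigh sorts eigenvalues ascending
    return evecs[:, -1]

def track(tensor_at, x0, step=0.5, n_steps=100):
    """Euler integration x_{i+1} = x_i + s * v(x_i) through the principal
    eigenvector field. tensor_at returns the (interpolated) tensor at a point."""
    xs = [np.asarray(x0, dtype=float)]
    prev = None
    for _ in range(n_steps):
        v = principal_eigenvector(tensor_at(xs[-1]))
        # Eigenvectors have no intrinsic orientation: flip the sign to keep
        # tracking forwards along the previous direction.
        if prev is not None and np.dot(v, prev) < 0.0:
            v = -v
        prev = v
        xs.append(xs[-1] + step * v)
    return np.array(xs)

# Constant tensor field whose principal direction is the x axis:
# the tracked path is a straight line along +/- x.
field = lambda x: np.diag([2.0, 1.0, 0.5])
path = track(field, [0.0, 0.0, 0.0], step=0.5, n_steps=4)
assert np.allclose(np.abs(path[-1]), [2.0, 0.0, 0.0])
```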
92 FACT Fiber Assignment by Continuous Tracking (FACT) [Mori et al. 1999] Follow principal eigenvector until voxel is left Stop when direction would change abruptly Advantage: No interpolation needed Drawback: Follows the streamline less precisely Image taken from Mori et al. [1999]
93 Streamtubes: Stream tubes [Zhang et al. 2003] also encode the second and third eigenvector: the elliptical cross-section reflects the second/third eigenvalue. Fix the maximum radius, preserve the aspect ratio. Color indicates c_l (large c_l → red). 100
94 Superquadric Streamtubes: Superquadric streamtubes [Wiens et al. 2014] use a superquadric instead of an elliptical cross-section, with a shape index σ based on the ratio of λ3 to λ2 (with sharpness exponent γ): spherical for λ2 = λ3, clear edges for λ2 ≫ λ3. 101
95 Superquadric Streamtubes: Connectivity Given square cross-sections, it s important to connect corners to corners: 102
96 Superquadric Streamtubes: Example 103
97 Warning: Inconsistent Terminology! Terminology is inconsistent in the literature. Sometimes, streamlines in eigenvector fields are called "tensor lines", even if they are not created by the tensor lines algorithm by Weinstein et al. Sometimes, streamlines in eigenvector fields are called "hyperstreamlines"; other authors use that term to denote stream tubes [Delmarcelle/Hesselink 1992]. 104
98 Fiber Bundles in the Brain 105
99 Interpreting DT-MRI Fibers: Reminder: Individual axons are much smaller than the voxel size. DT-MRI streamlines are often called "fibers", but do not correspond to individual axons; axons often run in parallel (as a "fiber bundle"), making it possible to detect them at the voxel level. Streamlines follow the trajectory of fiber bundles. 106
100 Animated Tractography 112
101 Summary: Fiber Tractography Vector Field Visualization can be applied to eigenvector fields Account for lack of orientation! Fiber tracking traces lines that are tangential to inferred fiber direction in diffusion MRI Similar to streamline integration Tensor lines can track through isotropic regions Stream tubes encode additional eigenvectors 113
102 Questions??? 118
More informationBasic Calculus Review
Basic Calculus Review Lorenzo Rosasco ISML Mod. 2 - Machine Learning Vector Spaces Functionals and Operators (Matrices) Vector Space A vector space is a set V with binary operations +: V V V and : R V
More informationDiffusion Tensor Imaging quality control : artifacts assessment and correction. A. Coste, S. Gouttard, C. Vachet, G. Gerig. Medical Imaging Seminar
Diffusion Tensor Imaging quality control : artifacts assessment and correction A. Coste, S. Gouttard, C. Vachet, G. Gerig Medical Imaging Seminar Overview Introduction DWI DTI Artifact Assessment Artifact
More informationLecture 2: Linear Algebra Review
EE 227A: Convex Optimization and Applications January 19 Lecture 2: Linear Algebra Review Lecturer: Mert Pilanci Reading assignment: Appendix C of BV. Sections 2-6 of the web textbook 1 2.1 Vectors 2.1.1
More informationUsing Eigenvalue Derivatives for Edge Detection in DT-MRI Data
Using Eigenvalue Derivatives for Edge Detection in DT-MRI Data Thomas Schultz and Hans-Peter Seidel MPI Informatik, Campus E 1.4, 66123 Saarbrücken, Germany, Email: schultz@mpi-inf.mpg.de Abstract. This
More informationJeffrey D. Ullman Stanford University
Jeffrey D. Ullman Stanford University 2 Often, our data can be represented by an m-by-n matrix. And this matrix can be closely approximated by the product of two matrices that share a small common dimension
More informationLecture 7: Positive Semidefinite Matrices
Lecture 7: Positive Semidefinite Matrices Rajat Mittal IIT Kanpur The main aim of this lecture note is to prepare your background for semidefinite programming. We have already seen some linear algebra.
More informationLinear Algebra & Geometry why is linear algebra useful in computer vision?
Linear Algebra & Geometry why is linear algebra useful in computer vision? References: -Any book on linear algebra! -[HZ] chapters 2, 4 Some of the slides in this lecture are courtesy to Prof. Octavia
More informationDuke University, Department of Electrical and Computer Engineering Optimization for Scientists and Engineers c Alex Bronstein, 2014
Duke University, Department of Electrical and Computer Engineering Optimization for Scientists and Engineers c Alex Bronstein, 2014 Linear Algebra A Brief Reminder Purpose. The purpose of this document
More informationDATA MINING LECTURE 8. Dimensionality Reduction PCA -- SVD
DATA MINING LECTURE 8 Dimensionality Reduction PCA -- SVD The curse of dimensionality Real data usually have thousands, or millions of dimensions E.g., web documents, where the dimensionality is the vocabulary
More informationDimensionality Reduction: PCA. Nicholas Ruozzi University of Texas at Dallas
Dimensionality Reduction: PCA Nicholas Ruozzi University of Texas at Dallas Eigenvalues λ is an eigenvalue of a matrix A R n n if the linear system Ax = λx has at least one non-zero solution If Ax = λx
More informationThe Singular Value Decomposition (SVD) and Principal Component Analysis (PCA)
Chapter 5 The Singular Value Decomposition (SVD) and Principal Component Analysis (PCA) 5.1 Basics of SVD 5.1.1 Review of Key Concepts We review some key definitions and results about matrices that will
More information5 Linear Algebra and Inverse Problem
5 Linear Algebra and Inverse Problem 5.1 Introduction Direct problem ( Forward problem) is to find field quantities satisfying Governing equations, Boundary conditions, Initial conditions. The direct problem
More informationLinear Algebra & Geometry why is linear algebra useful in computer vision?
Linear Algebra & Geometry why is linear algebra useful in computer vision? References: -Any book on linear algebra! -[HZ] chapters 2, 4 Some of the slides in this lecture are courtesy to Prof. Octavia
More informationTensor Field Reconstruction Based on Eigenvector and Eigenvalue Interpolation
Tensor Field Reconstruction Based on Eigenvector and Eigenvalue Interpolation Ingrid Hotz 1, Jaya Sreevalsan Nair 1, and Bernd Hamann 1 Institute for Data Analysis and Visualization, (IDAV), Department
More informationDS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra.
DS-GA 1002 Lecture notes 0 Fall 2016 Linear Algebra These notes provide a review of basic concepts in linear algebra. 1 Vector spaces You are no doubt familiar with vectors in R 2 or R 3, i.e. [ ] 1.1
More informationPrincipal Component Analysis
Principal Component Analysis Laurenz Wiskott Institute for Theoretical Biology Humboldt-University Berlin Invalidenstraße 43 D-10115 Berlin, Germany 11 March 2004 1 Intuition Problem Statement Experimental
More informationMobile Robotics 1. A Compact Course on Linear Algebra. Giorgio Grisetti
Mobile Robotics 1 A Compact Course on Linear Algebra Giorgio Grisetti SA-1 Vectors Arrays of numbers They represent a point in a n dimensional space 2 Vectors: Scalar Product Scalar-Vector Product Changes
More informationChapter 7: Symmetric Matrices and Quadratic Forms
Chapter 7: Symmetric Matrices and Quadratic Forms (Last Updated: December, 06) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved
More informationSpectral Processing. Misha Kazhdan
Spectral Processing Misha Kazhdan [Taubin, 1995] A Signal Processing Approach to Fair Surface Design [Desbrun, et al., 1999] Implicit Fairing of Arbitrary Meshes [Vallet and Levy, 2008] Spectral Geometry
More informationBackground Mathematics (2/2) 1. David Barber
Background Mathematics (2/2) 1 David Barber University College London Modified by Samson Cheung (sccheung@ieee.org) 1 These slides accompany the book Bayesian Reasoning and Machine Learning. The book and
More information7 Principal Component Analysis
7 Principal Component Analysis This topic will build a series of techniques to deal with high-dimensional data. Unlike regression problems, our goal is not to predict a value (the y-coordinate), it is
More informationLecture: Face Recognition and Feature Reduction
Lecture: Face Recognition and Feature Reduction Juan Carlos Niebles and Ranjay Krishna Stanford Vision and Learning Lab Lecture 11-1 Recap - Curse of dimensionality Assume 5000 points uniformly distributed
More informationLecture 6 Positive Definite Matrices
Linear Algebra Lecture 6 Positive Definite Matrices Prof. Chun-Hung Liu Dept. of Electrical and Computer Engineering National Chiao Tung University Spring 2017 2017/6/8 Lecture 6: Positive Definite Matrices
More informationDS-GA 1002 Lecture notes 10 November 23, Linear models
DS-GA 2 Lecture notes November 23, 2 Linear functions Linear models A linear model encodes the assumption that two quantities are linearly related. Mathematically, this is characterized using linear functions.
More informationHST.583 Functional Magnetic Resonance Imaging: Data Acquisition and Analysis Fall 2006
MIT OpenCourseWare http://ocw.mit.edu HST.583 Functional Magnetic Resonance Imaging: Data Acquisition and Analysis Fall 2006 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
More informationDiffusion Weighted MRI. Zanqi Liang & Hendrik Poernama
Diffusion Weighted MRI Zanqi Liang & Hendrik Poernama 1 Outline MRI Quick Review What is Diffusion MRI? Detecting Diffusion Stroke and Tumor Detection Presenting Diffusion Anisotropy and Diffusion Tensor
More informationInformation About Ellipses
Information About Ellipses David Eberly, Geometric Tools, Redmond WA 9805 https://www.geometrictools.com/ This work is licensed under the Creative Commons Attribution 4.0 International License. To view
More informationChapter 3 Salient Feature Inference
Chapter 3 Salient Feature Inference he building block of our computational framework for inferring salient structure is the procedure that simultaneously interpolates smooth curves, or surfaces, or region
More informationImproved Correspondence for DTI Population Studies via Unbiased Atlas Building
Improved Correspondence for DTI Population Studies via Unbiased Atlas Building Casey Goodlett 1, Brad Davis 1,2, Remi Jean 3, John Gilmore 3, and Guido Gerig 1,3 1 Department of Computer Science, University
More informationA geometric proof of the spectral theorem for real symmetric matrices
0 0 0 A geometric proof of the spectral theorem for real symmetric matrices Robert Sachs Department of Mathematical Sciences George Mason University Fairfax, Virginia 22030 rsachs@gmu.edu January 6, 2011
More informationPrincipal Component Analysis (PCA)
Principal Component Analysis (PCA) Salvador Dalí, Galatea of the Spheres CSC411/2515: Machine Learning and Data Mining, Winter 2018 Michael Guerzhoy and Lisa Zhang Some slides from Derek Hoiem and Alysha
More informationMath 302 Outcome Statements Winter 2013
Math 302 Outcome Statements Winter 2013 1 Rectangular Space Coordinates; Vectors in the Three-Dimensional Space (a) Cartesian coordinates of a point (b) sphere (c) symmetry about a point, a line, and a
More informationChapter 3 Transformations
Chapter 3 Transformations An Introduction to Optimization Spring, 2014 Wei-Ta Chu 1 Linear Transformations A function is called a linear transformation if 1. for every and 2. for every If we fix the bases
More informationAssignment #10: Diagonalization of Symmetric Matrices, Quadratic Forms, Optimization, Singular Value Decomposition. Name:
Assignment #10: Diagonalization of Symmetric Matrices, Quadratic Forms, Optimization, Singular Value Decomposition Due date: Friday, May 4, 2018 (1:35pm) Name: Section Number Assignment #10: Diagonalization
More informationUnsupervised Learning: Dimensionality Reduction
Unsupervised Learning: Dimensionality Reduction CMPSCI 689 Fall 2015 Sridhar Mahadevan Lecture 3 Outline In this lecture, we set about to solve the problem posed in the previous lecture Given a dataset,
More information1 Singular Value Decomposition and Principal Component
Singular Value Decomposition and Principal Component Analysis In these lectures we discuss the SVD and the PCA, two of the most widely used tools in machine learning. Principal Component Analysis (PCA)
More informationRician Noise Removal in Diffusion Tensor MRI
Rician Noise Removal in Diffusion Tensor MRI Saurav Basu, Thomas Fletcher, and Ross Whitaker University of Utah, School of Computing, Salt Lake City, UT 84112, USA Abstract. Rician noise introduces a bias
More informationLecture: Face Recognition and Feature Reduction
Lecture: Face Recognition and Feature Reduction Juan Carlos Niebles and Ranjay Krishna Stanford Vision and Learning Lab 1 Recap - Curse of dimensionality Assume 5000 points uniformly distributed in the
More information