Human Activity Recognition and Estimation of Calorie Expenditure - A Data Analytic Approach

1 Human Activity Recognition and Estimation of Calorie Expenditure - A Data Analytic Approach
Mushtaque Ahamed A and Dr. Snehanshu Saha
PESIT-Bangalore South Campus
December 4, 2015

2 Outline
1 Introduction
2 Experimental Setup
3 Data Collection
4 Feature Extraction
   Feature Selection and Normalization
   Dimensionality Reduction
5 Multiclass Support Vector Machines
   Formulation of Binary Support Vector Machine
   Training
   Cross Validation
6 Estimation of Calorie Expenditure
7 Results and Discussion
8 References

3 Introduction
Recognition of human activity is useful in monitoring the health status of a person. Information about the type of activities that a person has done in a day is vital for doctors who remotely monitor the health of their clients. The model helps determine the type of activity a person performs, the amount of time spent on it and the calories burnt during that activity. The information thus gathered can profile the activities of a person towards maintaining their health. The challenge is to accurately detect the type of activity performed. The following set of steps was executed to solve this problem: data collection, feature extraction, dimensionality reduction, support vector machine training, cross validation and testing in real time. Once the type of activity and the amount of time spent on it were determined, we estimated the amount of calories burnt using a standard table of Metabolic Equivalents.

4 Experimental Setup
Figure: Arduino - an open source electronics hardware platform
Figure: MPU6050 - a tri-axis accelerometer by InvenSense

5 Two test subjects were considered for the experiment. A tri-axis accelerometer, the MPU6050 [6], and an Arduino Uno microcontroller were used to collect the data. Five activities were defined for classification: sleeping, sitting, standing, walking and running. The sensor was worn by the subjects as a locket and each activity was performed for about a minute. The activities were labeled 1 to 5 in increasing order of intensity, from sleeping to running.

6 Data Collection
A 4-dimensional dataset was recorded, consisting of acceleration along the three coordinate axes and the time of sampling, accurate to milliseconds. The sensor is sensitive to accelerations from -2g to +2g, with a resolution of 16 bits. The raw 16-bit readings from the sensor were normalized to the range -2 to +2. The sensor was set to sample at a rate of 20 Hz and the data was recorded in CSV format. Ten recordings were taken for each of the five activities, each file containing 1000 records. Subsequently, 500 samples were created from the 50 files by splitting them evenly with a window size of 100, without any overlap.
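The slides do not include the preprocessing code; the following Python sketch (the file name and the ±32768-count full scale are assumptions, not values from the paper) illustrates the normalization to ±2g and the non-overlapping windowing described above.

```python
import numpy as np

G_RANGE = 2.0           # sensor configured for +/-2 g
RAW_FULL_SCALE = 32768  # assumed 16-bit signed full scale
WINDOW = 100            # 100 readings per sample (5 s at 20 Hz)

def load_recording(path):
    """Load one CSV recording with columns t_ms, ax, ay, az (raw counts)."""
    data = np.loadtxt(path, delimiter=",")
    accel = data[:, 1:4] * G_RANGE / RAW_FULL_SCALE  # normalize to [-2, +2] g
    return accel

def split_windows(accel, window=WINDOW):
    """Split a recording into non-overlapping windows of `window` readings."""
    n = (len(accel) // window) * window
    return accel[:n].reshape(-1, window, 3)

# Example: one 1000-record file yields 10 windows of shape (100, 3)
# windows = split_windows(load_recording("activity1_run01.csv"))
```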

7 Feature Extraction
A MATLAB script was written to extract the features from the samples, producing a feature dataset of 500 rows and 23 columns, with one sample per row and one feature per column. The dataset was labeled from 1 to 5 as shown in the table below. Metabolic Equivalent (MET) values [4, 5] were taken from the National Cancer Institute. The MET value of an activity is the amount of calories spent by a human body in one hour, per kg of body weight, while engaged in that activity.

Activity   Label   Metabolic Equivalent
Sleeping     1
Sitting      2
Standing     3
Walking      4
Running      5

Table: List of activities and their MET values.

8 Feature Selection and Normalization
The following 23 features [2] were extracted from the 500 samples in the data set, where each sample consisted of 100 readings of x, y and z axis data, as shown in the table below.

Feature  Description               Feature  Description
1        mean of x axis            13       kurtosis of x axis
2        mean of y axis            14       kurtosis of y axis
3        mean of z axis            15       kurtosis of z axis
4        avg. mean of 3 axes       16       avg. kurtosis of 3 axes
5        std. dev of x axis        17       energy of x axis
6        std. dev of y axis        18       energy of y axis
7        std. dev of z axis        19       energy of z axis
8        avg. std. dev of 3 axes   20       avg. energy of 3 axes
9        skewness of x axis        21       corr. between x and y axis
10       skewness of y axis        22       corr. between x and z axis
11       skewness of z axis        23       corr. between y and z axis
12       avg. skewness of 3 axes
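The original MATLAB script is not reproduced in the slides. As a rough equivalent, the Python sketch below computes the 23 features from one 100×3 window; the definition of "energy" as the mean squared signal value is an assumption, since the slides do not define it.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def extract_features(window):
    """window: (100, 3) array of normalized x, y, z accelerations.
    Returns the 23-element feature vector described in the table above."""
    mean = window.mean(axis=0)
    std = window.std(axis=0)
    skw = skew(window, axis=0)
    kur = kurtosis(window, axis=0)             # excess (Fisher) kurtosis by default
    energy = np.mean(window ** 2, axis=0)      # assumed definition of "energy"
    corr = np.corrcoef(window, rowvar=False)   # 3x3 correlation matrix of the axes
    return np.concatenate([
        mean, [mean.mean()],                   # 1-4:   per-axis means + average
        std, [std.mean()],                     # 5-8:   per-axis std. dev. + average
        skw, [skw.mean()],                     # 9-12:  per-axis skewness + average
        kur, [kur.mean()],                     # 13-16: per-axis kurtosis + average
        energy, [energy.mean()],               # 17-20: per-axis energy + average
        [corr[0, 1], corr[0, 2], corr[1, 2]],  # 21-23: pairwise axis correlations
    ])
```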

9 Dimensionality Reduction
The feature vectors had 23 dimensions and the data set contained 500 samples spread over 5 classes. Principal Component Analysis [1] was performed on the data set and it was found that the top 8 principal components captured 85% of the variance of the data. Dimensionality reduction [3] was therefore applied, reducing the data set from 23 to 8 dimensions. This new data set with 500 samples and 8 dimensions was used for training. The principal components were calculated as follows. Let X be the matrix of 500 rows and 23 columns representing the data set, with each sample in a row.

10 The covariance matrix Σ was calculated as

Σ = (X^T X) / m

where m is the number of samples. The covariance matrix Σ was decomposed using the singular value decomposition into matrices U, S and V such that

U S V^T = Σ

The first eight diagonal elements of S constituted the top eight eigenvalues of Σ, chosen so that they added up to 85% of the sum of all eigenvalues. The corresponding 8 eigenvectors, forming the first 8 columns of U, were used to reduce the dimension. The data set X was projected onto the reduced-dimension subspace as

X_reduced = X U(:, 1:8)
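A minimal NumPy sketch of this step, following the slide's formulation (covariance of the data matrix, SVD, projection onto the first 8 components). Whether the features were mean-centered before computing Σ is not stated in the slides, so the centering line is an assumption.

```python
import numpy as np

def pca_reduce(X, k=8):
    """X: (m, 23) feature matrix, one sample per row. Returns (m, k) projection."""
    Xc = X - X.mean(axis=0)            # assumed mean-centering before PCA
    m = Xc.shape[0]
    sigma = Xc.T @ Xc / m              # covariance matrix, as on the slide
    U, S, Vt = np.linalg.svd(sigma)    # sigma = U @ diag(S) @ Vt
    explained = S[:k].sum() / S.sum()  # fraction of variance kept (~0.85 for k=8)
    return Xc @ U[:, :k], explained

# X_reduced, var_kept = pca_reduce(X, k=8)
```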

11 Figure: The original 23-dimensional data set

12 Multiclass Support Vector Machine
Five binary Support Vector Machine models were trained on the labeled, dimensionally reduced training set of 350 samples (70% of the data).

13 Formulation of Binary Support Vector Machine
Consider the reduced dataset X_reduced, which has 500 samples arranged in rows, each with 8 dimensions (columns). Let x_i represent the i-th sample of X_reduced and let y_i represent its label. If the model is to classify activity a, then y_i is +1 if sample i is a positive sample and y_i is -1 if it is a negative sample. The Support Vector Machine model creates a hyperplane in the feature vector space, identified by its normal vector w and the bias b, which represents the offset of the hyperplane from the origin. Once the normal vector w and the bias b are known, a given data sample x can be classified with the following function:

Classify(x) = -1  if  w^T x - b <= -1
            = +1  if  w^T x - b >= +1
            =  0  if  -1 < w^T x - b < +1
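The decision rule translates directly into code; a small illustrative sketch (function and variable names are ours, not the paper's):

```python
import numpy as np

def classify(x, w, b):
    """Hard-margin SVM decision for one sample x, given normal vector w and bias b.
    Returns -1 or +1 outside the margin, and 0 inside it (undecided)."""
    score = np.dot(w, x) - b
    if score <= -1:
        return -1
    if score >= +1:
        return +1
    return 0
```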

14 To obtain the model parameters w and b, we formulate an optimization problem as follows. Combining the positive and the negative cases, we can write

y_i (w^T x_i - b) >= 1,   1 <= i <= m

where m is the number of samples in the training data set. The hyperplane is chosen to have maximum perpendicular distance from the data points in the feature vector space. This perpendicular distance is inversely proportional to the magnitude of w, that is ||w||. Thus the following optimization problem can be posed to solve for w and b:

arg min_{w,b}  (1/2) ||w||^2

By introducing Lagrange multipliers α, the above constrained problem can be posed as

arg min_{w,b} max_{α >= 0}  { (1/2) ||w||^2 - Σ_{i=1}^{m} α_i [ y_i (w^T x_i - b) - 1 ] }

15 This can be solved using quadratic programming techniques. The stationarity Karush-Kuhn-Tucker [4] condition implies that the solution can be expressed as a linear combination of the training vectors:

w = Σ_{i=1}^{m} α_i y_i x_i

and

b = (1 / m_SV) Σ_{i=1}^{m_SV} (w^T x_i - y_i)

where m_SV is the number of support vectors selected by the model.
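The slides do not show the quadratic programming solver. As one possible realization, the sketch below trains the binary model with scikit-learn's linear SVC (not a tool named in the paper) and recovers w and b from the dual solution, mirroring the expressions above.

```python
import numpy as np
from sklearn.svm import SVC

def train_binary_svm(X, y):
    """X: (m, 8) reduced features; y: labels in {-1, +1} for one activity.
    Fits a linear SVM and recovers w and b from the dual solution."""
    clf = SVC(kernel="linear", C=1.0).fit(X, y)
    # dual_coef_ holds alpha_i * y_i for the support vectors
    w = (clf.dual_coef_ @ clf.support_vectors_).ravel()  # w = sum_i alpha_i y_i x_i
    b = -float(clf.intercept_[0])                        # sklearn scores w.x + b0, so b = -b0
    return w, b
```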

16 Training
Each of the five models was trained to test whether a sample belongs to its class. 70% of the data set was used for training, so each model saw about 70 positive samples and about 280 negative samples. The accuracy of training was checked by making each model predict the class of a sample. The five binary classification models together form the multi-class SVM: to predict the class of a sample, the sample is tested on all five models, and its class is determined by checking which of the models classifies it as positive, as sketched below.
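A minimal one-vs-rest prediction sketch, building on the hypothetical train_binary_svm above. The slides do not say how ties or all-negative outcomes are resolved, so picking the model with the largest margin is an assumption.

```python
import numpy as np

def predict_activity(x, models):
    """x: (8,) reduced feature vector; models: list of five (w, b) pairs,
    one per activity, in label order 1..5 (sleeping..running)."""
    scores = [np.dot(w, x) - b for w, b in models]
    # Tie-breaking by the largest margin is an assumption; the slides only say
    # the class is the one whose binary model tests the sample as positive.
    return int(np.argmax(scores)) + 1   # activity labels are 1..5
```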

17 Cross Validation
Cross validation was performed to check the accuracy of the trained model. 150 samples (the remaining 30% of the data) were used for testing. A confusion matrix [1] was computed for both the training data and the test data, and the sensitivity, specificity and accuracy of each class were computed from it.
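For reference, per-class sensitivity, specificity and accuracy follow from the confusion matrix counts in the standard way; the short sketch below shows the bookkeeping (this is generic code, not taken from the paper).

```python
import numpy as np

def class_metrics(y_true, y_pred, label):
    """Per-class metrics for one activity label, treating it as the positive class."""
    pos, pred_pos = (y_true == label), (y_pred == label)
    tp = np.sum(pos & pred_pos)
    tn = np.sum(~pos & ~pred_pos)
    fp = np.sum(~pos & pred_pos)
    fn = np.sum(pos & ~pred_pos)
    sensitivity = tp / (tp + fn)                 # true positive rate
    specificity = tn / (tn + fp)                 # true negative rate
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return sensitivity, specificity, accuracy
```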

18 Table: Summary of test results for the training data set (confusion matrix, sensitivity, specificity and accuracy for each of the five activities: sleeping, sitting, standing, walking and running).

19 Table: Summary of test results for the testing data set (confusion matrix, sensitivity, specificity and accuracy for each of the five activities: sleeping, sitting, standing, walking and running).

20 Estimation of Calories
The following method is proposed to estimate the amount of calories spent on various activities. The sensor is worn by the user as a locket and streams data via Bluetooth or another wireless interface to a smartphone. The model runs on each sample as input and predicts the type of activity the person is performing, and the time-stamp of the sample is recorded. Using the standard MET table, the weight of the user and the time and activity data, the following formula gives the amount of calories burnt:

Energy_calories = MET_activity × Weight_user × Time_hours
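A direct transcription of the formula into code, using the MET convention stated earlier (kcal per kg of body weight per hour). The MET value in the usage comment is a placeholder, not a value from the paper's table.

```python
def calories_burnt(met_value, weight_kg, duration_hours):
    """Energy (kcal) = MET of the activity x body weight (kg) x time (hours)."""
    return met_value * weight_kg * duration_hours

# e.g. calories_burnt(met_value=3.5, weight_kg=70, duration_hours=0.5) estimates
# half an hour of an activity with an assumed MET of 3.5 -> 122.5 kcal
```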

21 Results and Discussion
In this paper we have discussed a model to classify human activity using a multi-class support vector machine. We collected the data, extracted the features, normalized them and reduced the dimensionality from 23 to 8. The data set was used to train five SVM classifiers, one for each class of activity. We tested the model with the cross-validation data set and evaluated how accurately it classifies an accelerometer sample into one of the five activities; per-class results are summarized in the tables above. The proposed model yields vital information about the health of a person: it estimates the calorie expenditure for each of the activities. Such a model is very useful for doctors who monitor their clients remotely; with it, doctors can remotely assess the user and track the user's health. All this is achieved with a wearable device and a simple but trustworthy computational model.

22 References
Abdi, Hervé, and Lynne J. Williams. Principal component analysis. Wiley Interdisciplinary Reviews: Computational Statistics 2.4 (2010).
Cleland I, Kikhia B, Nugent C, Boytsov A, Hallberg J, Synnes K, McClean S, Finlay D. Optimal Placement of Accelerometers for the Detection of Everyday Activities. Sensors 13.7 (2013).
Keogh, Eamonn, et al. Dimensionality reduction for fast similarity search in large time series databases. Knowledge and Information Systems 3.3 (2001).
Kuhn, H. W., and Tucker, A. W. Nonlinear programming. Proceedings of the 2nd Berkeley Symposium. Berkeley: University of California Press, 1951.
Lavie, Carl J., and Richard V. Milani. Metabolic equivalent (MET) inflation - not the MET we used to know. Journal of Cardiopulmonary Rehabilitation and Prevention 27.3 (2007).

23 References Contd.
Schuldt, Christian, Ivan Laptev, and Barbara Caputo. Recognizing human actions: a local SVM approach. Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), Vol. 3. IEEE, 2004.
Soliman, Samir S., and S.-Z. Hsue. Signal classification using statistical moments. IEEE Transactions on Communications 40.5 (1992).
Suykens, Johan A. K., and Joos Vandewalle. Least squares support vector machine classifiers. Neural Processing Letters 9.3 (1999).
http://appliedresearch.cancer.gov/atus_met/met.php
http://playground.arduino.cc/main/mpu6050
MPU-6000A-00v3.4.pdf (InvenSense MPU-6050 product specification)

24 Acknowledgement
Thanks to ICCC VVIT, Bangalore. And thank you, everyone.
