A review of "Quantum Algorithms for Nearest-Neighbor Methods for Supervised and Unsupervised Learning" (Nathan Wiebe, Ashish Kapoor, Krysta M. Svore, Jan. 2014)
Kaviti Sai Saurab, IIT Kanpur
October 23, 2014

Overview
1 Background
2 Layout of the paper
3 What are the ideas here? Nearest-neighbours
4 Conclusion

Background
Quantum speedups have long been known for problems such as factoring, quantum simulation, and optimization. Only recently have quantum algorithms begun to emerge that promise quantum advantages for problems in data processing, machine learning, and classification. Practical algorithms for these problems could provide applications outside the physical sciences where quantum computation may be of great importance. This work provides an important step toward understanding the value of quantum computing for machine learning by rigorously showing that quantum speedups can be achieved for nearest-neighbor classification.

Layout of the paper
The paper is laid out as follows:
- Review of nearest-neighbor classification
- Discussion of existing approaches to quantum classification, together with an outline of the proposed quantum algorithms and the oracles they use
- Main results for the costs of performing nearest-neighbor classification using the inner product method and the Euclidean method
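To make the two distance measures concrete, here is a minimal classical sketch of the quantities the quantum subroutines are assumed to estimate: an overlap-based (inner product) distance and the ordinary Euclidean distance between feature vectors. The function names and the exact form of the inner-product distance are illustrative assumptions, not definitions taken from the paper.

```python
import numpy as np

def inner_product_distance(u, v):
    # Overlap-based distance between unit-normalized vectors (assumed form):
    # a smaller overlap |<u, v>| is treated as a larger distance.
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    return 1.0 - abs(np.dot(u, v))

def euclidean_distance(u, v):
    # Ordinary Euclidean distance |u - v|.
    return np.linalg.norm(np.asarray(u) - np.asarray(v))

# Example: compare a test vector against two training vectors.
u = np.array([1.0, 0.0])
v1 = np.array([0.9, 0.1])
v2 = np.array([0.1, 0.9])
print(inner_product_distance(u, v1), inner_product_distance(u, v2))
print(euclidean_distance(u, v1), euclidean_distance(u, v2))
```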

New ideas in N-N implementation
A major challenge facing the development of practical quantum machine learning algorithms is the need for an oracle to return, for example, the distances between elements of the test and training sets. Lloyd, Mohseni, and Rebentrost recently proposed a quantum algorithm that addresses this problem. Their algorithm computes a representative vector for each set, referred to as a centroid, by averaging the vectors in the sets A and B, respectively.
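As a concrete reference point, here is a minimal classical sketch of nearest-centroid classification as described above: average each labeled set to obtain a centroid, then label a test vector by whichever centroid is closer. This is an illustrative implementation, not the quantum algorithm of Lloyd, Mohseni, and Rebentrost; the function name and interface are assumptions for the example.

```python
import numpy as np

def nearest_centroid_classify(A, B, x):
    # Classify test vector x by whichever class centroid is closer.
    # A, B: arrays of shape (n_samples, n_features), the two labeled sets.
    centroid_A = np.mean(A, axis=0)   # representative vector of set A
    centroid_B = np.mean(B, axis=0)   # representative vector of set B
    dA = np.linalg.norm(x - centroid_A)
    dB = np.linalg.norm(x - centroid_B)
    return 'A' if dA <= dB else 'B'

# Toy example: two well-separated clusters.
A = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.2]])
B = np.array([[1.0, 1.0], [0.9, 1.1], [1.1, 0.9]])
print(nearest_centroid_classify(A, B, np.array([0.15, 0.05])))  # -> 'A'
```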

New ideas in N-N implementation
This algorithm, which can be referred to as nearest-centroid classification, is a form of nearest-neighbor classification in which the nearest centroid, rather than the nearest training point, determines the label. If the number of clusters equals the number of points, it reduces to nearest-neighbor classification. In practice, nearest-centroid classification can perform poorly because A and B are often embedded in a complicated manifold, and the mean values of the sets need not lie on that manifold. In contrast, nearest-neighbor classification tends to work well in practice and often outperforms centroid-based classification, but it can be prohibitively expensive on classical computers.
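For contrast with the centroid approach, a minimal classical sketch of (1-)nearest-neighbor classification over the same two labeled sets is shown below; its cost grows with the number of training points, which is exactly the expense the quantum algorithms aim to reduce. The function name and interface are illustrative assumptions.

```python
import numpy as np

def nearest_neighbor_classify(A, B, x):
    # Classify test vector x by the label of the closest training point.
    # Unlike nearest-centroid classification, every training vector is
    # compared against x, so the cost scales with the training-set size.
    dA = min(np.linalg.norm(x - a) for a in A)  # closest point in set A
    dB = min(np.linalg.norm(x - b) for b in B)  # closest point in set B
    return 'A' if dA <= dB else 'B'

# The same toy clusters as before.
A = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.2]])
B = np.array([[1.0, 1.0], [0.9, 1.1], [1.1, 0.9]])
print(nearest_neighbor_classify(A, B, np.array([0.15, 0.05])))  # -> 'A'
```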

Conclusion: Final Review
Quite a bit of excellent new work has been done by the authors, but there is great scope for future work. Possibilities include examining the width/depth tradeoffs that arise when executing these algorithms on quantum computers, and studying how well the algorithms perform when the oracle represents a database. It would also be interesting to see whether these methods can be directly adapted to practical cases of characterizing data yielded by an efficient quantum circuit.

Conclusions
It remains an open question whether exponential speedups can be obtained by a quantum algorithm for supervised, unsupervised, or semi-supervised machine learning tasks. Beyond computational speedups, a final interesting question is whether quantum computation allows new classes of learning algorithms that have no natural classical analogs. The search for inherently quantum machine learning algorithms may not only reveal potentially useful methods but also shed light on deeper questions of what it means to learn and whether quantum physics places inherent limitations on a system's ability to learn from data.