Problem sets. Late policy (5% off per day, but the weekend counts as only one day). E.g., Friday: -5% Monday: -15% Tuesday: -20% Thursday: -30%


1 Problem sets. Late policy (5% off per day, but the weekend counts as only one day). E.g., Friday: -5%, Monday: -15%, Tuesday: -20%, Thursday: -30%.

2 Outline: Final thoughts on hierarchical Bayesian models and MCMC; Bayesian classification; Bayesian concept learning.

3 MCMC methods. Gibbs sampling: Factorize hypotheses h = <h_1, h_2, ..., h_n>. Cycle through the variables h_1, h_2, ..., h_n, drawing h_i^(t+1) from P(h_i | h_-i, evidence). Metropolis-Hastings: Propose changes to the hypothesis from some distribution Q(h^(t+1) | h^(t)); accept each proposal with probability A(h^(t+1) | h^(t)) = min{1, [P(h^(t+1) | evidence) Q(h^(t) | h^(t+1))] / [P(h^(t) | evidence) Q(h^(t+1) | h^(t))]}.
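The Metropolis-Hastings rule on this slide can be sketched in a few lines. This is a minimal illustration, not code from the course: the toy target (a standard normal posterior over one parameter) and the symmetric Gaussian proposal are assumptions, chosen so that the Q terms cancel in the acceptance ratio.

```python
import math
import random

def metropolis_hastings(log_p, propose, h0, n_steps, seed=0):
    """Sample from p(h | evidence), known only up to a constant via log_p.
    propose must be symmetric, so the Q terms cancel in the acceptance ratio."""
    rng = random.Random(seed)
    h = h0
    samples = []
    for _ in range(n_steps):
        h_new = propose(h, rng)
        # A = min{1, p(h_new)/p(h)} for symmetric Q; compare in log space
        if math.log(rng.random() + 1e-300) < log_p(h_new) - log_p(h):
            h = h_new
        samples.append(h)
    return samples

# Toy target: a standard normal "posterior" over a single parameter
samples = metropolis_hastings(
    log_p=lambda h: -0.5 * h * h,
    propose=lambda h, rng: h + rng.gauss(0.0, 1.0),
    h0=5.0, n_steps=20000)
burned = samples[5000:]                      # discard burn-in
mean = sum(burned) / len(burned)
var = sum((x - mean) ** 2 for x in burned) / len(burned)
```

Even starting far from the mode (h0 = 5), the chain drifts to the target and the retained samples recover its mean and variance, illustrating the "small, random, local steps that eventually work" character noted on the next slides.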

4 Why MCMC is important: Simple. Can be used with just about any kind of probabilistic model, including complex hierarchical structures. Always works pretty well, if you're willing to wait a long time (cf. back-propagation for neural networks).

5 A model for cognitive development? Some features of cognitive development: Small, random, dumb, local steps Takes a long time Can get stuck in plateaus or stages Two steps forward, one step back Over time, intuitive theories get consistently better (more veridical, more powerful, broader scope). Everyone reaches basically the same state (though some take longer than others).

6 Topic models of semantic structure: e.g., Latent Dirichlet Allocation (Blei, Ng, Jordan). Each document in a corpus is associated with a distribution θ over topics. Each topic t is associated with a distribution φ(t) over words. Image removed due to copyright considerations. Please see: Blei, David, Andrew Ng, and Michael Jordan. "Latent Dirichlet Allocation." Journal of Machine Learning Research 3 (Jan 2003): 993-1022.

7 Choose mixture weights for each document, then generate a bag of words. θ = {P(z = 1), P(z = 2)}: {0, 1}, {0.25, 0.75}, {0.5, 0.5}, {0.75, 0.25}, {1, 0}. MATHEMATICS KNOWLEDGE RESEARCH WORK MATHEMATICS RESEARCH WORK SCIENTIFIC MATHEMATICS WORK SCIENTIFIC KNOWLEDGE MATHEMATICS SCIENTIFIC HEART LOVE TEARS KNOWLEDGE HEART MATHEMATICS HEART RESEARCH LOVE MATHEMATICS WORK TEARS SOUL KNOWLEDGE HEART WORK JOY SOUL TEARS MATHEMATICS TEARS LOVE LOVE LOVE SOUL TEARS LOVE JOY SOUL LOVE TEARS SOUL SOUL TEARS JOY
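The generative process on this slide (pick topic weights θ for a document, then draw each word by first drawing a topic z from θ and then a word from that topic's distribution φ(z)) might be sketched as follows. The two word lists are hypothetical stand-ins for the science-like and love-like topics shown; uniform φ(t) over each list is an assumption for simplicity.

```python
import random

rng = random.Random(0)

# Two hypothetical topic distributions phi(t), uniform over their word lists
topics = {
    1: ["MATHEMATICS", "KNOWLEDGE", "RESEARCH", "WORK", "SCIENTIFIC"],
    2: ["HEART", "LOVE", "TEARS", "SOUL", "JOY"],
}

def generate_document(theta1, n_words):
    """theta1 = P(z = 1). For each token: draw a topic z, then a word from phi(z)."""
    return [rng.choice(topics[1 if rng.random() < theta1 else 2])
            for _ in range(n_words)]

science_heavy = generate_document(theta1=0.75, n_words=20)
love_heavy = generate_document(theta1=0.25, n_words=20)
```

Documents generated with different θ values reproduce the spectrum shown on the slide, from pure science vocabulary to pure love vocabulary.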


9 Gibbs sampling for the topic model: a table lists each word token i with its word w_i, its document d_i, and its current topic assignment z_i; the assignments are updated over iterations. (Table removed in transcription.)

10 Gibbs sampling, one iteration: to resample token i, remove its current topic assignment and draw a new z_i conditioned on all the other assignments z_-i and the words w. (Table removed in transcription.)
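A minimal sketch of the collapsed Gibbs sampler these slides describe: each token's topic z_i is resampled from a distribution proportional to (how often word w_i is assigned to topic t) times (how often topic t appears in document d_i), with Dirichlet smoothing α and β. The tiny corpus and the hyperparameter values are illustrative assumptions, not from the course.

```python
import random

def lda_gibbs(docs, n_topics, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for LDA. Each token's topic is resampled from
    P(z_i = t | z_-i, w) proportional to
    (nwt[w_i][t] + beta) / (nt[t] + W*beta) * (ndt[d_i][t] + alpha)."""
    rng = random.Random(seed)
    vocab = sorted({w for doc in docs for w in doc})
    W = len(vocab)
    wid = {w: i for i, w in enumerate(vocab)}
    toks = [(d, wid[w]) for d, doc in enumerate(docs) for w in doc]
    nwt = [[0] * n_topics for _ in range(W)]     # word-topic counts
    nt = [0] * n_topics                          # tokens per topic
    ndt = [[0] * n_topics for _ in docs]         # doc-topic counts
    z = []
    for d, w in toks:                            # random initialization
        t = rng.randrange(n_topics)
        z.append(t)
        nwt[w][t] += 1; nt[t] += 1; ndt[d][t] += 1
    for _ in range(n_iter):
        for i, (d, w) in enumerate(toks):
            t = z[i]                             # remove current assignment
            nwt[w][t] -= 1; nt[t] -= 1; ndt[d][t] -= 1
            probs = [(nwt[w][k] + beta) / (nt[k] + W * beta) * (ndt[d][k] + alpha)
                     for k in range(n_topics)]
            r = rng.random() * sum(probs)        # draw the new topic
            acc = 0.0
            for k, p in enumerate(probs):
                acc += p
                if r <= acc:
                    t = k
                    break
            z[i] = t
            nwt[w][t] += 1; nt[t] += 1; ndt[d][t] += 1
    return z, vocab, nwt

# Tiny illustrative corpus: two "pure" documents and one mixed one
docs = [["MATHEMATICS", "RESEARCH", "WORK"] * 5,
        ["LOVE", "TEARS", "SOUL"] * 5,
        ["MATHEMATICS", "LOVE", "RESEARCH", "TEARS", "WORK", "SOUL"] * 3]
z, vocab, nwt = lda_gibbs(docs, n_topics=2)
```

On a corpus like this the sampler typically separates the science-like and love-like words into the two topics after a few hundred sweeps.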


18 A selection of topics (TASA):
FIELD MAGNETIC MAGNET WIRE NEEDLE CURRENT COIL POLES IRON COMPASS LINES CORE ELECTRIC DIRECTION FORCE MAGNETS BE MAGNETISM POLE INDUCED
SCIENCE STUDY SCIENTISTS SCIENTIFIC KNOWLEDGE WORK RESEARCH CHEMISTRY TECHNOLOGY MANY MATHEMATICS BIOLOGY FIELD PHYSICS LABORATORY STUDIES WORLD SCIENTIST STUDYING SCIENCES
BALL GAME TEAM FOOTBALL BASEBALL PLAYERS PLAY FIELD PLAYER BASKETBALL COACH PLAYED PLAYING HIT TENNIS TEAMS GAMES SPORTS BAT TERRY
JOB WORK JOBS CAREER EXPERIENCE EMPLOYMENT OPPORTUNITIES WORKING TRAINING SKILLS CAREERS POSITIONS FIND POSITION FIELD OCCUPATIONS REQUIRE OPPORTUNITY EARN ABLE
STORY STORIES TELL CHARACTER CHARACTERS AUTHOR READ TOLD SETTING TALES PLOT TELLING SHORT FICTION ACTION TRUE EVENTS TELLS TALE NOVEL
MIND WORLD DREAM DREAMS THOUGHT IMAGINATION MOMENT THOUGHTS OWN REAL LIFE IMAGINE SENSE CONSCIOUSNESS STRANGE FEELING WHOLE BEING MIGHT HOPE
WATER FISH SEA SWIM SWIMMING POOL LIKE SHELL SHARK TANK SHELLS SHARKS DIVING DOLPHINS SWAM LONG SEAL DIVE DOLPHIN UNDERWATER
DISEASE BACTERIA DISEASES GERMS FEVER CAUSE CAUSED SPREAD VIRUSES INFECTION VIRUS MICROORGANISMS PERSON INFECTIOUS COMMON CAUSING SMALLPOX BODY INFECTIONS CERTAIN


20 The shape of a female mating preference is the relationship between a male trait and the probability of acceptance as a mating partner. The shape of preferences is important in many models of sexual selection, mate recognition, communication, and speciation, yet it has rarely been measured precisely. Here I examine preference shape for male calling song in a bushcricket (katydid). Preferences change dramatically between races of a species, from strongly directional to broadly stabilizing (but with a net directional effect). Preference shape generally matches the distribution of the male trait. This is compatible with a coevolutionary model of signal-preference evolution, although it does not rule out an alternative model, sensory exploitation. Preference shapes are shown to be genetic in origin. (On slides 20-22, each word of this abstract is superscripted with its inferred topic assignment, and graylevel indicates membership in the highlighted topics; successive slides highlight additional topics.) Ritchie, Michael G. "The Shape of Female Mating Preferences." PNAS 93 (1996). Copyright 1996. Courtesy of the National Academy of Sciences, U.S.A. Used with permission.

23 Joint models of syntax and semantics (Griffiths, Steyvers, Blei & Tenenbaum, NIPS 2004). Embed the topics model inside an nth-order Hidden Markov Model: Image removed due to copyright considerations. Please see: Griffiths, T. L., M. Steyvers, D. M. Blei, and J. B. Tenenbaum. "Integrating Topics and Syntax." Advances in Neural Information Processing Systems 17 (2005).

24 Image removed due to copyright considerations. Please see: Griffiths, T. L., M. Steyvers, D. M. Blei, and J. B. Tenenbaum. "Integrating Topics and Syntax." Advances in Neural Information Processing Systems 17 (2005). Semantic classes:
FOOD FOODS BODY NUTRIENTS DIET FAT SUGAR ENERGY MILK EATING FRUITS VEGETABLES WEIGHT FATS NEEDS CARBOHYDRATES VITAMINS CALORIES PROTEIN MINERALS
MAP NORTH EARTH SOUTH POLE MAPS EQUATOR WEST LINES EAST AUSTRALIA GLOBE POLES HEMISPHERE LATITUDE PLACES LAND WORLD COMPASS CONTINENTS
DOCTOR PATIENT HEALTH HOSPITAL MEDICAL CARE PATIENTS NURSE DOCTORS MEDICINE NURSING TREATMENT NURSES PHYSICIAN HOSPITALS DR SICK ASSISTANT EMERGENCY PRACTICE
BOOK BOOKS READING INFORMATION LIBRARY REPORT PAGE TITLE SUBJECT PAGES GUIDE WORDS MATERIAL ARTICLE ARTICLES WORD FACTS AUTHOR REFERENCE NOTE
GOLD IRON SILVER COPPER METAL METALS STEEL CLAY LEAD ADAM ORE ALUMINUM MINERAL MINE STONE MINERALS POT MINING MINERS TIN
BEHAVIOR SELF INDIVIDUAL PERSONALITY RESPONSE SOCIAL EMOTIONAL LEARNING FEELINGS PSYCHOLOGISTS INDIVIDUALS PSYCHOLOGICAL EXPERIENCES ENVIRONMENT HUMAN RESPONSES BEHAVIORS ATTITUDES PSYCHOLOGY PERSON
CELLS CELL ORGANISMS ALGAE BACTERIA MICROSCOPE MEMBRANE ORGANISM FOOD LIVING FUNGI MOLD MATERIALS NUCLEUS CELLED STRUCTURES MATERIAL STRUCTURE GREEN MOLDS
PLANTS PLANT LEAVES SEEDS SOIL ROOTS FLOWERS WATER FOOD GREEN SEED STEMS FLOWER STEM LEAF ANIMALS ROOT POLLEN GROWING GROW

25 Image removed due to copyright considerations. Please see: Griffiths, T. L., M. Steyvers, D. M. Blei, and J. B. Tenenbaum. "Integrating Topics and Syntax." Advances in Neural Information Processing Systems 17 (2005). Syntactic classes:
SAID ASKED THOUGHT TOLD SAYS MEANS CALLED CRIED SHOWS ANSWERED TELLS REPLIED SHOUTED EXPLAINED LAUGHED MEANT WROTE SHOWED BELIEVED WHISPERED
THE HIS THEIR YOUR HER ITS MY OUR THIS THESE A AN THAT NEW THOSE EACH MR ANY MRS ALL
MORE SUCH LESS MUCH KNOWN JUST BETTER RATHER GREATER HIGHER LARGER LONGER FASTER EXACTLY SMALLER SOMETHING BIGGER FEWER LOWER ALMOST
ON AT INTO FROM WITH THROUGH OVER AROUND AGAINST ACROSS UPON TOWARD UNDER ALONG NEAR BEHIND OFF ABOVE DOWN BEFORE
GOOD SMALL NEW IMPORTANT GREAT LITTLE LARGE * BIG LONG HIGH DIFFERENT SPECIAL OLD STRONG YOUNG COMMON WHITE SINGLE CERTAIN
ONE SOME MANY TWO EACH ALL MOST ANY THREE THIS EVERY SEVERAL FOUR FIVE BOTH TEN SIX MUCH TWENTY EIGHT
HE YOU THEY I SHE WE IT PEOPLE EVERYONE OTHERS SCIENTISTS SOMEONE WHO NOBODY ONE SOMETHING ANYONE EVERYBODY SOME THEN
BE MAKE GET HAVE GO TAKE DO FIND USE SEE HELP KEEP GIVE LOOK COME WORK MOVE LIVE EAT BECOME

26 Corpus-specific factorization (NIPS). Image removed due to copyright considerations. Please see: Griffiths, T. L., M. Steyvers, D. M. Blei, and J. B. Tenenbaum. "Integrating Topics and Syntax." Advances in Neural Information Processing Systems 17 (2005).

27 Syntactic classes in PNAS:
5 IN FOR ON BETWEEN DURING AMONG FROM UNDER WITHIN THROUGHOUT THROUGH TOWARD INTO AT INVOLVING AFTER ACROSS AGAINST WHEN ALONG
8 ARE WERE WAS IS WHEN REMAIN REMAINS REMAINED PREVIOUSLY BECOME BECAME BEING BUT GIVE MERE APPEARED APPEAR ALLOWED NORMALLY EACH
4 THE THIS ITS THEIR AN EACH ONE ANY INCREASED EXOGENOUS OUR RECOMBINANT ENDOGENOUS TOTAL PURIFIED TILE FULL CHRONIC ANOTHER EXCESS
5 SUGGEST INDICATE SUGGESTING SUGGESTS SHOWED REVEALED SHOW DEMONSTRATE INDICATING PROVIDE SUPPORT INDICATES PROVIDES INDICATED DEMONSTRATED SHOWS SO REVEAL DEMONSTRATES SUGGESTED
6 LEVELS NUMBER LEVEL RATE TIME CONCENTRATIONS VARIETY RANGE CONCENTRATION DOSE FAMILY SET FREQUENCY SERIES AMOUNTS RATES CLASS VALUES AMOUNT SITES
30 RESULTS ANALYSIS DATA STUDIES STUDY FINDINGS EXPERIMENTS OBSERVATIONS HYPOTHESIS ANALYSES ASSAYS POSSIBILITY MICROSCOPY PAPER WORK EVIDENCE FINDING MUTAGENESIS OBSERVATION MEASUREMENTS REMAINED
33 BEEN MAY CAN COULD WELL DID DOES DO MIGHT SHOULD WILL WOULD MUST CANNOT THEY ALSO BECOME MAG LIKELY

28 Semantic highlighting Darker words are more likely to have been generated from the topic-based semantics module:

29 Outline: Final thoughts on hierarchical Bayesian models and MCMC; Bayesian classification; Bayesian concept learning.

30 Concepts and categories A category is a set of objects that are treated equivalently for some purpose. A concept is a mental representation of the category. Functions for concepts: Categorization/classification Prediction Inductive generalization Explanation Reference in communication and thought

31 Classical view of concepts (1950s-1960s): Concepts are rules or symbolic representations for classifying. Examples. Psychology: Bruner et al. "Striped and Three Borders": Conjunctive Concept. Figure by MIT OCW.

32 Classical view of concepts (1950s-1960s): Concepts are rules or symbolic representations. Examples. AI: Winston's arch learner. Image removed due to copyright considerations. Please see: Winston, P. H., ed. The Psychology of Computer Vision. New York, NY: McGraw-Hill, 1975. ISBN:

33 Statistical view of concepts (1960s-1970s). Examples. Machine learning/statistics: Iris classification. Images removed due to copyright considerations.

34 Standard version (1960s-1970s): Concepts are statistical representations for classifying. Examples. Psychology: Posner and Keele. Image removed due to copyright considerations. Please see: Posner, M. I., and S. W. Keele. "On the Genesis of Abstract Ideas." Journal of Experimental Psychology 77 (1968).

35 Different levels of random distortion: Images removed due to copyright considerations.

36 Statistical pattern recognition. Two-class classification problem: Images removed due to copyright considerations. The task: Given an object generated from class 1 or class 2, infer the generating class.

37 Formalizing two-class classification: Images removed due to copyright considerations. The task: Observe x generated from c_1 or c_2, and compute p(c_1 | x) = p(x | c_1) p(c_1) / [p(x | c_1) p(c_1) + p(x | c_2) p(c_2)]. Different approaches vary in how they represent p(x | c_j).
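The two-class posterior on this slide can be computed directly once p(x | c_j) is specified. Here is a sketch for one-dimensional Gaussian class-conditionals; the means, variances, and priors are made-up numbers for illustration.

```python
import math

def gauss(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def posterior_c1(x, mu, sigma, prior):
    """p(c1 | x) = p(x|c1)p(c1) / [p(x|c1)p(c1) + p(x|c2)p(c2)]."""
    num = gauss(x, mu[0], sigma[0]) * prior[0]
    return num / (num + gauss(x, mu[1], sigma[1]) * prior[1])

# With equal variances and equal priors, the posterior is exactly 0.5
# halfway between the class means, and approaches 1 near a class's mean.
p_mid = posterior_c1(0.0, mu=(-1.0, 1.0), sigma=(1.0, 1.0), prior=(0.5, 0.5))
p_left = posterior_c1(-2.0, mu=(-1.0, 1.0), sigma=(1.0, 1.0), prior=(0.5, 0.5))
```
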

38 Parametric approach Assume a simple canonical form for p(x c j ). E.g., Gaussian distributions: Images removed due to copyright considerations.

39 Parametric approach Assume a simple canonical form for p(x c j ). The simplest Gaussians have all dimensions independent, variances equal for all classes: Classification based on distance to means. Covariance ellipse determines the distance metric.

40 Parametric approach. Assume a simple canonical form for p(x | c_j). The simplest Gaussians have all dimensions independent, variances equal for all classes. Bayes net representation (naive Bayes): C -> x_1, x_2, with p(x | c_j) = p(x_1 | c_j) p(x_2 | c_j) = prod_i p(x_i | c_j), and p(x_i | c_j) proportional to exp[-(x_i - mu_ij)^2 / (2 sigma_i^2)].

41 Parametric approach. Other possible forms: All dimensions independent, with variances equal across dimensions and classes (naive Bayes): C -> x_1, x_2, with p(x | c_j) = prod_i p(x_i | c_j) and p(x_i | c_j) proportional to exp[-(x_i - mu_ij)^2 / (2 sigma^2)].

42 Parametric approach. Other possible forms: All dimensions independent with equal variances across dimensions, but variances differ across classes (naive Bayes): p(x | c_j) = prod_i p(x_i | c_j), with p(x_i | c_j) proportional to exp[-(x_i - mu_ij)^2 / (2 sigma_j^2)].

43 Parametric approach. Other possible forms: All dimensions independent, variances differ across dimensions and across classes (naive Bayes): p(x | c_j) = prod_i p(x_i | c_j), with p(x_i | c_j) proportional to exp[-(x_i - mu_ij)^2 / (2 sigma_ij^2)].

44 Parametric approach. Other possible forms: Arbitrary covariance matrices for each class: C -> x = (x_1, x_2). (Formula given on board.)

45 Parametric approach. Assume a simple canonical form for p(x | c_j). The simplest Gaussians have all dimensions independent, variances equal for all classes. Bayes net representation (naive Bayes): C -> x_1, x_2, with p(x | c_j) = p(x_1 | c_j) p(x_2 | c_j) = prod_i p(x_i | c_j), and p(x_i | c_j) proportional to exp[-(x_i - mu_ij)^2 / (2 sigma_i^2)].

46 Learning. Hypothesis space of possible Gaussians: Images removed due to copyright considerations. Find the parameters that maximize the likelihood of the examples: mu_j = mean of the examples of class j; sigma_ij = standard deviation along dimension i, for the examples of class j.
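The maximum-likelihood fitting step might look like this sketch: estimate each class's per-dimension mean and standard deviation from its examples, then classify a new point by the class with the highest log likelihood under the naive-Bayes Gaussian model. The helper names and toy data are assumptions, not from the course.

```python
import math

def fit_gaussian_nb(X, y):
    """ML estimates per class: mean and standard deviation along each dimension,
    computed over that class's examples."""
    params = {}
    for c in sorted(set(y)):
        pts = [x for x, yi in zip(X, y) if yi == c]
        dims = len(pts[0])
        mu = [sum(p[i] for p in pts) / len(pts) for i in range(dims)]
        # `or 1e-9` guards against a zero variance along some dimension
        sd = [math.sqrt(sum((p[i] - mu[i]) ** 2 for p in pts) / len(pts)) or 1e-9
              for i in range(dims)]
        params[c] = (mu, sd)
    return params

def classify(x, params):
    """Pick the class maximizing the (log) naive-Bayes likelihood."""
    def log_lik(mu, sd):
        return sum(-math.log(s) - (xi - m) ** 2 / (2 * s * s)
                   for xi, m, s in zip(x, mu, sd))
    return max(params, key=lambda c: log_lik(*params[c]))

# Toy data: two well-separated 2-D classes
X = [(0.0, 0.0), (0.2, 0.1), (-0.1, 0.2), (3.0, 3.0), (3.1, 2.9), (2.9, 3.2)]
y = [0, 0, 0, 1, 1, 1]
params = fit_gaussian_nb(X, y)
```

New points are then assigned to whichever fitted Gaussian explains them best, e.g. `classify((2.8, 3.0), params)`.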

47 Relevance to human concept learning. Natural categories often have Gaussian (or other simple parametric) forms in perceptual feature spaces. Prototype effects in categorization (Rosch). Posner & Keele's studies of prototype abstraction in concept learning.

48 Posner and Keele: design. Image removed due to copyright considerations. Please see: Posner, M. I., and S. W. Keele. "On the Genesis of Abstract Ideas." Journal of Experimental Psychology 77 (1968).

49 Posner and Keele: results. Image removed due to copyright considerations. Please see: Posner, M. I., and S. W. Keele. "On the Genesis of Abstract Ideas." Journal of Experimental Psychology 77 (1968). The unseen prototype ("Schema") was classified as well as the memorized variants, and much better than new random variants.

50 Parametric approach. Other possible forms: All dimensions independent with variances equal across dimensions and classes (naive Bayes): C -> x_1, x_2, with p(x | c_j) = prod_i p(x_i | c_j) and p(x_i | c_j) proportional to exp[-(x_i - mu_ij)^2 / (2 sigma^2)]. Equivalent to a prototype model. Prototype of class j: mu_j = (mu_1j, mu_2j). Variability of categories: sigma.
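The prototype-model equivalence noted on this slide: with equal priors and a single shared sigma, the class with the highest naive-Bayes posterior is simply the class whose prototype (mean vector) is nearest in Euclidean distance. A toy sketch, with invented prototype values:

```python
def nearest_prototype(x, prototypes):
    """Classify x by the closest class prototype (mean vector).
    With equal priors and one shared sigma, this matches the
    equal-variance naive-Bayes decision rule."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(prototypes, key=lambda c: sq_dist(x, prototypes[c]))

prototypes = {"A": (0.0, 0.0), "B": (3.0, 3.0)}
label_a = nearest_prototype((0.5, 1.0), prototypes)
label_b = nearest_prototype((2.5, 2.0), prototypes)
```

This is why the equal-variance Gaussian model predicts Posner-and-Keele-style prototype effects: the category representation reduces to a prototype plus a single variability parameter.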

51 Limitations Of this empirical paradigm? Of this computational approach?

52 Limitations Is categorization just discrimination among mutually exclusive classes? Overlapping concepts? Hierarchies? None of the above? Can we learn a single new concept? How do we learn concepts from just a few positive examples? Learning with high certainty from little data. Schema abstraction from one imperfect example. Are most categories Gaussian, or any simple parametric shape? What about superordinate categories? What about learning rule-based categories?

53 Limitations Is prototypicality = degree of membership? Armstrong et al.: No, for classical rule-based categories Not for complex real-world categories either: Christmas eve, Hollywood actress, Californian, Professor For natural kinds, huge variability in prototypicality independent of membership. Richer concepts? Meaningful stimuli, background knowledge, theories? Role of causal reasoning? Essentialism? Difference between perceptual and cognitive concepts?


CS242: Probabilistic Graphical Models Lecture 4B: Learning Tree-Structured and Directed Graphs CS242: Probabilistic Graphical Models Lecture 4B: Learning Tree-Structured and Directed Graphs Professor Erik Sudderth Brown University Computer Science October 6, 2016 Some figures and materials courtesy

More information

The exam is closed book, closed calculator, and closed notes except your one-page crib sheet.

The exam is closed book, closed calculator, and closed notes except your one-page crib sheet. CS 188 Fall 2018 Introduction to Artificial Intelligence Practice Final You have approximately 2 hours 50 minutes. The exam is closed book, closed calculator, and closed notes except your one-page crib

More information

Generative classifiers: The Gaussian classifier. Ata Kaban School of Computer Science University of Birmingham

Generative classifiers: The Gaussian classifier. Ata Kaban School of Computer Science University of Birmingham Generative classifiers: The Gaussian classifier Ata Kaban School of Computer Science University of Birmingham Outline We have already seen how Bayes rule can be turned into a classifier In all our examples

More information

1 Probabilities. 1.1 Basics 1 PROBABILITIES

1 Probabilities. 1.1 Basics 1 PROBABILITIES 1 PROBABILITIES 1 Probabilities Probability is a tricky word usually meaning the likelyhood of something occuring or how frequent something is. Obviously, if something happens frequently, then its probability

More information

Hidden Markov Models

Hidden Markov Models 10-601 Introduction to Machine Learning Machine Learning Department School of Computer Science Carnegie Mellon University Hidden Markov Models Matt Gormley Lecture 19 Nov. 5, 2018 1 Reminders Homework

More information

CS145: INTRODUCTION TO DATA MINING

CS145: INTRODUCTION TO DATA MINING CS145: INTRODUCTION TO DATA MINING Text Data: Topic Model Instructor: Yizhou Sun yzsun@cs.ucla.edu December 4, 2017 Methods to be Learnt Vector Data Set Data Sequence Data Text Data Classification Clustering

More information

Unsupervised Learning with Permuted Data

Unsupervised Learning with Permuted Data Unsupervised Learning with Permuted Data Sergey Kirshner skirshne@ics.uci.edu Sridevi Parise sparise@ics.uci.edu Padhraic Smyth smyth@ics.uci.edu School of Information and Computer Science, University

More information

11 CHI-SQUARED Introduction. Objectives. How random are your numbers? After studying this chapter you should

11 CHI-SQUARED Introduction. Objectives. How random are your numbers? After studying this chapter you should 11 CHI-SQUARED Chapter 11 Chi-squared Objectives After studying this chapter you should be able to use the χ 2 distribution to test if a set of observations fits an appropriate model; know how to calculate

More information

Structure learning in human causal induction

Structure learning in human causal induction Structure learning in human causal induction Joshua B. Tenenbaum & Thomas L. Griffiths Department of Psychology Stanford University, Stanford, CA 94305 jbt,gruffydd @psych.stanford.edu Abstract We use

More information

Final Exam, Fall 2002

Final Exam, Fall 2002 15-781 Final Exam, Fall 22 1. Write your name and your andrew email address below. Name: Andrew ID: 2. There should be 17 pages in this exam (excluding this cover sheet). 3. If you need more room to work

More information

10/17/04. Today s Main Points

10/17/04. Today s Main Points Part-of-speech Tagging & Hidden Markov Model Intro Lecture #10 Introduction to Natural Language Processing CMPSCI 585, Fall 2004 University of Massachusetts Amherst Andrew McCallum Today s Main Points

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Brown University CSCI 1950-F, Spring 2012 Prof. Erik Sudderth Lecture 25: Markov Chain Monte Carlo (MCMC) Course Review and Advanced Topics Many figures courtesy Kevin

More information

Grades 6 8 Overview of Science and Engineering Practices

Grades 6 8 Overview of Science and Engineering Practices Grades 6 8 Overview of Science and Engineering Practices Active engagement of middle school students with the science and engineering practices is critical as students generally make up their minds about

More information

Recall from last time: Conditional probabilities. Lecture 2: Belief (Bayesian) networks. Bayes ball. Example (continued) Example: Inference problem

Recall from last time: Conditional probabilities. Lecture 2: Belief (Bayesian) networks. Bayes ball. Example (continued) Example: Inference problem Recall from last time: Conditional probabilities Our probabilistic models will compute and manipulate conditional probabilities. Given two random variables X, Y, we denote by Lecture 2: Belief (Bayesian)

More information

Applying Latent Dirichlet Allocation to Group Discovery in Large Graphs

Applying Latent Dirichlet Allocation to Group Discovery in Large Graphs Lawrence Livermore National Laboratory Applying Latent Dirichlet Allocation to Group Discovery in Large Graphs Keith Henderson and Tina Eliassi-Rad keith@llnl.gov and eliassi@llnl.gov This work was performed

More information

Lecture 3: Pattern Classification

Lecture 3: Pattern Classification EE E6820: Speech & Audio Processing & Recognition Lecture 3: Pattern Classification 1 2 3 4 5 The problem of classification Linear and nonlinear classifiers Probabilistic classification Gaussians, mixtures

More information

Bayesian probability theory and generative models

Bayesian probability theory and generative models Bayesian probability theory and generative models Bruno A. Olshausen November 8, 2006 Abstract Bayesian probability theory provides a mathematical framework for peforming inference, or reasoning, using

More information

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2014

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2014 Bayesian Networks: Construction, Inference, Learning and Causal Interpretation Volker Tresp Summer 2014 1 Introduction So far we were mostly concerned with supervised learning: we predicted one or several

More information

EEL 851: Biometrics. An Overview of Statistical Pattern Recognition EEL 851 1

EEL 851: Biometrics. An Overview of Statistical Pattern Recognition EEL 851 1 EEL 851: Biometrics An Overview of Statistical Pattern Recognition EEL 851 1 Outline Introduction Pattern Feature Noise Example Problem Analysis Segmentation Feature Extraction Classification Design Cycle

More information

Bayes Rule. CS789: Machine Learning and Neural Network Bayesian learning. A Side Note on Probability. What will we learn in this lecture?

Bayes Rule. CS789: Machine Learning and Neural Network Bayesian learning. A Side Note on Probability. What will we learn in this lecture? Bayes Rule CS789: Machine Learning and Neural Network Bayesian learning P (Y X) = P (X Y )P (Y ) P (X) Jakramate Bootkrajang Department of Computer Science Chiang Mai University P (Y ): prior belief, prior

More information

Dimension Reduction (PCA, ICA, CCA, FLD,

Dimension Reduction (PCA, ICA, CCA, FLD, Dimension Reduction (PCA, ICA, CCA, FLD, Topic Models) Yi Zhang 10-701, Machine Learning, Spring 2011 April 6 th, 2011 Parts of the PCA slides are from previous 10-701 lectures 1 Outline Dimension reduction

More information

The Origin of Deep Learning. Lili Mou Jan, 2015

The Origin of Deep Learning. Lili Mou Jan, 2015 The Origin of Deep Learning Lili Mou Jan, 2015 Acknowledgment Most of the materials come from G. E. Hinton s online course. Outline Introduction Preliminary Boltzmann Machines and RBMs Deep Belief Nets

More information

Click Prediction and Preference Ranking of RSS Feeds

Click Prediction and Preference Ranking of RSS Feeds Click Prediction and Preference Ranking of RSS Feeds 1 Introduction December 11, 2009 Steven Wu RSS (Really Simple Syndication) is a family of data formats used to publish frequently updated works. RSS

More information

DATA MINING LECTURE 10

DATA MINING LECTURE 10 DATA MINING LECTURE 10 Classification Nearest Neighbor Classification Support Vector Machines Logistic Regression Naïve Bayes Classifier Supervised Learning 10 10 Illustrating Classification Task Tid Attrib1

More information

Naïve Bayes Introduction to Machine Learning. Matt Gormley Lecture 18 Oct. 31, 2018

Naïve Bayes Introduction to Machine Learning. Matt Gormley Lecture 18 Oct. 31, 2018 10-601 Introduction to Machine Learning Machine Learning Department School of Computer Science Carnegie Mellon University Naïve Bayes Matt Gormley Lecture 18 Oct. 31, 2018 1 Reminders Homework 6: PAC Learning

More information

1 Probabilities. 1.1 Basics 1 PROBABILITIES

1 Probabilities. 1.1 Basics 1 PROBABILITIES 1 PROBABILITIES 1 Probabilities Probability is a tricky word usually meaning the likelyhood of something occuring or how frequent something is. Obviously, if something happens frequently, then its probability

More information

Machine Learning, Fall 2009: Midterm

Machine Learning, Fall 2009: Midterm 10-601 Machine Learning, Fall 009: Midterm Monday, November nd hours 1. Personal info: Name: Andrew account: E-mail address:. You are permitted two pages of notes and a calculator. Please turn off all

More information

Computational Cognitive Science

Computational Cognitive Science Computational Cognitive Science Lecture 9: A Bayesian model of concept learning Chris Lucas School of Informatics University of Edinburgh October 16, 218 Reading Rules and Similarity in Concept Learning

More information

Latent Dirichlet Allocation (LDA)

Latent Dirichlet Allocation (LDA) Latent Dirichlet Allocation (LDA) A review of topic modeling and customer interactions application 3/11/2015 1 Agenda Agenda Items 1 What is topic modeling? Intro Text Mining & Pre-Processing Natural Language

More information

Gaussian and Linear Discriminant Analysis; Multiclass Classification

Gaussian and Linear Discriminant Analysis; Multiclass Classification Gaussian and Linear Discriminant Analysis; Multiclass Classification Professor Ameet Talwalkar Slide Credit: Professor Fei Sha Professor Ameet Talwalkar CS260 Machine Learning Algorithms October 13, 2015

More information

Pig organ transplants within 5 years

Pig organ transplants within 5 years www.breaking News English.com Ready-to-use ESL / EFL Lessons Pig organ transplants within 5 years URL: http://www.breakingnewsenglish.com/0509/050911-xenotransplant.html Today s contents The Article 2

More information

CMPT Machine Learning. Bayesian Learning Lecture Scribe for Week 4 Jan 30th & Feb 4th

CMPT Machine Learning. Bayesian Learning Lecture Scribe for Week 4 Jan 30th & Feb 4th CMPT 882 - Machine Learning Bayesian Learning Lecture Scribe for Week 4 Jan 30th & Feb 4th Stephen Fagan sfagan@sfu.ca Overview: Introduction - Who was Bayes? - Bayesian Statistics Versus Classical Statistics

More information

DEPARTMENT OF COMPUTER SCIENCE Autumn Semester MACHINE LEARNING AND ADAPTIVE INTELLIGENCE

DEPARTMENT OF COMPUTER SCIENCE Autumn Semester MACHINE LEARNING AND ADAPTIVE INTELLIGENCE Data Provided: None DEPARTMENT OF COMPUTER SCIENCE Autumn Semester 203 204 MACHINE LEARNING AND ADAPTIVE INTELLIGENCE 2 hours Answer THREE of the four questions. All questions carry equal weight. Figures

More information

CPSC 340: Machine Learning and Data Mining. More PCA Fall 2017

CPSC 340: Machine Learning and Data Mining. More PCA Fall 2017 CPSC 340: Machine Learning and Data Mining More PCA Fall 2017 Admin Assignment 4: Due Friday of next week. No class Monday due to holiday. There will be tutorials next week on MAP/PCA (except Monday).

More information

Text Mining for Economics and Finance Latent Dirichlet Allocation

Text Mining for Economics and Finance Latent Dirichlet Allocation Text Mining for Economics and Finance Latent Dirichlet Allocation Stephen Hansen Text Mining Lecture 5 1 / 45 Introduction Recall we are interested in mixed-membership modeling, but that the plsi model

More information

KEY CONCEPTS AND PROCESS SKILLS. 2. Most infectious diseases are caused by microbes.

KEY CONCEPTS AND PROCESS SKILLS. 2. Most infectious diseases are caused by microbes. Who s Who? 44 40- to 1 50-minute session ACTIVITY OVERVIEW I N V E S T I O N I G AT SUMMARY Cards with images of the major groups of disease-causing microbes (s, bacteria, and es) are presented. Students

More information

CS281 Section 4: Factor Analysis and PCA

CS281 Section 4: Factor Analysis and PCA CS81 Section 4: Factor Analysis and PCA Scott Linderman At this point we have seen a variety of machine learning models, with a particular emphasis on models for supervised learning. In particular, we

More information

Language Supportive Teaching and Textbooks in Tanzania. Course for textbook writers, editors and illustrators John Clegg, July 2013

Language Supportive Teaching and Textbooks in Tanzania. Course for textbook writers, editors and illustrators John Clegg, July 2013 LSTT Language Supportive Teaching and Textbooks in Tanzania Course for textbook writers, editors and illustrators John Clegg, July 2013 Parts 1-3 are in separate document Part 4: Biology book lesson structure

More information

The Little Chicken Named

The Little Chicken Named The Little Chicken Named By Wanda Steward READING GUIDE At Pearson we are dedicated to helping people make progress in their lives through learning and we re proud that our work with Project Literacy,

More information

The Particle Filter. PD Dr. Rudolph Triebel Computer Vision Group. Machine Learning for Computer Vision

The Particle Filter. PD Dr. Rudolph Triebel Computer Vision Group. Machine Learning for Computer Vision The Particle Filter Non-parametric implementation of Bayes filter Represents the belief (posterior) random state samples. by a set of This representation is approximate. Can represent distributions that

More information

STA 4273H: Statistical Machine Learning

STA 4273H: Statistical Machine Learning STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 11 Project

More information

STA 414/2104: Machine Learning

STA 414/2104: Machine Learning STA 414/2104: Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistics! rsalakhu@cs.toronto.edu! http://www.cs.toronto.edu/~rsalakhu/ Lecture 9 Sequential Data So far

More information

Learning the Semantic Correlation: An Alternative Way to Gain from Unlabeled Text

Learning the Semantic Correlation: An Alternative Way to Gain from Unlabeled Text Learning the Semantic Correlation: An Alternative Way to Gain from Unlabeled Text Yi Zhang Machine Learning Department Carnegie Mellon University yizhang1@cs.cmu.edu Jeff Schneider The Robotics Institute

More information

Machine Learning! in just a few minutes. Jan Peters Gerhard Neumann

Machine Learning! in just a few minutes. Jan Peters Gerhard Neumann Machine Learning! in just a few minutes Jan Peters Gerhard Neumann 1 Purpose of this Lecture Foundations of machine learning tools for robotics We focus on regression methods and general principles Often

More information

Latent Dirichlet Allocation Based Multi-Document Summarization

Latent Dirichlet Allocation Based Multi-Document Summarization Latent Dirichlet Allocation Based Multi-Document Summarization Rachit Arora Department of Computer Science and Engineering Indian Institute of Technology Madras Chennai - 600 036, India. rachitar@cse.iitm.ernet.in

More information

Approximate Bayesian Computation

Approximate Bayesian Computation Approximate Bayesian Computation Michael Gutmann https://sites.google.com/site/michaelgutmann University of Helsinki and Aalto University 1st December 2015 Content Two parts: 1. The basics of approximate

More information

where Female = 0 for males, = 1 for females Age is measured in years (22, 23, ) GPA is measured in units on a four-point scale (0, 1.22, 3.45, etc.

where Female = 0 for males, = 1 for females Age is measured in years (22, 23, ) GPA is measured in units on a four-point scale (0, 1.22, 3.45, etc. Notes on regression analysis 1. Basics in regression analysis key concepts (actual implementation is more complicated) A. Collect data B. Plot data on graph, draw a line through the middle of the scatter

More information

Human-level concept learning through probabilistic program induction

Human-level concept learning through probabilistic program induction B.M Lake, R. Salakhutdinov, J.B. Tenenbaum Human-level concept learning through probabilistic program induction journal club at two aspects in which machine learning spectacularly lags behind human learning

More information

Lecture 19, November 19, 2012

Lecture 19, November 19, 2012 Machine Learning 0-70/5-78, Fall 0 Latent Space Analysis SVD and Topic Models Eric Xing Lecture 9, November 9, 0 Reading: Tutorial on Topic Model @ ACL Eric Xing @ CMU, 006-0 We are inundated with data

More information

Grade Seven Science Focus on Life Sciences. Main Ideas in the Study of Cells

Grade Seven Science Focus on Life Sciences. Main Ideas in the Study of Cells Grade Seven Science Focus on Life Sciences Main Ideas in the Study of Cells Research is an effective way to develop a deeper understanding of challenging content. The following fill-in-the-blanks activity

More information

Tutorial on Approximate Bayesian Computation

Tutorial on Approximate Bayesian Computation Tutorial on Approximate Bayesian Computation Michael Gutmann https://sites.google.com/site/michaelgutmann University of Helsinki Aalto University Helsinki Institute for Information Technology 16 May 2016

More information

Bayesian Learning. Examples. Conditional Probability. Two Roles for Bayesian Methods. Prior Probability and Random Variables. The Chain Rule P (B)

Bayesian Learning. Examples. Conditional Probability. Two Roles for Bayesian Methods. Prior Probability and Random Variables. The Chain Rule P (B) Examples My mood can take 2 possible values: happy, sad. The weather can take 3 possible vales: sunny, rainy, cloudy My friends know me pretty well and say that: P(Mood=happy Weather=rainy) = 0.25 P(Mood=happy

More information

McDougal Littell Science, Cells and Heredity MAZER PDF. IL Essential Lesson. IL Extend Lesson. Program Planning Guide LP page.

McDougal Littell Science, Cells and Heredity MAZER PDF. IL Essential Lesson. IL Extend Lesson. Program Planning Guide LP page. s7an-ppg-pc-il-002-012.indd 2 7/18/05 2:46:40 PM 2 McDougal Littell Science, Cells and Heredity Chapter 1: The Cell, pp. 6 37 1.1 The cell is the basic unit of living things. pp. 9 17 Explore: Activity

More information

Leigha Schultze 10/13/15 Blog Period 3

Leigha Schultze 10/13/15 Blog Period 3 Leigha Schultze 10/13/15 Blog Period 3 Monday We did not have school on Monday due to Columbus day. Tuesday Today we did an exciting Gizmo on plate tectonics! Before we began, we answered two questions

More information

Introduction to Biology

Introduction to Biology 2- Introduction to Biology Why is Biology important? To study DNA: forensics Health, medicine. Agriculture Animals Bacteria/ Viruses! BIO=life LOGY=study Biology : The study of life 1- Copyright The McGraw-Hill

More information

Draft Proof - Do not copy, post, or distribute

Draft Proof - Do not copy, post, or distribute 1 LEARNING OBJECTIVES After reading this chapter, you should be able to: 1. Distinguish between descriptive and inferential statistics. Introduction to Statistics 2. Explain how samples and populations,

More information

Dynamic Probabilistic Models for Latent Feature Propagation in Social Networks

Dynamic Probabilistic Models for Latent Feature Propagation in Social Networks Dynamic Probabilistic Models for Latent Feature Propagation in Social Networks Creighton Heaukulani and Zoubin Ghahramani University of Cambridge TU Denmark, June 2013 1 A Network Dynamic network data

More information

Latent Dirichlet Alloca/on

Latent Dirichlet Alloca/on Latent Dirichlet Alloca/on Blei, Ng and Jordan ( 2002 ) Presented by Deepak Santhanam What is Latent Dirichlet Alloca/on? Genera/ve Model for collec/ons of discrete data Data generated by parameters which

More information