24 Multiple Eigenvectors; Latent Factor Analysis; Nearest Neighbors
Jonathan Richard Shewchuk
Clustering w/Multiple Eigenvectors

[When we use the Fiedler vector for spectral graph clustering, it tells us how to divide a graph into two graphs. If we want more than two clusters, we can use divisive clustering: we repeatedly cut the subgraphs into smaller subgraphs by computing their Fiedler vectors. However, there are several other methods to subdivide a graph into k clusters in one shot that use multiple eigenvectors rather than just the Fiedler vector v_2. These methods are usually faster and sometimes give better results. They use k eigenvectors in a natural way to cluster a graph into k subgraphs.]

For k clusters, compute the first k eigenvectors v_1, v_2, ..., v_k of the generalized eigensystem Lv = λMv.

V = [v_1 v_2 ... v_k], an n × k matrix. [V's columns are the eigenvectors with the k smallest eigenvalues. Yes, we do include the all-1's vector v_1 as one of the columns of V.]

[Draw this by hand. eigenvectors.pdf]

Row V_i is the spectral vector [my name] for vertex i.

[The rows are vectors in a k-dimensional space I'll call the "spectral space." When we were using just one eigenvector, it made sense to cluster vertices together if their components were close together. When we use more than one eigenvector, it turns out that it makes sense to cluster vertices together if their spectral vectors point in similar directions.]

Normalize each row V_i to unit length.

[Now you can think of the spectral vectors as points on a unit sphere centered at the origin.]

[Draw this by hand. vectorclusters.png] [A 2D example showing two clusters on a circle. If the graph has k components, the points in each cluster will have identical spectral vectors that are exactly orthogonal to all the other components' spectral vectors (left). If we modify the graph by connecting these components with small-weight edges, we get vectors more like those at right: not exactly orthogonal, but still tending toward distinct clusters.]

k-means cluster these vectors.
[Because all the spectral vectors lie on the sphere, k-means clustering will cluster together vectors that are separated by small angles.]
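The procedure above can be sketched in a few lines of NumPy. This is a sketch under assumptions, not the exact in-lecture recipe: it solves the generalized eigensystem Lv = λMv via the symmetrically normalized matrix M^(-1/2) L M^(-1/2), substitutes a simple farthest-first-seeded Lloyd's k-means for a library implementation, and the example graph (two triangles joined by one weak edge) is my own illustration.

```python
import numpy as np

def spectral_cluster(W, k, n_iter=50):
    """Cluster the weighted graph W into k groups using the k eigenvectors
    with smallest eigenvalues of the generalized eigensystem L v = lambda M v,
    where L is the Laplacian and M holds the vertex degrees (masses)."""
    d = W.sum(axis=1)
    L = np.diag(d) - W
    M_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    # L v = lambda M v has the same eigenvalues as the symmetric matrix
    # M^(-1/2) L M^(-1/2), whose eigenvectors we rescale by M^(-1/2).
    evals, evecs = np.linalg.eigh(M_inv_sqrt @ L @ M_inv_sqrt)  # ascending order
    V = M_inv_sqrt @ evecs[:, :k]                  # n x k; row i = spectral vector
    V /= np.linalg.norm(V, axis=1, keepdims=True)  # put rows on the unit sphere
    # Lloyd's k-means on the rows, with deterministic farthest-first seeding.
    idx = [0]
    for _ in range(k - 1):
        gaps = np.min(((V[:, None] - V[idx][None]) ** 2).sum(-1), axis=1)
        idx.append(int(np.argmax(gaps)))
    centers = V[idx].copy()
    for _ in range(n_iter):
        labels = np.argmin(((V[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = V[labels == j].mean(axis=0)
    return labels

# Two triangles joined by one low-weight edge: vertices 0-2 vs. 3-5.
W = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.05
labels = spectral_cluster(W, 2)
```

With the weak bridge, the second spectral coordinate changes sign between the two triangles, so the row vectors point in two distinct directions and k-means recovers the two triangles.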
248 Jonathan Richard Shewchuk

compkmeans.png, compspectral.png [Comparison of point sets clustered by plain k-means (just k-means by itself, that is) vs. a spectral method. To create a graph for the spectral method, we use an exponentially decaying function to assign weights to pairs of points, like the one we used for image segmentation but without the brightnesses.]

Invented by [our own] Prof. Michael Jordan, Andrew Ng [when he was still a student at Berkeley], and Yair Weiss. [This wasn't the first algorithm to use multiple eigenvectors for spectral clustering, but it has become one of the most popular.]
LATENT FACTOR ANALYSIS [aka Latent Semantic Indexing]

[You can think of this as dimensionality reduction for matrices.]

Suppose X is a term-document matrix: [aka bag-of-words model]
row i represents document i; column j represents term j. [Term = word.] [Term-document matrices are usually sparse, meaning most entries are zero.]

X_ij = occurrences of term j in doc i
better: log(1 + occurrences) [So frequent words don't dominate.]

[Better still is to weight the entries so rare words give big entries and common words like "the" give small entries. To do that, you need to know how frequently each word occurs in general. I'll omit the details, but this is the common practice.]

Recall SVD X = UDV^⊤ = Σ_{i=1}^d δ_i u_i v_i^⊤, with the singular values indexed in descending order: δ_i ≥ δ_j for i ≤ j. Unlike PCA, we usually don't center X.

For the greatest δ_i,
each v_i lists terms in a genre/cluster of documents;
each u_i lists docus in a genre using similar/related terms.

E.g. u_1 might have large components for the romance novels, v_1 for terms "passion," "ravish," "bodice" ... [... and δ_1 would give us an idea how much bigger the romance novel market is than the markets for every other genre of books.]

[v_1 and u_1 tell us that there is a large subset of books that tend to use the same large subset of words. We can read the words by looking at the larger components of v_1, and we can read the books by looking at the larger components of u_1.]

[The property of being a romance novel is an example of a latent factor. So is the property of being the sort of word used in romance novels. There's nothing in X that tells you explicitly that romance novels exist, but the genre is a hidden connection between them that gives them a large singular value. The vector u_1 reveals which books have that genre, and v_1 reveals which words are emphasized in that genre.]

Like clustering, but clusters overlap: if u_1 picks out romances & u_2 picks out histories, they both pick out historical romances. [So you can think of latent factor analysis as a sort of clustering that permits clusters to overlap.
Another way in which it differs from traditional clustering is that the u-vectors contain real numbers, and so some points have stronger cluster membership than others. One book might be just a bit romance, another a lot.]
Application in market research: identifying consumer types (hipster, soccer mom) & items bought together.

[For applications like this, the first few singular vectors are the most useful. Most of the singular vectors are mostly noise, and they have small singular values to tell you so. This motivates approximating a matrix by using only some of its singular vectors.]

Truncated sum X′ = Σ_{i=1}^r δ_i u_i v_i^⊤ is a low-rank approximation of X, of rank r. [We choose the singular vectors with the largest singular values, because they carry the most information.]

[Draw this by hand. truncate.pdf]

X′ is the rank-r matrix that minimizes the [squared] Frobenius norm
‖X − X′‖_F² = Σ_{i,j} (X_ij − X′_ij)²

Applications:
Fuzzy search. [Suppose you want to find a document about gasoline prices, but the document you want doesn't have the word "gasoline"; it has the word "petrol." One cool thing about the reduced-rank matrix X′ is that it will probably associate that document with "gasoline," because the SVD tends to group synonyms together.]
Denoising. [The idea is to assume that X is a noisy measurement of some unknown matrix that probably has low rank. If that assumption is partly true, then the reduced-rank matrix X′ might be better than the input X.]
Matrix compression. [As you can see above, if we use a low-rank approximation with a small rank r, we can express the approximate matrix as an SVD that takes up much less space than the original matrix. Often this low-rank approximation supports faster matrix computations.]
Collaborative filtering: fills in unknown values, e.g. user ratings. [Suppose the rows of X represent Netflix users and the columns represent movies. The entry X_ij is the review score that user i gave to movie j. But most users haven't reviewed most movies. We want to fill in the missing values. Just as the rank reduction will associate "petrol" with "gasoline," it will tend to associate users with similar tastes in movies, so the reduced-rank matrix X′ can predict ratings for users who didn't supply any.
You'll try this out in the last homework.]
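The truncated sum X′ is easy to compute from the full SVD. A minimal NumPy sketch: the log(1 + occurrences) weighting and the rank-r truncation follow the text, while the toy "two-genre" term-document matrix is my own illustration.

```python
import numpy as np

def low_rank(X, r):
    """Rank-r truncation of the SVD: keep the r largest singular values,
    i.e. X' = sum_{i <= r} delta_i u_i v_i^T."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)  # s sorted descending
    return (U[:, :r] * s[:r]) @ Vt[:r]

rng = np.random.default_rng(0)
# Toy term-document counts generated from two latent "genres" plus noise:
# 20 documents, 8 terms.
counts = rng.poisson(3, (20, 2)) @ rng.poisson(2, (2, 8)) + rng.poisson(0.5, (20, 8))
X = np.log1p(counts)          # log(1 + occurrences), so frequent words don't dominate
X2 = low_rank(X, 2)
err = np.linalg.norm(X - X2)  # Frobenius-norm error of the best rank-2 approximation
```

By the Frobenius-norm optimality stated above, the squared error equals the sum of the squared discarded singular values; no rank-2 matrix does better.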
NEAREST NEIGHBOR CLASSIFICATION

[We're done with unsupervised learning. Now I'm going back to classifiers, and I saved the simplest for the end of the semester.]

Idea: Given query point q, find the k sample points nearest q.
Distance metric of your choice.
Regression: Return average label of the k points.
Classification: Return class with the most votes from the k points OR return histogram of class probabilities.

[The histogram of class probabilities tries to estimate the posterior probabilities of the classes. Obviously, the histogram has limited precision. If k = 3, then the only probabilities you'll ever return are 0, 1/3, 2/3, or 1. You can improve the precision by making k larger, but you might underfit. The histogram works best when you have a huge amount of data.]

allnn.pdf (ISL, Figures 2.15, 2.16) [Examples of 1-NN, 10-NN, and 100-NN. A larger k smooths out the boundary. In this example, the 1-NN classifier is badly overfitting the data, and the 100-NN classifier is badly underfitting. The 10-NN classifier does well: it's reasonably close to the Bayes decision boundary. Generally, the ideal k depends on how dense your data is. As your data gets denser, the best k increases.]

[There are theorems showing that if you have a lot of data, nearest neighbors can work quite well.]

Theorem (Cover & Hart, 1967):
As n → ∞, the 1-NN error rate is < B(2 − B), where B = Bayes risk.
If there are only 2 classes, it is ≤ 2B(1 − B).

[There are a few technical requirements of this theorem. The most important is that the training points and the test points all have to be drawn independently from the same probability distribution, just like in our last lecture on learning theory. The theorem applies to any separable metric space, so it's not just for the Euclidean metric.]

[By the way, this Cover is the same Thomas Cover of Cover's Theorem in the last lecture. He's a professor in Electrical Engineering and Statistics at Stanford, and these are the first and third journal articles he published.]
Theorem (Fix & Hodges, 1951):
As n → ∞, k → ∞, and k/n → 0, the k-NN error rate converges to B. [Which means it is optimal.]
The Geometry of High-Dimensional Spaces

Consider the shell between spheres of radii r and r − ε.

[Draw this by hand. concentric.png] [Concentric balls. In high dimensions, almost every point chosen uniformly at random in the outer ball lies outside the inner ball.]

Volume of outer ball ∝ r^d
Volume of inner ball ∝ (r − ε)^d
Ratio of inner ball volume to outer = (r − ε)^d / r^d = (1 − ε/r)^d ≈ exp(−εd/r), which is small for large d.

E.g. if ε/r = 0.1 & d = 100, the inner ball has about 0.0027% of the volume.

Random points from a uniform distribution in the ball: nearly all are in the outer shell.
Random points from a Gaussian: nearly all are in some shell.

[If the dimension is very high, the majority of the random points generated from an isotropic Gaussian distribution are approximately at the same distance from the center. So they lie in a thin shell. As the dimension grows, the standard deviation of a random point's distance to the center gets smaller and smaller compared to the distance itself. You can think of a point from a multivariate Gaussian distribution as a sample of d scalar values from a univariate Gaussian. As d gets bigger, the mean of the squares of the components converges to the true mean for the population.]

[This is one of the things that makes machine learning hard in high dimensions. Sometimes the nearest neighbor and the farthest neighbor aren't much different.]

Exhaustive k-NN Alg.

Given query point q:
Scan through all n sample points, computing (squared) distances to q.
Maintain a max-heap with the k shortest distances seen so far.

[Whenever you encounter a sample point closer to q than the point at the top of the heap, you remove the heap-top point and insert the better point. Obviously you don't need a heap if k = 1 or even 3, but for larger k a heap will substantially speed up keeping track of the distance to beat.]

Time to construct classifier: 0
[This is the only O(0)-time algorithm we'll learn this semester.]
Query time: O(nd + n log k)
expected O(nd + k log n log k) if points are scanned in random order

[It's a cute theoretical observation that you can slightly improve the expected running time by randomizing the point order so that only expected O(k log n) heap operations occur. But in practice I don't recommend it; you'll probably lose more from cache misses than you'll gain from fewer heap operations.]
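The exhaustive algorithm above, sketched with Python's heapq. heapq is a min-heap, so the max-heap of the k shortest distances is simulated by storing negated squared distances; the small two-color point set and the majority-vote classification are my own illustration.

```python
import heapq
from collections import Counter

def knn_classify(points, labels, q, k):
    """One pass over all n sample points, keeping the k smallest squared
    distances to q in a max-heap, then returning the majority class."""
    heap = []                                  # entries: (-squared_dist, index)
    for i, p in enumerate(points):
        d2 = sum((a - b) ** 2 for a, b in zip(p, q))
        if len(heap) < k:
            heapq.heappush(heap, (-d2, i))
        elif d2 < -heap[0][0]:                 # closer than the current k-th best
            heapq.heapreplace(heap, (-d2, i))
    votes = Counter(labels[i] for _, i in heap)
    return votes.most_common(1)[0][0]

pts = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
ys  = ["red", "red", "red", "blue", "blue", "blue"]
print(knn_classify(pts, ys, (0.2, 0.1), 3))    # the 3 nearest points are all red
```

Each of the n points costs O(d) for the distance plus at most O(log k) for a heap operation, matching the O(nd + n log k) query time stated above.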
Random Projection

Projections onto a low-dimensional space, like PCA, speed up NN, but distances are approximate.

[Most fast nearest-neighbor algorithms in more than a few dimensions are approximate nearest neighbor algorithms; we don't necessarily expect to find the exact nearest neighbors. For classification, that's usually sufficient.]

Random projection is a cheap alternative to PCA as a preprocess for NN or clustering. [It projects onto a random subspace instead of the best subspace, but takes a fraction of the time of PCA.]

Pick a random subspace S ⊆ R^d of dimension k, where

k = ⌈ 2 ln(1/δ) / (ε²/2 − ε³/3) ⌉.

For any point q, let q̂ be √(d/k) times the [orthogonal] projection of q onto S.

For any two points q, w ∈ R^d,
(1 − ε) ‖q − w‖² ≤ ‖q̂ − ŵ‖² ≤ (1 + ε) ‖q − w‖²
with probability ≥ 1 − 2δ.

[So the distance between the two points after projecting is rarely much different from the distance before. For reasonably reliable clustering or approximate nearest neighbor search, it's customary to choose δ ≤ 1/n². In practice, you can experiment with k to find the best speed-accuracy tradeoff. But the key observation is that you need a subspace of dimension only Θ(log n). The hidden constant is large, though. For example, you can bring 1,000,000 sample points down to a 10,362-dimensional space with a 10% error in the distances.]

[If you want a proof of this, look up references about the Johnson–Lindenstrauss Lemma.]

100000to1000.pdf [Comparison of inter-point distances before and after projecting points in a 100,000-dimensional space down to 1,000 dimensions. This example suggests that the theoretical bounds are a bit pessimistic compared to practice.]

[Why does this work? A random projection of a vector is equivalent to taking a random vector and selecting k components. The mean of the squares of those k sampled components approximates the mean for the whole population.]

[How do you get a uniformly distributed random projection direction? You can choose each component from a univariate Gaussian distribution, then normalize the vector to unit length. How do you get a random subspace?
You can choose k random vectors, then use Gram–Schmidt orthogonalization to make them mutually orthonormal. Interestingly, Indyk and Motwani show that if you skip the expensive normalization and Gram–Schmidt steps, random projection still works almost as well, because random vectors in a high-dimensional space are nearly equal in length and nearly orthogonal to each other with high probability.]
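A quick empirical check of the distance-preservation claim, using the Indyk–Motwani shortcut just described: a plain Gaussian matrix with no normalization or Gram–Schmidt, scaled by 1/√k so that squared lengths are preserved in expectation (this scaling plays the role of the √(d/k) factor used with a true orthogonal projection). The dimensions and the tolerance are my own choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50, 5_000, 1_000

X = rng.standard_normal((n, d))               # n sample points in d dimensions
# Unnormalized Gaussian "projection": rows are i.i.d. N(0, 1) vectors.
# With the 1/sqrt(k) scale, E[|P q|^2] = |q|^2 for every q.
P = rng.standard_normal((k, d)) / np.sqrt(k)
X_hat = X @ P.T                               # projected points in k dimensions

def sq_dists(A):
    """All pairwise squared Euclidean distances."""
    G = A @ A.T
    nrm = np.diag(G)
    return nrm[:, None] + nrm[None, :] - 2 * G

before = sq_dists(X)
after = sq_dists(X_hat)
iu = np.triu_indices(n, 1)                    # distinct pairs only
ratios = after[iu] / before[iu]
# Each ratio is a chi-square_k / k variable: mean 1, std ~ sqrt(2/k) ~ 4.5%.
```

Even without orthogonalization, every pairwise squared distance typically lands within a few percent of its original value, consistent with the picture in 100000to1000.pdf.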
More informationPhysics 212. Lecture 12. Today's Concept: Magnetic Force on moving charges. Physics 212 Lecture 12, Slide 1
Physics 1 Lecture 1 Tday's Cncept: Magnetic Frce n mving charges F qv Physics 1 Lecture 1, Slide 1 Music Wh is the Artist? A) The Meters ) The Neville rthers C) Trmbne Shrty D) Michael Franti E) Radiatrs
More informationSUPPLEMENTARY MATERIAL GaGa: a simple and flexible hierarchical model for microarray data analysis
SUPPLEMENTARY MATERIAL GaGa: a simple and flexible hierarchical mdel fr micrarray data analysis David Rssell Department f Bistatistics M.D. Andersn Cancer Center, Hustn, TX 77030, USA rsselldavid@gmail.cm
More informationChemistry 20 Lesson 11 Electronegativity, Polarity and Shapes
Chemistry 20 Lessn 11 Electrnegativity, Plarity and Shapes In ur previus wrk we learned why atms frm cvalent bnds and hw t draw the resulting rganizatin f atms. In this lessn we will learn (a) hw the cmbinatin
More information[COLLEGE ALGEBRA EXAM I REVIEW TOPICS] ( u s e t h i s t o m a k e s u r e y o u a r e r e a d y )
(Abut the final) [COLLEGE ALGEBRA EXAM I REVIEW TOPICS] ( u s e t h i s t m a k e s u r e y u a r e r e a d y ) The department writes the final exam s I dn't really knw what's n it and I can't very well
More information4th Indian Institute of Astrophysics - PennState Astrostatistics School July, 2013 Vainu Bappu Observatory, Kavalur. Correlation and Regression
4th Indian Institute f Astrphysics - PennState Astrstatistics Schl July, 2013 Vainu Bappu Observatry, Kavalur Crrelatin and Regressin Rahul Ry Indian Statistical Institute, Delhi. Crrelatin Cnsider a tw
More informationComputational modeling techniques
Cmputatinal mdeling techniques Lecture 4: Mdel checing fr ODE mdels In Petre Department f IT, Åb Aademi http://www.users.ab.fi/ipetre/cmpmd/ Cntent Stichimetric matrix Calculating the mass cnservatin relatins
More informationMidwest Big Data Summer School: Machine Learning I: Introduction. Kris De Brabanter
Midwest Big Data Summer Schl: Machine Learning I: Intrductin Kris De Brabanter kbrabant@iastate.edu Iwa State University Department f Statistics Department f Cmputer Science June 24, 2016 1/24 Outline
More informationChecking the resolved resonance region in EXFOR database
Checking the reslved resnance regin in EXFOR database Gttfried Bertn Sciété de Calcul Mathématique (SCM) Oscar Cabells OECD/NEA Data Bank JEFF Meetings - Sessin JEFF Experiments Nvember 0-4, 017 Bulgne-Billancurt,
More informationSlide04 (supplemental) Haykin Chapter 4 (both 2nd and 3rd ed): Multi-Layer Perceptrons
Slide04 supplemental) Haykin Chapter 4 bth 2nd and 3rd ed): Multi-Layer Perceptrns CPSC 636-600 Instructr: Ynsuck Che Heuristic fr Making Backprp Perfrm Better 1. Sequential vs. batch update: fr large
More informationGetting Involved O. Responsibilities of a Member. People Are Depending On You. Participation Is Important. Think It Through
f Getting Invlved O Literature Circles can be fun. It is exciting t be part f a grup that shares smething. S get invlved, read, think, and talk abut bks! Respnsibilities f a Member Remember a Literature
More informationSIZE BIAS IN LINE TRANSECT SAMPLING: A FIELD TEST. Mark C. Otto Statistics Research Division, Bureau of the Census Washington, D.C , U.S.A.
SIZE BIAS IN LINE TRANSECT SAMPLING: A FIELD TEST Mark C. Ott Statistics Research Divisin, Bureau f the Census Washingtn, D.C. 20233, U.S.A. and Kenneth H. Pllck Department f Statistics, Nrth Carlina State
More informationmaking triangle (ie same reference angle) ). This is a standard form that will allow us all to have the X= y=
Intrductin t Vectrs I 21 Intrductin t Vectrs I 22 I. Determine the hrizntal and vertical cmpnents f the resultant vectr by cunting n the grid. X= y= J. Draw a mangle with hrizntal and vertical cmpnents
More informationWriting Guidelines. (Updated: November 25, 2009) Forwards
Writing Guidelines (Updated: Nvember 25, 2009) Frwards I have fund in my review f the manuscripts frm ur students and research assciates, as well as thse submitted t varius jurnals by thers that the majr
More informationIn the OLG model, agents live for two periods. they work and divide their labour income between consumption and
1 The Overlapping Generatins Mdel (OLG) In the OLG mdel, agents live fr tw perids. When ung the wrk and divide their labur incme between cnsumptin and savings. When ld the cnsume their savings. As the
More information37 Maxwell s Equations
37 Maxwell s quatins In this chapter, the plan is t summarize much f what we knw abut electricity and magnetism in a manner similar t the way in which James Clerk Maxwell summarized what was knwn abut
More informationFunction notation & composite functions Factoring Dividing polynomials Remainder theorem & factor property
Functin ntatin & cmpsite functins Factring Dividing plynmials Remainder therem & factr prperty Can d s by gruping r by: Always lk fr a cmmn factr first 2 numbers that ADD t give yu middle term and MULTIPLY
More informationLHS Mathematics Department Honors Pre-Calculus Final Exam 2002 Answers
LHS Mathematics Department Hnrs Pre-alculus Final Eam nswers Part Shrt Prblems The table at the right gives the ppulatin f Massachusetts ver the past several decades Using an epnential mdel, predict the
More information2004 AP CHEMISTRY FREE-RESPONSE QUESTIONS
2004 AP CHEMISTRY FREE-RESPONSE QUESTIONS 6. An electrchemical cell is cnstructed with an pen switch, as shwn in the diagram abve. A strip f Sn and a strip f an unknwn metal, X, are used as electrdes.
More information1996 Engineering Systems Design and Analysis Conference, Montpellier, France, July 1-4, 1996, Vol. 7, pp
THE POWER AND LIMIT OF NEURAL NETWORKS T. Y. Lin Department f Mathematics and Cmputer Science San Jse State University San Jse, Califrnia 959-003 tylin@cs.ssu.edu and Bereley Initiative in Sft Cmputing*
More informationChapter 3 Kinematics in Two Dimensions; Vectors
Chapter 3 Kinematics in Tw Dimensins; Vectrs Vectrs and Scalars Additin f Vectrs Graphical Methds (One and Tw- Dimensin) Multiplicatin f a Vectr b a Scalar Subtractin f Vectrs Graphical Methds Adding Vectrs
More informationA New Evaluation Measure. J. Joiner and L. Werner. The problems of evaluation and the needed criteria of evaluation
III-l III. A New Evaluatin Measure J. Jiner and L. Werner Abstract The prblems f evaluatin and the needed criteria f evaluatin measures in the SMART system f infrmatin retrieval are reviewed and discussed.
More informationTHE LIFE OF AN OBJECT IT SYSTEMS
THE LIFE OF AN OBJECT IT SYSTEMS Persns, bjects, r cncepts frm the real wrld, which we mdel as bjects in the IT system, have "lives". Actually, they have tw lives; the riginal in the real wrld has a life,
More informationCompressibility Effects
Definitin f Cmpressibility All real substances are cmpressible t sme greater r lesser extent; that is, when yu squeeze r press n them, their density will change The amunt by which a substance can be cmpressed
More informationWe can see from the graph above that the intersection is, i.e., [ ).
MTH 111 Cllege Algebra Lecture Ntes July 2, 2014 Functin Arithmetic: With nt t much difficulty, we ntice that inputs f functins are numbers, and utputs f functins are numbers. S whatever we can d with
More informationCHAPTER 2 Algebraic Expressions and Fundamental Operations
CHAPTER Algebraic Expressins and Fundamental Operatins OBJECTIVES: 1. Algebraic Expressins. Terms. Degree. Gruping 5. Additin 6. Subtractin 7. Multiplicatin 8. Divisin Algebraic Expressin An algebraic
More informationOn Huntsberger Type Shrinkage Estimator for the Mean of Normal Distribution ABSTRACT INTRODUCTION
Malaysian Jurnal f Mathematical Sciences 4(): 7-4 () On Huntsberger Type Shrinkage Estimatr fr the Mean f Nrmal Distributin Department f Mathematical and Physical Sciences, University f Nizwa, Sultanate
More informationLab 1 The Scientific Method
INTRODUCTION The fllwing labratry exercise is designed t give yu, the student, an pprtunity t explre unknwn systems, r universes, and hypthesize pssible rules which may gvern the behavir within them. Scientific
More information