Connotation: A Dash of Sentiment Beneath the Surface Meaning
- Roderick Pierce
1 Connotation: A Dash of Sentiment Beneath the Surface Meaning
2-3 Connotation: from com- ("together or with") + notare ("to mark"). The commonly understood cultural or emotional association that a word carries, in addition to its explicit or literal meaning (denotation). Generally described as positive or negative.
4-7 Good News? Bad News? "Decrease in deforestation drives the trend, but emissions from energy and agriculture grow." Glosses: deforestation = the removal of trees; emission = the production and discharge of something, esp. gas or radiation.
8 Learning the General Connotation. "Decrease in deforestation drives the trend, but emissions from energy and agriculture grow": both deforestation and emission are negatively connotative in general.
10-14 Sentiment vs. Connotation. Some words carry sentiment directly: joy (positive), sick (negative). Other words are neutral in sentiment yet still carry connotation: flu, bedbug, deforestation, emission (negative connotation); scientist, music, surfing, rose (positive connotation). Further examples considered: grid, header, blister, salt.
15 Learning the General Connotation: data; linguistic insights; graph representation; inference algorithms; evaluations.
16 Data. Web-derived data: Google Web 1T (Brants and Franz, 2006); n-grams (1 <= n <= 5) with frequency of occurrence, e.g., "prevent financial malware" 4130. Dictionary-derived data: WordNet (Miller, 1995); synsets: synonyms, antonyms.
18-19 Diverse Linguistic Insights: semantic prosody; semantic parallelism of coordination; distributional similarity; semantic relations.
20 Semantic Prosody: Sinclair (1991, 2004), Louw (1993).
21-22 Selectional Preference on Connotation. accident <- avoid accident, cause accident, ...; art <- enjoy the art, appreciate the art, ...; avoid -> avoid bedbugs, avoid danger, ...; enjoy -> enjoy the wine, enjoy life, ... Predicate-argument instances: [enjoy]_Pred [music]_Arg, [prevent]_Pred [bedbugs]_Arg. Arguments that co-occur with such predicates become candidates.
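The selectional-preference intuition on these slides can be sketched as a frequency-weighted vote: an argument inherits a polarity score from the seed predicates that select it. The counts, word lists, and helper names below are illustrative, not the authors' system:

```python
# Toy predicate-argument counts, as might be extracted from n-grams.
pred_arg_counts = {
    ("enjoy", "music"): 120, ("enjoy", "wine"): 80,
    ("prevent", "bedbugs"): 40, ("avoid", "bedbugs"): 60,
    ("avoid", "danger"): 90, ("enjoy", "life"): 200,
}

# Seed predicates and the connotative polarity they prefer in arguments.
pred_preference = {"enjoy": +1, "appreciate": +1, "prevent": -1, "avoid": -1}

def connotation_score(arg):
    """Frequency-weighted vote over the seed predicates selecting `arg`."""
    score, total = 0, 0
    for (pred, a), count in pred_arg_counts.items():
        if a == arg and pred in pred_preference:
            score += pred_preference[pred] * count
            total += count
    return score / total if total else 0.0

print(connotation_score("music"))    # 1.0  -> positive connotation
print(connotation_score("bedbugs"))  # -1.0 -> negative connotation
```

Words never seen with a seed predicate score 0.0 and stay candidates for the graph-based inference described later.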
23-25 Connotative Predicate (Feng et al.): a predicate that has selectional preference on the connotative polarity of some of its semantic arguments.

    Predicate  Sentiment of predicate  Preference on arguments  Example
    suffer     negative                negative                 suffering from cough
    cure       positive                negative                 cure backache
    cause      neutral                 negative                 caused CO2 emissions
26 Seed predicates (Feng et al.). 20 positive connotative predicates: accomplish, achieve, advance, advocate, admire, applaud, appreciate, compliment, congratulate, develop, desire, enhance, enjoy, improve, praise, promote, respect, save, support, win. 20 negative connotative predicates: alleviate, accuse, avert, avoid, cause, complain, condemn, criticize, detect, eliminate, eradicate, mitigate, overcome, prevent, prohibit, protest, refrain, suffer, tolerate, withstand.
27-30 Diverse Linguistic Insights.
Semantic prosody [corpus: Google n-grams]: enjoy *, prevent *; e.g., [enjoy]_Pred [music]_Arg, [prevent]_Pred [bedbugs]_Arg.
Semantic parallelism of coordination [corpus: Google n-grams]: pattern "* and *", e.g., "music and wine".
Distributional similarity [corpus: Google n-grams]: e.g., sim(findings, potentials) > sim(findings, modifications).
Semantic relations [corpus: WordNet]: synonyms, antonyms.
32-34 Graph Representation. Graph G = (V, E); V = {Pred} ∪ {Arg}. E1: Pred-Arg edges between predicates (enjoy, thank, avoid, prevent) and arguments (writing, profit, help, risk). E2: Arg-Arg edges (e.g., writing-reading, profit-investment, help-aid), weighted from coordination counts: "a1 and w1" -> PMI(a1, w1), ..., "a1 and wn" -> PMI(a1, wn), so that a1 := (PMI(a1, w1), PMI(a1, w2), ..., PMI(a1, wn)).
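The PMI vector for an argument can be sketched as follows; the coordination counts and vocabulary below are made-up stand-ins for Google Web 1T frequencies of the "x and y" pattern:

```python
import math
from collections import Counter

# Illustrative coordination counts for the pattern "x and y".
coord_counts = Counter({
    ("music", "wine"): 50, ("music", "art"): 30,
    ("flu", "cold"): 40, ("flu", "fever"): 25,
    ("music", "fever"): 1,
})

total = sum(coord_counts.values())
word_counts = Counter()
for (x, y), c in coord_counts.items():
    word_counts[x] += c
    word_counts[y] += c

def pmi(x, y):
    """Pointwise mutual information estimated from the coordination counts."""
    joint = coord_counts.get((x, y), 0) + coord_counts.get((y, x), 0)
    if joint == 0:
        return float("-inf")
    p_xy = joint / total
    p_x = word_counts[x] / (2 * total)   # each pair yields two word tokens
    p_y = word_counts[y] / (2 * total)
    return math.log(p_xy / (p_x * p_y))

# An argument is then represented by its PMI vector over co-occurring words:
vocab = ["wine", "art", "cold", "fever"]
music_vec = [pmi("music", w) for w in vocab]
```

Coordinated pairs like "music and wine" get large PMI, while accidental pairs like "music and fever" get small or negative PMI, which is what makes these vectors useful as Arg-Arg edge weights.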
35 Graph Representation. Edge types in the full graph: prosody (Pred-Arg), coordination, synonyms, antonyms; over nodes such as prevent, suffer, enjoy, thank, tax, loss, writing, profit, preventing, gain, bonus, investment, cold, flu.
37-38 Inference Algorithms. Algorithms: HITS / PageRank; label propagation; integer linear programming; belief propagation. Linguistic insights exploited: semantic prosody; semantic parallelism of coordination; distributional similarity; semantic relations.
39-40 ILP: Problem Formulation. For each unlabeled word i, solve for x_i, y_i, z_i in {0, 1}, where x_i -> positive, y_i -> negative, z_i -> neutral, subject to x_i + y_i + z_i = 1. Initialization (hard constraints): positive seed predicates (e.g., achieve) -> x_i = 1; negative seed predicates (e.g., prevent) -> y_i = 1.
41-45 ILP: Objective Function. Maximize [objective expression not captured in the transcription]; the slides illustrate it on the graph over prevent, suffer, enjoy, thank, tax, loss, writing, profit, preventing, bonus, flu, cold.
46 ILP: Soft Constraints. Predicate-Argument edges; Argument-Argument edges.
47 ILP: Hard Constraints (semantic relations). Antonym pairs must not share the same positive or negative polarity; synonym pairs must not take opposite polarities.
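A minimal sketch of the whole ILP (the x/y/z variables, seed initialization, soft edge rewards, and the hard antonym constraint), solved here by exhaustive search over a toy graph rather than an ILP solver; all words, edges, and weights are illustrative:

```python
from itertools import product

words = ["achieve", "prevent", "success", "flu", "profit"]
LABELS = ("pos", "neg", "neu")  # encodes x_i, y_i, z_i with x_i + y_i + z_i = 1

# Hard constraints: seed predicates are fixed.
seeds = {"achieve": "pos", "prevent": "neg"}

# Soft constraints: reward edges whose endpoints agree in (non-neutral) polarity.
pred_arg = [("achieve", "success", 3.0), ("prevent", "flu", 2.0)]
arg_arg = [("success", "profit", 1.0)]
# Hard constraint from semantic relations: antonyms must not share polarity.
antonyms = [("success", "flu")]

def objective(assign):
    score = 0.0
    for p, a, w in pred_arg:
        if assign[p] == assign[a] and assign[p] != "neu":
            score += w
    for a, b, w in arg_arg:
        if assign[a] == assign[b]:
            score += w
    return score

def feasible(assign):
    if any(assign[w] != lab for w, lab in seeds.items()):
        return False
    return all(not (assign[a] == assign[b] and assign[a] != "neu")
               for a, b in antonyms)

best, best_score = None, float("-inf")
for labs in product(LABELS, repeat=len(words)):
    assign = dict(zip(words, labs))
    if feasible(assign) and objective(assign) > best_score:
        best, best_score = assign, objective(assign)

print(best["success"], best["flu"], best["profit"])  # pos neg pos
```

Exhaustive search is only workable for a handful of words; at the 100K-node scale of the real graph, the same formulation would go to an ILP/LP solver.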
49-51 Graph over lemmas and senses. Types of nodes: LEMMA nodes (115K), e.g., prevent, suffer, enjoy, achieve, pain, ache, wound, loss, accident, injure, lose, life, profit, win, gain, success, investment, put on; SENSE (synset) nodes (63K), e.g., {injury, wound}, {lose, decrease}, {win, profits}, {gain, acquire}, {gain, put on}. Types of edges: 1. predicate-argument (179K); 2. argument-argument (144K); 3. argument-synset (126K); 4. synset-synset (34K). Edge sources: prosody (Pred-Arg), coordination, synset memberships, antonyms.
52 Problem Formulation. G_{Lemma+Sense} = (V, E). Nodes (random variables): V = {v_1, v_2, ..., v_n}; some variables are unobserved. Typed edges: E = {e(v_i, v_j, t_k)} where v_i, v_j ∈ V, t_k ∈ T; T = {pred-arg, arg-arg, arg-syn, syn-syn}. Neighborhood function: N_v = {u | e(u, v) ∈ E}. Labels: ϒ = {Y_1, Y_2, ..., Y_n}; y_i ∈ L = {+, -} denotes the label of Y_i.
53 Pairwise Markov Random Fields
54-59 Objective Function

    P(y | x) = (1 / Z(x)) ∏_{Y_i ∈ ϒ} ψ_i(y_i) ∏_{e(v_i, v_j, t) ∈ E} ψ_ij^t(y_i, y_j)

y is an assignment to all the unobserved variables; x denotes the variables with known labels. ψ_i : L -> R_{>=0} is the prior mapping (y_i refers to Y_i's label); ψ_ij^t : L x L -> R_{>=0} is the compatibility mapping for edge type t.
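The factorization above can be checked on a tiny two-variable example; the potentials below are illustrative, and the partition function Z is computed by brute-force enumeration:

```python
from itertools import product

L = ("+", "-")
psi_1 = {"+": 0.9, "-": 0.1}   # prior mapping psi_i : L -> R>=0
psi_2 = {"+": 0.5, "-": 0.5}
psi_12 = {("+", "+"): 2.0, ("-", "-"): 2.0,   # compatibility psi_ij : L x L -> R>=0
          ("+", "-"): 0.5, ("-", "+"): 0.5}

def unnormalized(y1, y2):
    """Product of node priors and the single edge compatibility."""
    return psi_1[y1] * psi_2[y2] * psi_12[(y1, y2)]

# Partition function: sum over all joint assignments.
Z = sum(unnormalized(y1, y2) for y1, y2 in product(L, repeat=2))

def P(y1, y2):
    return unnormalized(y1, y2) / Z

print(P("+", "+"))  # 0.72
```

With an attractive compatibility and a strong "+" prior on the first variable, most of the probability mass lands on the all-"+" assignment, which is the behavior the connotation graph relies on.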
60-61 Loopy Belief Propagation

Message passing:

    m_{i->j}(y_j) = α ∑_{y_i ∈ L} ψ_ij^t(y_i, y_j) ψ_i(y_i) ∏_{Y_k ∈ N_i \ {Y_j}} m_{k->i}(y_i),  y_j ∈ L

Belief:

    b_i(y_i) = β ψ_i(y_i) ∏_{Y_j ∈ N_i} m_{j->i}(y_i),  y_i ∈ L

Algorithm:
1. Initialize the messages between all node pairs connected by an edge.
2. Initialize the priors for all nodes.
3. Iterate message passing until all messages stop changing.
4. Compute the beliefs.
5. Assign each node the label L_i = argmax_{y_i ∈ L} b_i(y_i).
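The message-passing and belief updates above can be sketched on a toy three-node cycle (hence "loopy"); the priors, compatibilities, and fixed iteration count are illustrative, with α and β realized as normalization:

```python
L = ("+", "-")
priors = {1: {"+": 0.9, "-": 0.1}, 2: {"+": 0.5, "-": 0.5}, 3: {"+": 0.6, "-": 0.4}}
edges = [(1, 2), (2, 3), (1, 3)]  # contains a cycle, hence "loopy" BP
compat = {("+", "+"): 2.0, ("-", "-"): 2.0, ("+", "-"): 0.5, ("-", "+"): 0.5}

neighbors = {i: [j for (a, b) in edges if i in (a, b) for j in (a, b) if j != i]
             for i in priors}
# Directed messages m[(i, j)][y_j], initialized uniform.
msgs = {(i, j): {y: 1.0 for y in L} for i in priors for j in neighbors[i]}

def normalize(d):
    s = sum(d.values())
    return {k: v / s for k, v in d.items()}

for _ in range(50):  # iterate until (approximate) convergence
    new = {}
    for (i, j) in msgs:
        m = {}
        for yj in L:
            total = 0.0
            for yi in L:
                incoming = 1.0
                for k in neighbors[i]:
                    if k != j:                      # product over N_i \ {j}
                        incoming *= msgs[(k, i)][yi]
                total += compat[(yi, yj)] * priors[i][yi] * incoming
            m[yj] = total
        new[(i, j)] = normalize(m)                  # alpha as normalization
    msgs = new                                      # synchronous update

def belief(i):
    b = {y: priors[i][y] for y in L}
    for j in neighbors[i]:
        for y in L:
            b[y] *= msgs[(j, i)][y]
    return normalize(b)                             # beta as normalization

labels = {i: max(L, key=belief(i).get) for i in priors}
print(labels)  # {1: '+', 2: '+', 3: '+'}
```

Since all three priors lean toward "+" and the compatibility is attractive, the beliefs reinforce each other and every node is labeled "+"; flipping a prior strongly toward "-" would pull its neighbors accordingly.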
Lecture 8: Inference Kai-Wei Chang CS @ University of Virginia kw@kwchang.net Some slides are adapted from Vivek Skirmar s course on Structured Prediction Advanced ML: Inference 1 So far what we learned
More information17 Variational Inference
Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.438 Algorithms for Inference Fall 2014 17 Variational Inference Prompted by loopy graphs for which exact
More informationProbabilistic Graphical Models: MRFs and CRFs. CSE628: Natural Language Processing Guest Lecturer: Veselin Stoyanov
Probabilistic Graphical Models: MRFs and CRFs CSE628: Natural Language Processing Guest Lecturer: Veselin Stoyanov Why PGMs? PGMs can model joint probabilities of many events. many techniques commonly
More informationStructured Prediction
Structured Prediction Classification Algorithms Classify objects x X into labels y Y First there was binary: Y = {0, 1} Then multiclass: Y = {1,...,6} The next generation: Structured Labels Structured
More informationReview: Directed Models (Bayes Nets)
X Review: Directed Models (Bayes Nets) Lecture 3: Undirected Graphical Models Sam Roweis January 2, 24 Semantics: x y z if z d-separates x and y d-separation: z d-separates x from y if along every undirected
More informationLearning with Structured Inputs and Outputs
Learning with Structured Inputs and Outputs Christoph H. Lampert IST Austria (Institute of Science and Technology Austria), Vienna Microsoft Machine Learning and Intelligence School 2015 July 29-August
More informationLecture 6: Graphical Models: Learning
Lecture 6: Graphical Models: Learning 4F13: Machine Learning Zoubin Ghahramani and Carl Edward Rasmussen Department of Engineering, University of Cambridge February 3rd, 2010 Ghahramani & Rasmussen (CUED)
More informationApproximate Message Passing
Approximate Message Passing Mohammad Emtiyaz Khan CS, UBC February 8, 2012 Abstract In this note, I summarize Sections 5.1 and 5.2 of Arian Maleki s PhD thesis. 1 Notation We denote scalars by small letters
More informationStat 521A Lecture 18 1
Stat 521A Lecture 18 1 Outline Cts and discrete variables (14.1) Gaussian networks (14.2) Conditional Gaussian networks (14.3) Non-linear Gaussian networks (14.4) Sampling (14.5) 2 Hybrid networks A hybrid
More informationRecap: Inference in Probabilistic Graphical Models. R. Möller Institute of Information Systems University of Luebeck
Recap: Inference in Probabilistic Graphical Models R. Möller Institute of Information Systems University of Luebeck A Simple Example P(A,B,C) = P(A)P(B,C A) = P(A) P(B A) P(C B,A) = P(A) P(B A) P(C B)
More informationDiscriminative Learning of Sum-Product Networks. Robert Gens Pedro Domingos
Discriminative Learning of Sum-Product Networks Robert Gens Pedro Domingos X1 X1 X1 X1 X2 X2 X2 X2 X3 X3 X3 X3 X4 X4 X4 X4 X5 X5 X5 X5 X6 X6 X6 X6 Distributions X 1 X 1 X 1 X 1 X 2 X 2 X 2 X 2 X 3 X 3
More informationLatent Dirichlet Allocation
Outlines Advanced Artificial Intelligence October 1, 2009 Outlines Part I: Theoretical Background Part II: Application and Results 1 Motive Previous Research Exchangeability 2 Notation and Terminology
More informationShannon s Noisy-Channel Coding Theorem
Shannon s Noisy-Channel Coding Theorem Lucas Slot Sebastian Zur February 13, 2015 Lucas Slot, Sebastian Zur Shannon s Noisy-Channel Coding Theorem February 13, 2015 1 / 29 Outline 1 Definitions and Terminology
More informationDirected Probabilistic Graphical Models CMSC 678 UMBC
Directed Probabilistic Graphical Models CMSC 678 UMBC Announcement 1: Assignment 3 Due Wednesday April 11 th, 11:59 AM Any questions? Announcement 2: Progress Report on Project Due Monday April 16 th,
More informationSpatial Representation
Spatial Representation Visual Culture LCC6311 Tony F. Lockhart 12/9/2009 Page 2 of 5 Table of Contents Introduction... 3 Grid Layout... 3 Public School Funding Visualization... 3 Inspiration... 4 Works
More informationOur Status in CSE 5522
Our Status in CSE 5522 We re done with Part I Search and Planning! Part II: Probabilistic Reasoning Diagnosis Speech recognition Tracking objects Robot mapping Genetics Error correcting codes lots more!
More informationStudy Notes on the Latent Dirichlet Allocation
Study Notes on the Latent Dirichlet Allocation Xugang Ye 1. Model Framework A word is an element of dictionary {1,,}. A document is represented by a sequence of words: =(,, ), {1,,}. A corpus is a collection
More informationMachine Learning Summer School
Machine Learning Summer School Lecture 3: Learning parameters and structure Zoubin Ghahramani zoubin@eng.cam.ac.uk http://learning.eng.cam.ac.uk/zoubin/ Department of Engineering University of Cambridge,
More informationProbabilistic Graphical Models & Applications
Probabilistic Graphical Models & Applications Learning of Graphical Models Bjoern Andres and Bernt Schiele Max Planck Institute for Informatics The slides of today s lecture are authored by and shown with
More information