Probabilistic Information Retrieval


Probabilistic Information Retrieval
Sumit Bhatia
July 16, 2009

Overview
1. Information Retrieval: IR Models, Probability Basics
2. Document Ranking Problem: Probability Ranking Principle

The Information Retrieval (IR) Process
1. The user has an information need.
2. The information need is expressed as a query, using some query representation.
3. Documents are stored using some document representation.
4. The IR system matches the two representations to determine the documents that satisfy the user's information need.

Boolean Retrieval Model
- Query = Boolean expression of terms, e.g., "Mitra AND Giles".
- Documents = term-document incidence matrix, with A_ij = 1 iff the i-th term is present in the j-th document.
- Bag-of-words representation.
- No ranking.
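The Boolean model above can be sketched in a few lines; this is a toy illustration (the documents and terms are made up, not from the slides) of the incidence matrix A_ij and an AND query as set intersection:

```python
# Toy Boolean retrieval: build a term-document incidence matrix and
# evaluate the query "mitra AND giles". Documents are hypothetical.
docs = {
    "d1": "mitra and giles wrote the paper",
    "d2": "giles reviewed the paper",
    "d3": "mitra gave a talk",
}

# incidence[term][doc] = 1 iff the term occurs in the doc (bag of words).
incidence = {}
for doc_id, text in docs.items():
    for term in set(text.split()):
        incidence.setdefault(term, {})[doc_id] = 1

def docs_containing(term):
    """Set of documents whose incidence entry for this term is 1."""
    return {d for d, present in incidence.get(term, {}).items() if present}

# Boolean AND = set intersection of the two terms' document sets.
result = docs_containing("mitra") & docs_containing("giles")
print(sorted(result))  # ['d1']
```

Note that the result is an unordered set: the model tells us which documents match, but gives no ranking among them.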

Vector Space Model
- Query = free-text query, e.g., "Mitra Giles".
- Queries and documents are vectors in term space.
- Cosine similarity between the query and document vectors indicates their similarity.
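Cosine ranking as described above can be sketched as follows; the term space and frequency vectors are made-up toy values, not from the slides:

```python
import math

def cosine(u, v):
    # cos(u, v) = (u . v) / (|u| |v|); 0.0 if either vector is all zeros.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical term-frequency vectors over the terms ("mitra", "giles", "paper").
q = [1, 1, 0]
vectors = {
    "d1": [2, 1, 1],  # contains both query terms
    "d2": [0, 0, 3],  # contains neither
}
ranked = sorted(vectors, key=lambda d: -cosine(q, vectors[d]))
print(ranked)  # ['d1', 'd2']
```

Unlike the Boolean model, this produces a ranking: documents closer in direction to the query vector come first.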

The Information Retrieval (IR) Process, Revisited
1. The user has an information need.
2. The information need is expressed as a query, using some query representation.
3. Documents are stored using some document representation.
4. The IR system matches the two representations to determine the documents that satisfy the user's information need.
Problem: both the query and the document representations are uncertain.

Probability Basics
- Chain rule: P(A, B) = P(A ∩ B) = P(A|B) P(B) = P(B|A) P(A)
- Partition rule: P(B) = P(A, B) + P(Ā, B)
- Bayes' rule: P(A|B) = P(B|A) P(A) / P(B) = [ P(B|A) / Σ_{X ∈ {A, Ā}} P(B|X) P(X) ] · P(A)
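The partition and Bayes rules above can be checked numerically; a minimal sketch with made-up probabilities:

```python
# Made-up numbers for a numeric sanity check of the partition and Bayes rules.
p_a = 0.3              # P(A)
p_b_given_a = 0.8      # P(B|A)
p_b_given_not_a = 0.2  # P(B|Ā)

# Partition rule (with the chain rule applied to each term):
# P(B) = P(A, B) + P(Ā, B) = P(B|A)P(A) + P(B|Ā)P(Ā)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b

print(round(p_b, 3), round(p_a_given_b, 3))  # 0.38 0.632
```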

Document Ranking Problem
Problem statement: given a set of documents D = {d_1, d_2, ..., d_n} and a query q, in what order should the subset of relevant documents D_r = {d_r1, d_r2, ..., d_rm} be returned to the user?
Hint: we want the best document at rank 1, the second best at rank 2, and so on.
Solution: rank by the probability that a document is relevant to the information need (query), i.e., by P(R = 1|d, q).

Probability Ranking Principle
Probability Ranking Principle (van Rijsbergen, 1979): "If a reference retrieval system's response to each request is a ranking of the documents in the collection in order of decreasing probability of relevance to the user who submitted the request, where the probabilities are estimated as accurately as possible on the basis of whatever data have been made available to the system for this purpose, the overall effectiveness of the system to its user will be the best that is obtainable on the basis of those data."
Observation 1: the PRP maximizes the mean probability of relevance at rank k.

Probability Ranking Principle (continued)
Case 1: 1/0 loss, i.e., no differential selection/retrieval costs.
- Bayes optimal decision rule: d is relevant iff P(R = 1|d, q) > P(R = 0|d, q).
- Theorem 1: the PRP is optimal, in the sense that it minimizes the expected loss (Bayes risk) under 1/0 loss.
Case 2: PRP with differential retrieval costs. With C_1 the cost of retrieving a relevant document and C_0 the cost of retrieving a non-relevant one, rank d ahead of d' iff
C_1 · P(R = 1|d, q) + C_0 · P(R = 0|d, q) ≤ C_1 · P(R = 1|d', q) + C_0 · P(R = 0|d', q)
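The 1/0-loss case can be made concrete with a tiny numeric check (the probability is made up): retrieving a non-relevant document costs 1, as does failing to retrieve a relevant one, so the Bayes optimal rule compares the two expected losses:

```python
# Hypothetical P(R=1|d,q) for one document; 1/0 loss on each decision.
p_rel = 0.7

# Expected loss of each action under 1/0 loss:
expected_loss_retrieve = 1 - p_rel  # pay 1 if d turns out non-relevant
expected_loss_skip = p_rel          # pay 1 if d was in fact relevant

# Bayes optimal decision rule: pick the action with smaller expected loss,
# which is "retrieve" exactly when P(R=1|d,q) > P(R=0|d,q) = 1 - P(R=1|d,q).
decision = "retrieve" if expected_loss_retrieve < expected_loss_skip else "skip"
print(decision)  # retrieve, since 0.7 > 0.5
```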

Binary Independence Model (BIM)
Assumptions:
1. Binary: documents are represented as binary incidence vectors of terms, d = (d_1, d_2, ..., d_n), where d_t = 1 iff term t is present in d, else 0.
2. Independence: terms occur in documents independently of one another.
3. The relevance of a document is independent of the relevance of other documents. (This is the assumption behind the PRP in general.)
Implications:
1. Many documents have the same representation.
2. No association between terms is considered.

Binary Independence Model (BIM)
We wish to compute P(R|d, q). We do it in terms of the term incidence vectors d⃗ and q⃗; we thus compute P(R|d⃗, q⃗).
Using Bayes' rule, we have:
P(R = 1|d⃗, q⃗) = P(d⃗|R = 1, q⃗) P(R = 1|q⃗) / P(d⃗|q⃗)   (1)
P(R = 0|d⃗, q⃗) = P(d⃗|R = 0, q⃗) P(R = 0|q⃗) / P(d⃗|q⃗)   (2)
Here P(R = 1|q⃗) and P(R = 0|q⃗) are the prior relevance probabilities.

Binary Independence Model
Computing the odds ratio, we get:
O(R|d⃗, q⃗) = [P(R = 1|q⃗) / P(R = 0|q⃗)] · [P(d⃗|R = 1, q⃗) / P(d⃗|R = 0, q⃗)]   (3)
The first factor is document independent: it is constant for a given query. What about the second factor? We apply the Naive Bayes assumption (terms occur independently):
O(R|d⃗, q⃗) ∝ ∏_{t=1}^{M} P(d_t|R = 1, q⃗) / P(d_t|R = 0, q⃗)   (4)

Binary Independence Model
Observation 1: a term is either present in a document or not.
O(R|d⃗, q⃗) ∝ ∏_{t: d_t = 1} [P(d_t = 1|R = 1, q⃗) / P(d_t = 1|R = 0, q⃗)] · ∏_{t: d_t = 0} [P(d_t = 0|R = 1, q⃗) / P(d_t = 0|R = 0, q⃗)]   (5)
Notation (probability of a term appearing in a document):
                         R = 1     R = 0
Term present (d_t = 1)   p_t       u_t
Term absent  (d_t = 0)   1 − p_t   1 − u_t

Binary Independence Model
Assumption: a term not in the query is equally likely to occur in relevant and non-relevant documents.
O(R|d⃗, q⃗) ∝ ∏_{t: d_t = q_t = 1} (p_t / u_t) · ∏_{t: d_t = 0, q_t = 1} (1 − p_t) / (1 − u_t)   (6)
Manipulating (extending the second product to all query terms and compensating in the first):
O(R|d⃗, q⃗) ∝ ∏_{t: d_t = q_t = 1} [p_t (1 − u_t)] / [u_t (1 − p_t)] · ∏_{t: q_t = 1} (1 − p_t) / (1 − u_t)   (7)
The second product runs over all query terms, so it is constant for a given query; only the first product matters for ranking.

Binary Independence Model
The Retrieval Status Value (RSV) of document d is the log of the query-dependent part of (7):
RSV_d = log ∏_{t: d_t = q_t = 1} [p_t (1 − u_t)] / [u_t (1 − p_t)]   (8)
      = Σ_{t: d_t = q_t = 1} log [p_t (1 − u_t)] / [u_t (1 − p_t)]   (9)
Contingency table of document counts for a term:
          R = 1    R = 0              Total
d_t = 1   s        n − s              n
d_t = 0   S − s    (N − n) − (S − s)  N − n
Total     S        N − S              N
Estimating p_t = s/S and u_t = (n − s)/(N − S) and substituting, we get:
RSV_d = Σ_{t: d_t = q_t = 1} log [ (s / (S − s)) / ((n − s) / (N − n − S + s)) ]   (10)
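The equivalence of (9) and (10) under the estimates p_t = s/S and u_t = (n − s)/(N − S) can be checked numerically; the counts below are made up:

```python
import math

# Hypothetical counts for one query term, following the contingency table:
N, S = 1000, 20  # collection size, number of relevant documents
n, s = 50, 10    # docs containing the term, relevant docs containing it

# Estimates from the table: p_t = s/S, u_t = (n - s)/(N - S).
p_t = s / S
u_t = (n - s) / (N - S)

# One term of the RSV sum, form (9): log of the odds ratio.
rsv_term = math.log((p_t * (1 - u_t)) / (u_t * (1 - p_t)))

# The same term straight from the counts, form (10).
rsv_from_counts = math.log((s / (S - s)) / ((n - s) / (N - n - S + s)))

print(round(rsv_term, 6) == round(rsv_from_counts, 6))  # True
```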

Observations
- Probabilities for non-relevant documents can be approximated by collection statistics: with u_t ≈ n/N,
  log (1 − u_t)/u_t = log (N − n)/n ≈ log N/n = IDF!
- It is not so simple for relevant documents:
  - Estimating from known relevant documents (relevance judgments are not always available).
  - Assuming p_t = constant is equivalent to IDF weighting only.
- The difficulty of probability estimation and the drastic assumptions make good performance hard to achieve.

Weighting Scheme
- BIM considers neither term frequencies nor document length.
- The BM25 weighting scheme (Okapi weighting), developed by Robertson et al., builds a probabilistic model sensitive to these quantities.
- BM25 is widely used today and has shown good performance in a number of practical systems.

Weighting Scheme
RSV_d = Σ_{t ∈ q} [ log(N/df_t) · ((k_1 + 1) tf_td) / (k_1 ((1 − b) + b (l_d / l_av)) + tf_td) · ((k_3 + 1) tf_tq) / (k_3 + tf_tq) ]
where:
- N is the total number of documents,
- df_t is the document frequency, i.e., the number of documents that contain term t,
- tf_td is the frequency of term t in document d,
- tf_tq is the frequency of term t in query q,
- l_d is the length of document d,
- l_av is the average length of documents,
- k_1, k_3 and b are constants, generally set to 2, 2 and 0.75 respectively.
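The BM25 formula above translates directly into code; this is a minimal sketch (all counts and lengths are hypothetical), with each factor named after its role in the formula:

```python
import math

def bm25_score(query_tf, doc_tf, df, N, l_d, l_av, k1=2.0, k3=2.0, b=0.75):
    """BM25 RSV of one document, following the formula above.

    query_tf: dict term -> tf in the query
    doc_tf:   dict term -> tf in the document
    df:       dict term -> document frequency in the collection
    """
    score = 0.0
    for t, tf_tq in query_tf.items():
        tf_td = doc_tf.get(t, 0)
        if tf_td == 0 or df.get(t, 0) == 0:
            continue  # term contributes nothing if absent from doc/collection
        idf = math.log(N / df[t])
        # Document factor: tf saturation plus length normalization via b.
        doc_part = ((k1 + 1) * tf_td) / (k1 * ((1 - b) + b * (l_d / l_av)) + tf_td)
        # Query factor: tf saturation on the query side via k3.
        query_part = ((k3 + 1) * tf_tq) / (k3 + tf_tq)
        score += idf * doc_part * query_part
    return score

# Toy usage: 1000-document collection, query "mitra giles".
score = bm25_score({"mitra": 1, "giles": 1},
                   {"mitra": 3, "giles": 1, "paper": 2},
                   {"mitra": 10, "giles": 40}, N=1000, l_d=6, l_av=8)
print(score > 0)  # True
```

Note how b interpolates between no length normalization (b = 0) and full normalization by l_d/l_av (b = 1), which is exactly what BIM lacked.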

What Next?
- Similarity between terms and documents: is this sufficient?
- "JAVA": coffee, a computer language, or a place?
- What about the time and location of the user?
- Different users might want different documents for the same query.

What Next?
- Maximum Marginal Relevance [CG98]: rank documents so as to minimize the similarity between returned documents.
- Result Diversification [Wan09]: rank documents so as to maximize mean relevance, given a variance level. Variance here determines the risk the user is willing to take.

References
[CG98] Jaime Carbonell and Jade Goldstein, "The use of MMR, diversity-based reranking for reordering documents and producing summaries," SIGIR, 1998.
Christopher D. Manning, Prabhakar Raghavan, and Hinrich Schütze, Introduction to Information Retrieval, Cambridge University Press, 2008.
[Wan09] Jun Wang, "Mean-variance analysis: A new document ranking theory in information retrieval," Advances in Information Retrieval, 2009.

QUESTIONS???


More information

Document indexing, similarities and retrieval in large scale text collections

Document indexing, similarities and retrieval in large scale text collections Document indexing, similarities and retrieval in large scale text collections Eric Gaussier Univ. Grenoble Alpes - LIG Eric.Gaussier@imag.fr Eric Gaussier Document indexing, similarities & retrieval 1

More information

Information Retrieval

Information Retrieval Introduction to Information Retrieval CS276: Information Retrieval and Web Search Christopher Manning, Pandu Nayak, and Prabhakar Raghavan Lecture 14: Learning to Rank Sec. 15.4 Machine learning for IR

More information

Boolean and Vector Space Retrieval Models CS 290N Some of slides from R. Mooney (UTexas), J. Ghosh (UT ECE), D. Lee (USTHK).

Boolean and Vector Space Retrieval Models CS 290N Some of slides from R. Mooney (UTexas), J. Ghosh (UT ECE), D. Lee (USTHK). Boolean and Vector Space Retrieval Models 2013 CS 290N Some of slides from R. Mooney (UTexas), J. Ghosh (UT ECE), D. Lee (USTHK). 1 Table of Content Boolean model Statistical vector space model Retrieval

More information

Term Weighting and Vector Space Model. Reference: Introduction to Information Retrieval by C. Manning, P. Raghavan, H. Schutze

Term Weighting and Vector Space Model. Reference: Introduction to Information Retrieval by C. Manning, P. Raghavan, H. Schutze Term Weighting and Vector Space Model Reference: Introduction to Information Retrieval by C. Manning, P. Raghavan, H. Schutze 1 Ranked retrieval Thus far, our queries have all been Boolean. Documents either

More information

A Risk Minimization Framework for Information Retrieval

A Risk Minimization Framework for Information Retrieval A Risk Minimization Framework for Information Retrieval ChengXiang Zhai a John Lafferty b a Department of Computer Science University of Illinois at Urbana-Champaign b School of Computer Science Carnegie

More information

Ranked IR. Lecture Objectives. Text Technologies for Data Science INFR Learn about Ranked IR. Implement: 10/10/2017. Instructor: Walid Magdy

Ranked IR. Lecture Objectives. Text Technologies for Data Science INFR Learn about Ranked IR. Implement: 10/10/2017. Instructor: Walid Magdy Text Technologies for Data Science INFR11145 Ranked IR Instructor: Walid Magdy 10-Oct-017 Lecture Objectives Learn about Ranked IR TFIDF VSM SMART notation Implement: TFIDF 1 Boolean Retrieval Thus far,

More information

CS4705. Probability Review and Naïve Bayes. Slides from Dragomir Radev

CS4705. Probability Review and Naïve Bayes. Slides from Dragomir Radev CS4705 Probability Review and Naïve Bayes Slides from Dragomir Radev Classification using a Generative Approach Previously on NLP discriminative models P C D here is a line with all the social media posts

More information

Outline for today. Information Retrieval. Cosine similarity between query and document. tf-idf weighting

Outline for today. Information Retrieval. Cosine similarity between query and document. tf-idf weighting Outline for today Information Retrieval Efficient Scoring and Ranking Recap on ranked retrieval Jörg Tiedemann jorg.tiedemann@lingfil.uu.se Department of Linguistics and Philology Uppsala University Efficient

More information

How Latent Semantic Indexing Solves the Pachyderm Problem

How Latent Semantic Indexing Solves the Pachyderm Problem How Latent Semantic Indexing Solves the Pachyderm Problem Michael A. Covington Institute for Artificial Intelligence The University of Georgia 2011 1 Introduction Here I present a brief mathematical demonstration

More information

Effectiveness of complex index terms in information retrieval

Effectiveness of complex index terms in information retrieval Effectiveness of complex index terms in information retrieval Tokunaga Takenobu, Ogibayasi Hironori and Tanaka Hozumi Department of Computer Science Tokyo Institute of Technology Abstract This paper explores

More information

6 Probabilistic Retrieval Models

6 Probabilistic Retrieval Models Probabilistic Retrieval Models 1 6 Probabilistic Retrieval Models Notations Binary Independence Retrieval model Probability Ranking Principle Probabilistic Retrieval Models 2 6.1 Notations Q Q Q D rel.

More information

Towards modeling implicit feedback with quantum entanglement

Towards modeling implicit feedback with quantum entanglement Towards modeling implicit feedback with quantum entanglement Massimo Melucci Talk by Emanuele Di Buccio Department of Information Engineering University of Padua Quantum Interaction Oxford, UK, 26th 28th

More information

Advanced Topics in Information Retrieval 5. Diversity & Novelty

Advanced Topics in Information Retrieval 5. Diversity & Novelty Advanced Topics in Information Retrieval 5. Diversity & Novelty Vinay Setty (vsetty@mpi-inf.mpg.de) Jannik Strötgen (jtroetge@mpi-inf.mpg.de) 1 Outline 5.1. Why Novelty & Diversity? 5.2. Probability Ranking

More information

3. Basics of Information Retrieval

3. Basics of Information Retrieval Text Analysis and Retrieval 3. Basics of Information Retrieval Prof. Bojana Dalbelo Bašić Assoc. Prof. Jan Šnajder With contributions from dr. sc. Goran Glavaš Mladen Karan, mag. ing. University of Zagreb

More information

Query Propagation in Possibilistic Information Retrieval Networks

Query Propagation in Possibilistic Information Retrieval Networks Query Propagation in Possibilistic Information Retrieval Networks Asma H. Brini Université Paul Sabatier brini@irit.fr Luis M. de Campos Universidad de Granada lci@decsai.ugr.es Didier Dubois Université

More information

ChengXiang ( Cheng ) Zhai Department of Computer Science University of Illinois at Urbana-Champaign

ChengXiang ( Cheng ) Zhai Department of Computer Science University of Illinois at Urbana-Champaign Axiomatic Analysis and Optimization of Information Retrieval Models ChengXiang ( Cheng ) Zhai Department of Computer Science University of Illinois at Urbana-Champaign http://www.cs.uiuc.edu/homes/czhai

More information

11. Learning To Rank. Most slides were adapted from Stanford CS 276 course.

11. Learning To Rank. Most slides were adapted from Stanford CS 276 course. 11. Learning To Rank Most slides were adapted from Stanford CS 276 course. 1 Sec. 15.4 Machine learning for IR ranking? We ve looked at methods for ranking documents in IR Cosine similarity, inverse document

More information

Improving Diversity in Ranking using Absorbing Random Walks

Improving Diversity in Ranking using Absorbing Random Walks Improving Diversity in Ranking using Absorbing Random Walks Andrew B. Goldberg with Xiaojin Zhu, Jurgen Van Gael, and David Andrzejewski Department of Computer Sciences, University of Wisconsin, Madison

More information

Vector Space Scoring Introduction to Information Retrieval INF 141 Donald J. Patterson

Vector Space Scoring Introduction to Information Retrieval INF 141 Donald J. Patterson Vector Space Scoring Introduction to Information Retrieval INF 141 Donald J. Patterson Content adapted from Hinrich Schütze http://www.informationretrieval.org Collection Frequency, cf Define: The total

More information

Variable Latent Semantic Indexing

Variable Latent Semantic Indexing Variable Latent Semantic Indexing Prabhakar Raghavan Yahoo! Research Sunnyvale, CA November 2005 Joint work with A. Dasgupta, R. Kumar, A. Tomkins. Yahoo! Research. Outline 1 Introduction 2 Background

More information

Probability Theory for Machine Learning. Chris Cremer September 2015

Probability Theory for Machine Learning. Chris Cremer September 2015 Probability Theory for Machine Learning Chris Cremer September 2015 Outline Motivation Probability Definitions and Rules Probability Distributions MLE for Gaussian Parameter Estimation MLE and Least Squares

More information

Name: Matriculation Number: Tutorial Group: A B C D E

Name: Matriculation Number: Tutorial Group: A B C D E Name: Matriculation Number: Tutorial Group: A B C D E Question: 1 (5 Points) 2 (6 Points) 3 (5 Points) 4 (5 Points) Total (21 points) Score: General instructions: The written test contains 4 questions

More information

Information Retrieval and Web Search Engines

Information Retrieval and Web Search Engines Information Retrieval and Web Search Engines Lecture 4: Probabilistic Retrieval Models April 29, 2010 Wolf-Tilo Balke and Joachim Selke Institut für Informationssysteme Technische Universität Braunschweig

More information

Information Retrieval

Information Retrieval Introduction to Information CS276: Information and Web Search Christopher Manning and Pandu Nayak Lecture 15: Learning to Rank Sec. 15.4 Machine learning for IR ranking? We ve looked at methods for ranking

More information

Basic Probability and Decisions

Basic Probability and Decisions Basic Probability and Decisions Chris Amato Northeastern University Some images and slides are used from: Rob Platt, CS188 UC Berkeley, AIMA Uncertainty Let action A t = leave for airport t minutes before

More information

Text mining and natural language analysis. Jefrey Lijffijt

Text mining and natural language analysis. Jefrey Lijffijt Text mining and natural language analysis Jefrey Lijffijt PART I: Introduction to Text Mining Why text mining The amount of text published on paper, on the web, and even within companies is inconceivably

More information

Data Mining and Matrices

Data Mining and Matrices Data Mining and Matrices 10 Graphs II Rainer Gemulla, Pauli Miettinen Jul 4, 2013 Link analysis The web as a directed graph Set of web pages with associated textual content Hyperlinks between webpages

More information

Lecture 3: Probabilistic Retrieval Models

Lecture 3: Probabilistic Retrieval Models Probabilistic Retrieval Models Information Retrieval and Web Search Engines Lecture 3: Probabilistic Retrieval Models November 5 th, 2013 Wolf-Tilo Balke and Kinda El Maarry Institut für Informationssysteme

More information

Comparing Relevance Feedback Techniques on German News Articles

Comparing Relevance Feedback Techniques on German News Articles B. Mitschang et al. (Hrsg.): BTW 2017 Workshopband, Lecture Notes in Informatics (LNI), Gesellschaft für Informatik, Bonn 2017 301 Comparing Relevance Feedback Techniques on German News Articles Julia

More information

A Study of the Dirichlet Priors for Term Frequency Normalisation

A Study of the Dirichlet Priors for Term Frequency Normalisation A Study of the Dirichlet Priors for Term Frequency Normalisation ABSTRACT Ben He Department of Computing Science University of Glasgow Glasgow, United Kingdom ben@dcs.gla.ac.uk In Information Retrieval

More information

Non-Boolean models of retrieval: Agenda

Non-Boolean models of retrieval: Agenda Non-Boolean models of retrieval: Agenda Review of Boolean model and TF/IDF Simple extensions thereof Vector model Language Model-based retrieval Matrix decomposition methods Non-Boolean models of retrieval:

More information

Introduction to Bayes Nets. CS 486/686: Introduction to Artificial Intelligence Fall 2013

Introduction to Bayes Nets. CS 486/686: Introduction to Artificial Intelligence Fall 2013 Introduction to Bayes Nets CS 486/686: Introduction to Artificial Intelligence Fall 2013 1 Introduction Review probabilistic inference, independence and conditional independence Bayesian Networks - - What

More information

vector space retrieval many slides courtesy James Amherst

vector space retrieval many slides courtesy James Amherst vector space retrieval many slides courtesy James Allan@umass Amherst 1 what is a retrieval model? Model is an idealization or abstraction of an actual process Mathematical models are used to study the

More information

CS276A Text Information Retrieval, Mining, and Exploitation. Lecture 4 15 Oct 2002

CS276A Text Information Retrieval, Mining, and Exploitation. Lecture 4 15 Oct 2002 CS276A Text Information Retrieval, Mining, and Exploitation Lecture 4 15 Oct 2002 Recap of last time Index size Index construction techniques Dynamic indices Real world considerations 2 Back of the envelope

More information

MATRIX DECOMPOSITION AND LATENT SEMANTIC INDEXING (LSI) Introduction to Information Retrieval CS 150 Donald J. Patterson

MATRIX DECOMPOSITION AND LATENT SEMANTIC INDEXING (LSI) Introduction to Information Retrieval CS 150 Donald J. Patterson MATRIX DECOMPOSITION AND LATENT SEMANTIC INDEXING (LSI) Introduction to Information Retrieval CS 150 Donald J. Patterson Content adapted from Hinrich Schütze http://www.informationretrieval.org Latent

More information

INFO 630 / CS 674 Lecture Notes

INFO 630 / CS 674 Lecture Notes INFO 630 / CS 674 Lecture Notes The Language Modeling Approach to Information Retrieval Lecturer: Lillian Lee Lecture 9: September 25, 2007 Scribes: Vladimir Barash, Stephen Purpura, Shaomei Wu Introduction

More information

Machine Learning Algorithm. Heejun Kim

Machine Learning Algorithm. Heejun Kim Machine Learning Algorithm Heejun Kim June 12, 2018 Machine Learning Algorithms Machine Learning algorithm: a procedure in developing computer programs that improve their performance with experience. Types

More information

{ p if x = 1 1 p if x = 0

{ p if x = 1 1 p if x = 0 Discrete random variables Probability mass function Given a discrete random variable X taking values in X = {v 1,..., v m }, its probability mass function P : X [0, 1] is defined as: P (v i ) = Pr[X =

More information

Computational Genomics

Computational Genomics Computational Genomics http://www.cs.cmu.edu/~02710 Introduction to probability, statistics and algorithms (brief) intro to probability Basic notations Random variable - referring to an element / event

More information

SYDE 372 Introduction to Pattern Recognition. Probability Measures for Classification: Part I

SYDE 372 Introduction to Pattern Recognition. Probability Measures for Classification: Part I SYDE 372 Introduction to Pattern Recognition Probability Measures for Classification: Part I Alexander Wong Department of Systems Design Engineering University of Waterloo Outline 1 2 3 4 Why use probability

More information

Undirected Graphical Models

Undirected Graphical Models Outline Hong Chang Institute of Computing Technology, Chinese Academy of Sciences Machine Learning Methods (Fall 2012) Outline Outline I 1 Introduction 2 Properties Properties 3 Generative vs. Conditional

More information

Dept. of Linguistics, Indiana University Fall 2015

Dept. of Linguistics, Indiana University Fall 2015 L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 34 To start out the course, we need to know something about statistics and This is only an introduction; for a fuller understanding, you would

More information

Probabilistic Graphical Models (I)

Probabilistic Graphical Models (I) Probabilistic Graphical Models (I) Hongxin Zhang zhx@cad.zju.edu.cn State Key Lab of CAD&CG, ZJU 2015-03-31 Probabilistic Graphical Models Modeling many real-world problems => a large number of random

More information

Molecular Similarity Searching Using Inference Network

Molecular Similarity Searching Using Inference Network Molecular Similarity Searching Using Inference Network Ammar Abdo, Naomie Salim* Faculty of Computer Science & Information Systems Universiti Teknologi Malaysia Molecular Similarity Searching Search for

More information

INFO 4300 / CS4300 Information Retrieval. slides adapted from Hinrich Schütze s, linked from

INFO 4300 / CS4300 Information Retrieval. slides adapted from Hinrich Schütze s, linked from INFO 4300 / CS4300 Information Retrieval slides adapted from Hinrich Schütze s, linked from http://informationretrieval.org/ IR 9: Collaborative Filtering, SVD, and Linear Algebra Review Paul Ginsparg

More information

Modeling the Score Distributions of Relevant and Non-relevant Documents

Modeling the Score Distributions of Relevant and Non-relevant Documents Modeling the Score Distributions of Relevant and Non-relevant Documents Evangelos Kanoulas, Virgil Pavlu, Keshi Dai, and Javed A. Aslam College of Computer and Information Science Northeastern University,

More information

Information Retrieval and Topic Models. Mausam (Based on slides of W. Arms, Dan Jurafsky, Thomas Hofmann, Ata Kaban, Chris Manning, Melanie Martin)

Information Retrieval and Topic Models. Mausam (Based on slides of W. Arms, Dan Jurafsky, Thomas Hofmann, Ata Kaban, Chris Manning, Melanie Martin) Information Retrieval and Topic Models Mausam (Based on slides of W. Arms, Dan Jurafsky, Thomas Hofmann, Ata Kaban, Chris Manning, Melanie Martin) Sec. 1.1 Unstructured data in 1620 Which plays of Shakespeare

More information

David Giles Bayesian Econometrics

David Giles Bayesian Econometrics David Giles Bayesian Econometrics 1. General Background 2. Constructing Prior Distributions 3. Properties of Bayes Estimators and Tests 4. Bayesian Analysis of the Multiple Regression Model 5. Bayesian

More information

16 The Information Retrieval "Data Model"

16 The Information Retrieval Data Model 16 The Information Retrieval "Data Model" 16.1 The general model Not presented in 16.2 Similarity the course! 16.3 Boolean Model Not relevant for exam. 16.4 Vector space Model 16.5 Implementation issues

More information

A Latent Variable Graphical Model Derivation of Diversity for Set-based Retrieval

A Latent Variable Graphical Model Derivation of Diversity for Set-based Retrieval A Latent Variable Graphical Model Derivation of Diversity for Set-based Retrieval Keywords: Graphical Models::Directed Models, Foundations::Other Foundations Abstract Diversity has been heavily motivated

More information

Behavioral Data Mining. Lecture 2

Behavioral Data Mining. Lecture 2 Behavioral Data Mining Lecture 2 Autonomy Corp Bayes Theorem Bayes Theorem P(A B) = probability of A given that B is true. P(A B) = P(B A)P(A) P(B) In practice we are most interested in dealing with events

More information

Ranked IR. Lecture Objectives. Text Technologies for Data Science INFR Learn about Ranked IR. Implement: 10/10/2018. Instructor: Walid Magdy

Ranked IR. Lecture Objectives. Text Technologies for Data Science INFR Learn about Ranked IR. Implement: 10/10/2018. Instructor: Walid Magdy Text Technologies for Data Science INFR11145 Ranked IR Instructor: Walid Magdy 10-Oct-2018 Lecture Objectives Learn about Ranked IR TFIDF VSM SMART notation Implement: TFIDF 2 1 Boolean Retrieval Thus

More information

Vector Space Model. Yufei Tao KAIST. March 5, Y. Tao, March 5, 2013 Vector Space Model

Vector Space Model. Yufei Tao KAIST. March 5, Y. Tao, March 5, 2013 Vector Space Model Vector Space Model Yufei Tao KAIST March 5, 2013 In this lecture, we will study a problem that is (very) fundamental in information retrieval, and must be tackled by all search engines. Let S be a set

More information

What is Text mining? To discover the useful patterns/contents from the large amount of data that can be structured or unstructured.

What is Text mining? To discover the useful patterns/contents from the large amount of data that can be structured or unstructured. What is Text mining? To discover the useful patterns/contents from the large amount of data that can be structured or unstructured. Text mining What can be used for text mining?? Classification/categorization

More information

Lecture 13: More uses of Language Models

Lecture 13: More uses of Language Models Lecture 13: More uses of Language Models William Webber (william@williamwebber.com) COMP90042, 2014, Semester 1, Lecture 13 What we ll learn in this lecture Comparing documents, corpora using LM approaches

More information

PROBABILITY AND INFORMATION THEORY. Dr. Gjergji Kasneci Introduction to Information Retrieval WS

PROBABILITY AND INFORMATION THEORY. Dr. Gjergji Kasneci Introduction to Information Retrieval WS PROBABILITY AND INFORMATION THEORY Dr. Gjergji Kasneci Introduction to Information Retrieval WS 2012-13 1 Outline Intro Basics of probability and information theory Probability space Rules of probability

More information