Knowledge Sharing
© D. Poole and A. Mackworth 2010, Artificial Intelligence, Lecture 13.2

A conceptualization is a map from the problem domain into the representation. A conceptualization specifies:
- What sorts of individuals are being modeled
- The vocabulary for specifying individuals, relations and properties
- The meaning or intention of the vocabulary

If more than one person is building a knowledge base, they must be able to share the conceptualization.

An ontology is a specification of a conceptualization. An ontology specifies the meanings of the symbols in an information system.

Mapping from a conceptualization to a symbol (figure)

Semantic Web

Ontologies are published on the web in machine-readable form. Builders of knowledge bases or web sites adhere to and refer to a published ontology:
- A symbol defined by an ontology means the same thing across web sites that obey the ontology.
- If someone wants to refer to something not defined, they publish an ontology defining the terminology. Others adopt the terminology by referring to the new ontology. In this way, ontologies evolve.
- Separately developed ontologies can have mappings between them published.

Challenges of building ontologies

- They can be huge: finding the appropriate terminology for a concept may be difficult.
- How one divides the world can depend on the application. Different ontologies describe the world in different ways. People can fundamentally disagree about an appropriate structure.
- Different knowledge bases can use different ontologies. To allow KBs based on different ontologies to inter-operate, there must be a mapping between ontologies.
- It has to be in users' interests to use an ontology.
- The computer doesn't understand the meaning of the symbols. The formalism can constrain the meaning, but can't define it.

Semantic Web Technologies

- XML, the Extensible Markup Language, provides generic syntax: <tag ... /> or <tag ...> ... </tag>.
- URI: a Uniform Resource Identifier is a name of an individual (resource). This name can be shared. It is often in the form of a URL to ensure uniqueness.
- RDF, the Resource Description Framework, is a language of triples.
- OWL, the Web Ontology Language, defines some primitive properties that can be used to define terminology. (It doesn't define a syntax.)
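To make the triple idea concrete, here is a minimal sketch in Turtle (one RDF serialization). The ex: namespace and the individual names are invented; only the rdf: namespace is standard.

  # One RDF triple per statement: subject, property (verb), object.
  @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
  @prefix ex:  <http://example.org/campus#> .    # hypothetical namespace

  ex:csBuilding  rdf:type   ex:Building .        # "the CS building is a Building"
  ex:csBuilding  ex:address "2366 Main Mall" .   # a triple whose object is a literal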

Main Components of an Ontology

- Individuals: the things / objects in the world (not usually specified as part of the ontology)
- Classes: sets of individuals
- Properties: between individuals and their values
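As a rough sketch in Turtle (all ex: names are invented), the three kinds of components are declared like this:

  @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
  @prefix owl: <http://www.w3.org/2002/07/owl#> .
  @prefix ex:  <http://example.org/building#> .   # hypothetical namespace

  ex:Building   rdf:type  owl:Class .             # a class: a set of individuals
  ex:livesIn    rdf:type  owl:ObjectProperty .    # a property relating individuals to values
  ex:kimsHouse  rdf:type  owl:NamedIndividual ,   # an individual (usually in the KB,
                          ex:Building .           #  not in the ontology itself)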

Individuals

Individuals are things in the world that can be named (concrete, abstract, concepts, reified).

Unique names assumption (UNA): different names refer to different individuals. The UNA is not an assumption we can universally make: the Queen, Elizabeth Windsor, etc. Without determining equality, we can't count!

In OWL we can specify:
i1 SameIndividual i2.
i1 DifferentIndividuals i3.
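In Turtle the corresponding built-in properties are owl:sameAs and owl:differentFrom; a small sketch with invented names:

  @prefix owl: <http://www.w3.org/2002/07/owl#> .
  @prefix ex:  <http://example.org/people#> .           # hypothetical namespace

  ex:theQueen  owl:sameAs        ex:elizabethWindsor .  # two names, one individual
  ex:theQueen  owl:differentFrom ex:elizabethTheFirst . # provably distinct individuals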

Classes

A class is a set of individuals, e.g., house, building, officeBuilding.

One class can be a subclass of another:
house subClassOf building.
officeBuilding subClassOf building.

The most general class is Thing.

Classes can be declared to be the same or to be disjoint:
house EquivalentClasses singleFamilyDwelling.
house DisjointClasses officeBuilding.

Different classes are not necessarily disjoint. E.g., a building can be both a commercial building and a residential building.
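The same class axioms, written as a sketch in Turtle (the ex: building vocabulary is invented):

  @prefix owl:  <http://www.w3.org/2002/07/owl#> .
  @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
  @prefix ex:   <http://example.org/building#> .

  ex:House          rdfs:subClassOf     ex:Building .              # every house is a building
  ex:OfficeBuilding rdfs:subClassOf     ex:Building .
  ex:House          owl:equivalentClass ex:SingleFamilyDwelling .  # same set of individuals
  ex:House          owl:disjointWith    ex:OfficeBuilding .        # no individual in both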

Properties

A property is between an individual and a value. A property has a domain and a range:
livesIn domain person.
livesIn range placeOfResidence.

An ObjectProperty is a property whose range is an individual. A DatatypeProperty is one whose range isn't an individual, e.g., is a number or string.

There can also be property hierarchies:
livesIn subPropertyOf enclosure.
principalResidence subPropertyOf livesIn.
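A sketch of these property declarations in Turtle (the ex: names and the numberOfBedrooms example are invented; xsd:integer marks a datatype range):

  @prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
  @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
  @prefix owl:  <http://www.w3.org/2002/07/owl#> .
  @prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .
  @prefix ex:   <http://example.org/building#> .

  ex:livesIn  rdf:type     owl:ObjectProperty ;            # range is an individual
              rdfs:domain  ex:Person ;
              rdfs:range   ex:PlaceOfResidence .

  ex:numberOfBedrooms  rdf:type    owl:DatatypeProperty ;  # range is a data value
                       rdfs:range  xsd:integer .

  ex:principalResidence  rdfs:subPropertyOf  ex:livesIn .  # property hierarchy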

Properties (Cont.)

One property can be the inverse of another:
livesIn InverseObjectProperties hasResident.

Properties can be declared to be transitive, symmetric, functional, or inverse-functional. (Which of these are only applicable to object properties?)

We can also state the minimum and maximum cardinality of a property:
principalResidence minCardinality 1.
principalResidence maxCardinality 1.
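In the RDF serialization of OWL, the characteristics are stated by typing the property, and cardinality is stated as a class restriction rather than directly on the property as in the notation above. A sketch with invented ex: names:

  @prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
  @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
  @prefix owl:  <http://www.w3.org/2002/07/owl#> .
  @prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .
  @prefix ex:   <http://example.org/building#> .

  ex:livesIn     owl:inverseOf  ex:hasResident .        # inverse properties
  ex:adjacentTo  rdf:type  owl:SymmetricProperty .      # declared characteristics
  ex:partOf      rdf:type  owl:TransitiveProperty .

  # "every person has exactly one principal residence" as a cardinality restriction
  ex:Person  rdfs:subClassOf
      [ rdf:type owl:Restriction ;
        owl:onProperty  ex:principalResidence ;
        owl:cardinality "1"^^xsd:nonNegativeInteger ] .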

Property and Class Restrictions

We can define complex descriptions of classes in terms of restrictions of other classes and properties. E.g., a homeowner is a person who owns a house:

homeowner ⊆ person ∩ {x : ∃h ∈ house such that x owns h}

In OWL:
homeowner subClassOf person.
homeowner subClassOf ObjectSomeValuesFrom(owns, house).
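The ObjectSomeValuesFrom restriction is written in Turtle as an anonymous owl:Restriction node; a sketch of the homeowner definition with invented ex: names:

  @prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
  @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
  @prefix owl:  <http://www.w3.org/2002/07/owl#> .
  @prefix ex:   <http://example.org/building#> .

  ex:Homeowner  rdfs:subClassOf  ex:Person ;
                rdfs:subClassOf
      [ rdf:type owl:Restriction ;            # ObjectSomeValuesFrom(owns, House)
        owl:onProperty     ex:owns ;
        owl:someValuesFrom ex:House ] .       # owns at least one house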

OWL Class Constructors

Constructor                               Set of individuals denoted
owl:Thing                                 all individuals
owl:Nothing                               no individuals
owl:ObjectIntersectionOf(C1, ..., Ck)     C1 ∩ ... ∩ Ck
owl:ObjectUnionOf(C1, ..., Ck)            C1 ∪ ... ∪ Ck
owl:ObjectComplementOf(C)                 Thing \ C
owl:ObjectOneOf(I1, ..., Ik)              {I1, ..., Ik}
owl:ObjectHasValue(P, I)                  {x : x P I}
owl:ObjectAllValuesFrom(P, C)             {x : x P y implies y ∈ C}
owl:ObjectSomeValuesFrom(P, C)            {x : ∃y ∈ C such that x P y}
owl:ObjectMinCardinality(n, P, C)         {x : #{y | x P y and y ∈ C} ≥ n}
owl:ObjectMaxCardinality(n, P, C)         {x : #{y | x P y and y ∈ C} ≤ n}

OWL Predicates

Statement                                  Meaning
rdf:type(I, C)                             I ∈ C
rdfs:subClassOf(C1, C2)                    C1 ⊆ C2
owl:EquivalentClasses(C1, C2)              C1 ≡ C2
owl:DisjointClasses(C1, C2)                C1 ∩ C2 = {}
rdfs:domain(P, C)                          if x P y then x ∈ C
rdfs:range(P, C)                           if x P y then y ∈ C
rdfs:subPropertyOf(P1, P2)                 x P1 y implies x P2 y
owl:EquivalentObjectProperties(P1, P2)     x P1 y if and only if x P2 y
owl:DisjointObjectProperties(P1, P2)       x P1 y implies not x P2 y
owl:InverseObjectProperties(P1, P2)        x P1 y if and only if y P2 x
owl:SameIndividual(I1, ..., In)            ∀j ∀k : Ij = Ik
owl:DifferentIndividuals(I1, ..., In)      ∀j ∀k : j ≠ k implies Ij ≠ Ik
owl:FunctionalObjectProperty(P)            if x P y1 and x P y2 then y1 = y2
owl:InverseFunctionalObjectProperty(P)     if x1 P y and x2 P y then x1 = x2
owl:TransitiveObjectProperty(P)            if x P y and y P z then x P z
owl:SymmetricObjectProperty(P)             if x P y then y P x

Knowledge Sharing

- One ontology typically imports and builds on other ontologies. OWL provides facilities for version control.
- Tools for mapping one ontology to another allow inter-operation of different knowledge bases.
- The semantic web promises to allow two pieces of information to be combined if they both adhere to an ontology, where either these are the same ontology or there is a mapping between them.
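For example, importing and versioning are stated in the ontology header; a sketch in Turtle with hypothetical ontology IRIs:

  @prefix owl: <http://www.w3.org/2002/07/owl#> .

  <http://example.org/campus>                            # this ontology (hypothetical IRI)
      a owl:Ontology ;
      owl:imports      <http://example.org/building> ;   # build on another ontology's terms
      owl:versionIRI   <http://example.org/campus/1.1> ;
      owl:priorVersion <http://example.org/campus/1.0> .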

Example: Apartment Building

An apartment building is a residential building with more than two units and they are rented.

:numberOfUnits rdf:type owl:FunctionalObjectProperty ;
    rdfs:domain :ResidentialBuilding ;
    rdfs:range owl:OneOf(:one :two :moreThanTwo) .

:ApartmentBuilding owl:EquivalentClasses
    owl:ObjectIntersectionOf (
        owl:ObjectHasValue(:numberOfUnits :moreThanTwo)
        owl:ObjectHasValue(:ownership :rental)
        :ResidentialBuilding ) .

Aristotelian definitions

Aristotle [350 B.C.] suggested the definition of a class C in terms of:
- Genus: the super-class
- Differentia: the attributes that make members of the class C different from other members of the super-class

"If genera are different and co-ordinate, their differentiae are themselves different in kind. Take as an instance the genus 'animal' and the genus 'knowledge'. 'With feet', 'two-footed', 'winged', 'aquatic', are differentiae of 'animal'; the species of knowledge are not distinguished by the same differentiae. One species of knowledge does not differ from another in being two-footed." (Aristotle, Categories, 350 B.C.)
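An Aristotelian definition maps naturally onto an OWL equivalent-class axiom: the genus is one conjunct and the differentia is a property restriction. A hedged sketch with invented ex: class and property names:

  @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
  @prefix owl: <http://www.w3.org/2002/07/owl#> .
  @prefix ex:  <http://example.org/zoo#> .

  ex:TwoFootedAnimal  owl:equivalentClass
      [ rdf:type owl:Class ;
        owl:intersectionOf ( ex:Animal                         # genus: the super-class
                             [ rdf:type owl:Restriction ;      # differentia
                               owl:onProperty ex:numberOfFeet ;
                               owl:hasValue 2 ] ) ] .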

Example: hotel ontology

Define the following:
- Room
- BathRoom
- StandardRoom (what is rented as a room in a hotel)
- Suite
- RoomOnly
- Hotel
- HasForRent
- AllSuitesHotel
- NoSuitesHotel
- HasSuitesHotel
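One possible partial formalization, sketched in Turtle with the names above (the exercise admits other answers): the "all suites" and "has suites" hotels use allValuesFrom and someValuesFrom over hasForRent, and NoSuitesHotel could analogously use allValuesFrom over the complement of Suite.

  @prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
  @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
  @prefix owl:  <http://www.w3.org/2002/07/owl#> .
  @prefix ex:   <http://example.org/hotel#> .

  ex:hasForRent  rdf:type     owl:ObjectProperty ;      # hotel -> rentable unit
                 rdfs:domain  ex:Hotel ;
                 rdfs:range   ex:StandardRoom .

  ex:AllSuitesHotel  owl:equivalentClass                 # every rented unit is a suite
      [ rdf:type owl:Class ;
        owl:intersectionOf ( ex:Hotel
                             [ rdf:type owl:Restriction ;
                               owl:onProperty    ex:hasForRent ;
                               owl:allValuesFrom ex:Suite ] ) ] .

  ex:HasSuitesHotel  owl:equivalentClass                 # at least one rented unit is a suite
      [ rdf:type owl:Class ;
        owl:intersectionOf ( ex:Hotel
                             [ rdf:type owl:Restriction ;
                               owl:onProperty     ex:hasForRent ;
                               owl:someValuesFrom ex:Suite ] ) ] .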

Basic Formal Ontology (BFO)

entity
  continuant
    independent continuant
      site
      object aggregate
      object
      fiat part of object
      boundary of object
    dependent continuant
      realizable entity
        function
        role
        disposition
      quality
    spatial region
      volume / surface / line / point

BFO (cont.)

occurrent
  temporal region
    connected temporal region
      temporal interval
      temporal instant
    scattered temporal region
  spatio-temporal region
    connected spatio-temporal region
      spatio-temporal interval / spatio-temporal instant
    scattered spatio-temporal region
  processual entity
    process
    process aggregate
    processual context
    fiat part of process
    boundary of process

Continuants vs Occurrents

A continuant exists at an instant of time and maintains its identity through time. An occurrent has temporal parts. Continuants participate in occurrents.

A person, a life, a finger, infancy: what is part of what?

More examples to classify: a holiday, the end of a lecture, an email, the sending of an email, the equator, an earthquake, a smile, a laugh, the smell of a flower.

Continuants

- a pen, a person, Newtonian mechanics, the memory of a past event: objects
- a flock of birds, the students in CS422, a card collection: object aggregates
- a city, a room, a mouth, the hole of a doughnut: sites
- the dangerous part of a city, the part of Grouse Mountain with the best view: fiat parts of an object