Inference in First-Order Logic
A Brief History of Reasoning
450B.C. Stoics propositional logic, inference (maybe)
322B.C. Aristotle “syllogisms” (inference rules), quantifiers
1565 Cardano probability theory (propositional logic + uncertainty)
1847 Boole propositional logic (again)
1879 Frege first-order logic
1922 Wittgenstein proof by truth tables
1930 Gödel ∃ complete algorithm for FOL
1930 Herbrand complete algorithm for FOL (reduce to propositional)
1931 Gödel ¬∃ complete algorithm for arithmetic systems
1960 Davis/Putnam “practical” algorithm for propositional logic
1965 Robinson “practical” algorithm for FOL—resolution
The Story So Far
●Propositional logic
●Subset of propositional logic: Horn clauses
●Inference algorithms
– forward chaining
– backward chaining
– resolution (for full propositional logic)
●First order logic (FOL)
– variables
– functions
– quantifiers
– etc.
●Today: inference for first order logic
Outline
●Reducing first-order inference to propositional inference
●Unification
●Generalized Modus Ponens
●Forward and backward chaining
●Logic programming
●Resolution
reduction to propositional inference
Universal Instantiation
●Every instantiation of a universally quantified sentence is entailed by it:
∀ v α
SUBST({v/g}, α)
for any variable v and ground term g
●E.g., ∀ x King(x) ∧ Greedy(x) =⇒ Evil(x) yields
King(John) ∧ Greedy(John) =⇒ Evil(John)
King(Richard) ∧ Greedy(Richard) =⇒ Evil(Richard)
King(Father(John)) ∧ Greedy(Father(John)) =⇒ Evil(Father(John))
⋮
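The SUBST operation is easy to mirror in code. The sketch below is purely illustrative (not from any library): it represents sentences as nested tuples and, following the slides' notation, treats lowercase symbols as variables and capitalized symbols as constants, predicates, and functions. It applies {x/John} to the rule above.

```python
# Minimal sketch of SUBST for Universal Instantiation (illustrative only).
# Sentences are nested tuples; lowercase strings are variables, capitalized
# strings are constants/predicates/functions, mirroring the slides' notation.

def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def subst(theta, sentence):
    """Apply a substitution such as {'x': 'John'} to a sentence."""
    if is_variable(sentence):
        return theta.get(sentence, sentence)
    if isinstance(sentence, tuple):
        return tuple(subst(theta, part) for part in sentence)
    return sentence   # constant, predicate, function, or connective symbol

rule = ('⇒', ('∧', ('King', 'x'), ('Greedy', 'x')), ('Evil', 'x'))
print(subst({'x': 'John'}, rule))
# ('⇒', ('∧', ('King', 'John'), ('Greedy', 'John')), ('Evil', 'John'))
```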
Existential Instantiation
●For any sentence α, variable v, and constant symbol k
that does not appear elsewhere in the knowledge base:
∃ v α
SUBST({v/k}, α)
●E.g., ∃ x Crown(x) ∧ OnHead(x, John) yields
Crown(C1) ∧ OnHead(C1, John)
provided C1 is a new constant symbol, called a Skolem constant
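Under the same tuple representation, Existential Instantiation is just a substitution with a fresh constant, here the Skolem constant C1. A tiny sketch reusing subst from above (it does not check that C1 is genuinely new):

```python
# Existential Instantiation as substitution with a fresh Skolem constant C1
# (reuses the subst sketch above; illustrative only).
sentence = ('∧', ('Crown', 'x'), ('OnHead', 'x', 'John'))
print(subst({'x': 'C1'}, sentence))
# ('∧', ('Crown', 'C1'), ('OnHead', 'C1', 'John'))
```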
Instantiation
●Universal Instantiation
– can be applied several times to add new sentences
– the new KB is logically equivalent to the old
●Existential Instantiation
– can be applied once to replace the existential sentence
– the new KB is not equivalent to the old
– but is satisfiable iff the old KB was satisfiable
Reduction to Propositional Inference
●Suppose the KB contains just the following:
∀ x King(x) ∧ Greedy(x) =⇒ Evil(x)
King(John)
Greedy(John)
Brother(Richard, John)
●Instantiating the universal sentence in all possible ways, we have
King(John) ∧ Greedy(John) =⇒ Evil(John)
King(Richard) ∧ Greedy(Richard) =⇒ Evil(Richard)
King(John)
Greedy(John)
Brother(Richard, John)
●The new KB is propositionalized: proposition symbols are King(John), Greedy(John), Evil(John), Brother(Richard, John), etc.
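A rough sketch of this step with the subst code from above: instantiate the universal rule with every constant mentioned in the KB (the constant list is written out by hand here, so this is illustrative rather than a full propositionalizer).

```python
# Propositionalize the universal rule over the KB's constants
# (reuses the subst sketch above; illustrative only).
rule = ('⇒', ('∧', ('King', 'x'), ('Greedy', 'x')), ('Evil', 'x'))
constants = ['John', 'Richard']

for c in constants:
    print(subst({'x': c}, rule))
# ('⇒', ('∧', ('King', 'John'), ('Greedy', 'John')), ('Evil', 'John'))
# ('⇒', ('∧', ('King', 'Richard'), ('Greedy', 'Richard')), ('Evil', 'Richard'))
```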
Reduction to Propositional Inference
●Claim: a ground sentence is entailed by new KB iff entailed by original KB
●Claim: every FOL KB can be propositionalized so as to preserve entailment
●Idea: propositionalize KB and query, apply resolution, return result
●Problem: with function symbols, there are infinitely many ground terms,
e.g., Father(Father(Father(John)))
●Theorem: Herbrand (1930). If a sentence α is entailed by an FOL KB,
it is entailed by a finite subset of the propositional KB
●Idea: For n = 0 to ∞ do
create a propositional KB by instantiating with depth-n terms
see if α is entailed by this KB
●Problem: works if α is entailed, loops if α is not entailed
●Theorem: Turing (1936), Church (1936): entailment in FOL is semidecidable (algorithms exist that say yes to every entailed sentence, but no algorithm can also say no to every non-entailed sentence)
Practical Problems with Propositionalization
●Propositionalization seems to generate lots of irrelevant sentences.
●E.g., from ∀ x King(x) ∧ Greedy(x) =⇒ Evil(x)
King(John)
∀ y Greedy(y)
Brother(Richard, John)
it seems obvious that Evil(John), but propositionalization produces lots of
facts such as Greedy(Richard) that are irrelevant
●With p k-ary predicates and n constants, there are p ⋅ n^k instantiations
●With function symbols, it gets much, much worse!
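A quick sanity check of the p ⋅ n^k count (the predicate and constant names below are only illustrative): enumerate every way of filling each k-ary predicate with constants.

```python
# Counting ground atoms: p predicates, n constants, arity k (illustrative).
from itertools import product

predicates = ['King', 'Greedy', 'Evil']      # p = 3
constants = ['John', 'Richard', 'Nono']      # n = 3
k = 1                                        # all unary here

atoms = [(pred, *args) for pred in predicates for args in product(constants, repeat=k)]
print(len(atoms), len(predicates) * len(constants) ** k)   # 9 9
```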
unification
Plan
●We have the inference rule
– ∀ x King(x) ∧ Greedy(x) =⇒ Evil(x)
●We have facts that (partially) match the precondition
– King(John)
– ∀ y Greedy(y)
●We need to match them up with substitutions: θ = {x/John, y/John} works
– unification
– generalized modus ponens
Unification
●UNIFY(α, β) = θ if αθ = βθ

p                   q                       θ
Knows(John, x)      Knows(John, Jane)       {x/Jane}
Knows(John, x)      Knows(y, Mary)          {x/Mary, y/John}
Knows(John, x)      Knows(y, Mother(y))     {y/John, x/Mother(John)}
Knows(John, x)      Knows(x, Mary)          fail

●Standardizing apart eliminates overlap of variables, e.g., Knows(z17, Mary):
Knows(John, x)      Knows(z17, Mary)        {z17/John, x/Mary}
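The table above can be reproduced with a small recursive unifier over the same nested-tuple encoding. This is a sketch, not a library routine: it omits the occurs check and does not standardize apart, so the last row fails exactly as shown.

```python
# Minimal unifier (uses is_variable and the tuple encoding from the SUBST
# sketch above; no occurs check; illustrative only).

def unify(x, y, theta):
    """Return a substitution making x and y identical, or None on failure."""
    if theta is None:
        return None
    if x == y:
        return theta
    if is_variable(x):
        return unify_var(x, y, theta)
    if is_variable(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)
        return theta
    return None

def unify_var(var, term, theta):
    if var in theta:
        return unify(theta[var], term, theta)
    if is_variable(term) and term in theta:
        return unify(var, theta[term], theta)
    return {**theta, var: term}

print(unify(('Knows', 'John', 'x'), ('Knows', 'John', 'Jane'), {}))        # {'x': 'Jane'}
print(unify(('Knows', 'John', 'x'), ('Knows', 'y', 'Mary'), {}))           # {'y': 'John', 'x': 'Mary'}
print(unify(('Knows', 'John', 'x'), ('Knows', 'y', ('Mother', 'y')), {}))  # {'y': 'John', 'x': ('Mother', 'y')}
print(unify(('Knows', 'John', 'x'), ('Knows', 'x', 'Mary'), {}))           # None: needs standardizing apart
```

In the third call x is bound to Mother(y); chasing the y binding gives the table's Mother(John).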
generalized modus ponens
Generalized Modus Ponens
●Generalized modus ponens used with KB of definite clauses
(exactly one positive literal)
●All variables assumed universally quantified
p1′, p2′, . . . , pn′,   (p1 ∧ p2 ∧ . . . ∧ pn ⇒ q)
qθ
where pi′θ = piθ for all i
●Rule: King(x) ∧ Greedy(x) =⇒ Evil(x)
●Precondition of rule: p1 is King(x), p2 is Greedy(x)
●Implication: q is Evil(x)
●Facts: p1′ is King(John), p2′ is Greedy(y)
●Substitution: θ is {x/John, y/John}
⇒ Result of modus ponens: qθ is Evil(John)
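With the subst and unify sketches from above, the example collapses to a few lines. This is illustrative only, and it assumes the facts are listed in the same order as the premises they match.

```python
# Generalized Modus Ponens on the slide's example
# (uses the unify/subst sketches above; facts paired with premises in order).
premises = [('King', 'x'), ('Greedy', 'x')]
conclusion = ('Evil', 'x')
facts = [('King', 'John'), ('Greedy', 'y')]    # Greedy(y): everyone is greedy

theta = {}
for p, f in zip(premises, facts):
    theta = unify(p, f, theta)

print(theta)                     # {'x': 'John', 'y': 'John'}
print(subst(theta, conclusion))  # ('Evil', 'John')
```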
forward chaining
Example Knowledge
●The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American.
●Prove that Col. West is a criminal
Example Knowledge Base
●. . . it is a crime for an American to sell weapons to hostile nations:
American(x) ∧ Weapon(y) ∧ Sells(x, y, z) ∧ Hostile(z) =⇒ Criminal(x)
●Nono . . . has some missiles, i.e., ∃ x Owns(Nono, x) ∧ Missile(x): Owns(Nono, M1) and Missile(M1)
●. . . all of its missiles were sold to it by Colonel West
Missile(x) ∧ Owns(Nono, x) =⇒ Sells(West, x, Nono)
●Missiles are weapons:
Missile(x) ⇒ Weapon(x)
●An enemy of America counts as “hostile”:
Enemy(x, America) =⇒ Hostile(x)
●West, who is American . . .
American(West)
●The country Nono, an enemy of America . . .
Enemy(Nono, America)
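For the sketches that follow, the same KB can be written down in the nested-tuple encoding used earlier. This is just an illustrative data layout: each rule is a (premises, conclusion) pair and each fact is a ground atom.

```python
# The crime KB in the tuple encoding used by the earlier sketches.
crime_rules = [
    ([('American', 'x'), ('Weapon', 'y'), ('Sells', 'x', 'y', 'z'), ('Hostile', 'z')],
     ('Criminal', 'x')),
    ([('Missile', 'x'), ('Owns', 'Nono', 'x')], ('Sells', 'West', 'x', 'Nono')),
    ([('Missile', 'x')], ('Weapon', 'x')),
    ([('Enemy', 'x', 'America')], ('Hostile', 'x')),
]
crime_facts = [
    ('Owns', 'Nono', 'M1'), ('Missile', 'M1'),
    ('American', 'West'), ('Enemy', 'Nono', 'America'),
]
```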
Forward Chaining Proof
(Note: ∀ x Missile(x) ∧ Owns(Nono, x) =⇒ Sells(West, x, Nono))
(Note: American(x) ∧ Weapon(y) ∧ Sells(x, y, z) ∧ Hostile(z) =⇒ Criminal(x))
Properties of Forward Chaining
●Sound and complete for first-order definite clauses
(proof similar to propositional proof)
●Datalog (1977) = first-order definite clauses + no functions (e.g., crime example)
Forward chaining terminates for Datalog in poly iterations: at most p ⋅ n^k literals
●May not terminate in general if α is not entailed
●This is unavoidable: entailment with definite clauses is semidecidable
Efficiency of Forward Chaining
●Simple observation: no need to match a rule on iteration k if a premise wasn't added on iteration k − 1
=⇒ match each rule whose premise contains a newly added literal
●Matching itself can be expensive
●Database indexing allows O(1) retrieval of known facts, e.g., query Missile(x) retrieves Missile(M1)
●Matching conjunctive premises against known facts is NP-hard
●Forward chaining is widely used in deductive databases
Hard Matching Example
Diff(wa, nt) ∧ Diff(wa, sa) ∧ Diff(nt, q) ∧ Diff(nt, sa) ∧ Diff(q, nsw) ∧ Diff(q, sa) ∧ Diff(nsw, v) ∧ Diff(nsw, sa) ∧ Diff(v, sa) =⇒ Colorable()
Diff(Red, Blue)
Diff(Green, Red)
Diff(Blue, Red)
Diff(Red, Green)
Diff(Green, Blue)
Diff(Blue, Green)
●Colorable() is inferred iff the constraint satisfaction problem has a solution
●CSPs include 3SAT as a special case, hence matching is NP-hard
Forward Chaining Algorithm
function FOL-FC-ASK(KB, α) returns a substitution or false
  repeat until new is empty
    new ← { }
    for each sentence r in KB do
      (p1 ∧ . . . ∧ pn =⇒ q) ← STANDARDIZE-APART(r)
      for each θ such that (p1 ∧ . . . ∧ pn)θ = (p1′ ∧ . . . ∧ pn′)θ
          for some p1′, . . . , pn′ in KB
        q′ ← SUBST(θ, q)
        if q′ is not a renaming of a sentence already in KB or new then do
          add q′ to new
          φ ← UNIFY(q′, α)
          if φ is not fail then return φ
    add new to KB
  return false
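The pseudocode translates fairly directly into Python on top of the earlier sketches. This rendering is deliberately naive and illustrative: it checks exact duplicates rather than renamings, rematches every rule on every iteration, and only answers queries that require at least one rule application. Run on the crime KB encoded earlier, it returns a binding for the query variable.

```python
# Naive FOL-FC-ASK over the tuple encoding (uses unify/subst from above).

def fetch(premises, facts, theta):
    """Yield every extension of theta that matches all premises against facts."""
    if not premises:
        yield theta
        return
    for fact in facts:
        t = unify(premises[0], fact, theta)
        if t is not None:
            yield from fetch(premises[1:], facts, t)

def fol_fc_ask(rules, facts, query):
    facts = set(facts)
    while True:
        new = set()
        for premises, conclusion in rules:
            for theta in fetch(premises, facts, {}):
                q = subst(theta, conclusion)
                if q not in facts and q not in new:
                    new.add(q)
                    phi = unify(q, query, {})
                    if phi is not None:
                        return phi
        if not new:
            return False
        facts |= new

print(fol_fc_ask(crime_rules, crime_facts, ('Criminal', 'c')))   # {'c': 'West'}
```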
backward chaining
Backward Chaining
●Start with query
●Check if it can be derived by given rules and facts
– apply rules that infer the query
– recurse over pre-conditions
Backward Chaining Example
(The slides step through the backward-chaining proof tree for the crime example query Criminal(West), expanding each premise of the Criminal rule in turn.)
Properties of Backward Chaining
●Depth-first recursive proof search: space is linear in size of proof
●Incomplete due to infinite loops
=⇒ fix by checking current goal against every goal on stack
●Inefficient due to repeated subgoals (both success and failure)
=⇒ fix using caching of previous results (extra space!)
●Widely used (without improvements!) for logic programming
Backward Chaining Algorithm
function FOL-BC-ASK(KB, goals, θ) returns a set of substitutions
  inputs: KB, a knowledge base
          goals, a list of conjuncts forming a query (θ already applied)
          θ, the current substitution, initially the empty substitution { }
  local variables: answers, a set of substitutions, initially empty
  if goals is empty then return {θ}
  q′ ← SUBST(θ, FIRST(goals))
  for each sentence r in KB
      where STANDARDIZE-APART(r) = (p1 ∧ . . . ∧ pn ⇒ q)
      and θ′ ← UNIFY(q, q′) succeeds
    new goals ← [p1, . . . , pn | REST(goals)]
    answers ← FOL-BC-ASK(KB, new goals, COMPOSE(θ′, θ)) ∪ answers
  return answers
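A compact illustrative rendering of the same idea, again on top of the unify/subst sketches. It omits STANDARDIZE-APART (so rule variables must not clash with query variables) and accumulates bindings in one dictionary instead of composing substitutions.

```python
# Naive FOL-BC-ASK over the tuple encoding (uses unify/subst from above).

def fol_bc_ask(rules, facts, goals, theta):
    """Yield every substitution under which all goals follow from the KB."""
    if not goals:
        yield theta
        return
    goal = subst(theta, goals[0])
    for fact in facts:                      # try ground facts first
        t = unify(fact, goal, theta)
        if t is not None:
            yield from fol_bc_ask(rules, facts, goals[1:], t)
    for premises, conclusion in rules:      # then rules whose conclusion unifies
        t = unify(conclusion, goal, theta)
        if t is not None:
            yield from fol_bc_ask(rules, facts, list(premises) + goals[1:], t)

rules = [([('Missile', 'm')], ('Weapon', 'm'))]
facts = [('Missile', 'M1')]
print(list(fol_bc_ask(rules, facts, [('Weapon', 'w')], {})))
# [{'m': 'w', 'w': 'M1'}]: chase w to see the query variable bound to M1
```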
logic programming
Logic Programming
●Computation as inference on logical KBs
Logic programming:
1. Identify problem
2. Assemble information
3. Tea break
4. Encode information in KB
5. Encode problem instance as facts
6. Ask queries
7. Find false facts

Ordinary programming:
1. Identify problem
2. Assemble information
3. Figure out solution
4. Program solution
5. Encode problem instance as data
6. Apply program to data
7. Debug procedural errors

●Should be easier to debug Capital(NewYork, US) than x := x + 2 !
Prolog
●Basis: backward chaining with Horn clauses + bells & whistles
●Widely used in Europe, Japan (basis of 5th Generation project)
●Compilation techniques ⇒ approaching a billion logical inferences per second
●Program = set of clauses = head :- literal1, . . . literaln.
criminal(X) :- american(X), weapon(Y), sells(X,Y,Z), hostile(Z).
missile(m1).
owns(nono,m1).
sells(west,X,nono) :- missile(X), owns(nono,X).
weapon(X) :- missile(X).
hostile(X) :- enemy(X,america).
american(west).
enemy(nono,america).
Prolog Systems
●Depth-first, left-to-right backward chaining
●Built-in predicates for arithmetic etc., e.g., X is Y*Z+3
●Closed-world assumption (“negation as failure”)
e.g., given the clause alive(X) :- not dead(X). the query alive(joe) succeeds if dead(joe) fails
resolution
Resolution: Brief Summary
●Full first-order version:
l1 ∨ ⋯ ∨ lk,   m1 ∨ ⋯ ∨ mn
(l1 ∨ ⋯ ∨ li−1 ∨ li+1 ∨ ⋯ ∨ lk ∨ m1 ∨ ⋯ ∨ mj−1 ∨ mj+1 ∨ ⋯ ∨ mn)θ
where UNIFY(li, ¬mj) = θ.
●For example,
¬Rich(x) ∨ Unhappy(x),   Rich(Ken)
Unhappy(Ken)
with θ = {x/Ken}
●Apply resolution steps to CNF (KB ∧ ¬α); complete for FOL
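A single binary resolution step can be sketched with the unify/subst code from above. Clauses are lists of literals in the tuple encoding, and negation is written with a capitalized 'Not' tag so it is not mistaken for a variable; all of this is an illustrative convention, not a standard API.

```python
# One first-order binary resolution step (uses unify/subst from above).
# A clause is a list of literals; ('Not', L) marks a negated literal.

def negate(lit):
    return lit[1] if lit[0] == 'Not' else ('Not', lit)

def resolve(clause1, clause2):
    """Yield every resolvent obtained by cancelling one complementary pair."""
    for i, li in enumerate(clause1):
        for j, mj in enumerate(clause2):
            theta = unify(li, negate(mj), {})
            if theta is not None:
                rest = clause1[:i] + clause1[i+1:] + clause2[:j] + clause2[j+1:]
                yield [subst(theta, lit) for lit in rest]

c1 = [('Not', ('Rich', 'x')), ('Unhappy', 'x')]
c2 = [('Rich', 'Ken')]
print(list(resolve(c1, c2)))    # [[('Unhappy', 'Ken')]]
```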
Conversion to CNF
Everyone who loves all animals is loved by someone:
∀ x [∀ y Animal(y) =⇒ Loves(x, y)] =⇒ [∃ y Loves(y, x)]
1. Eliminate biconditionals and implications
∀ x [¬∀ y ¬Animal(y) ∨ Loves(x, y)] ∨ [∃ y Loves(y, x)]
2. Move ¬ inwards: ¬∀ x p ≡ ∃ x ¬p, ¬∃ x p ≡ ∀ x ¬p:
∀ x [∃ y ¬(¬Animal(y) ∨ Loves(x, y))] ∨ [∃ y Loves(y, x)]
∀ x [∃ y ¬¬Animal(y) ∧ ¬Loves(x, y)] ∨ [∃ y Loves(y, x)]
∀ x [∃ y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃ y Loves(y, x)]
Conversion to CNF
3. Standardize variables: each quantifier should use a different one
∀ x [∃ y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃ z Loves(z, x)]
4. Skolemize: a more general form of existential instantiation. Each existential variable is replaced by a Skolem function of the enclosing universally quantified variables:
∀ x [Animal(F(x)) ∧ ¬Loves(x, F(x))] ∨ Loves(G(x), x)
5. Drop universal quantifiers:
[Animal(F(x)) ∧ ¬Loves(x, F(x))] ∨ Loves(G(x), x)
6. Distribute ∨ over ∧:
[Animal(F(x)) ∨ Loves(G(x), x)] ∧ [¬Loves(x, F(x)) ∨ Loves(G(x), x)]
Our Previous Example
●Rules
– American(x) ∧ Weapon(y) ∧ Sells(x, y, z) ∧ Hostile(z) =⇒ Criminal(x)
– Missile(M1) and Owns(Nono, M1)
– Missile(x) ∧ Owns(Nono, x) =⇒ Sells(West, x, Nono)
– Missile(x) ⇒ Weapon(x)
– Enemy(x, America) =⇒ Hostile(x)
– American(West)
– Enemy(Nono, America)
●Converted to CNF
– ¬American(x) ∨ ¬Weapon(y) ∨ ¬Sells(x, y, z) ∨ ¬Hostile(z) ∨ Criminal(x)
– Missile(M1) and Owns(Nono, M1)
– ¬Missile(x) ∨ ¬Owns(Nono, x) ∨ Sells(West, x, Nono)
– ¬Missile(x) ∨ Weapon(x)
– ¬Enemy(x, America) ∨ Hostile(x)
– American(West)
– Enemy(Nono, America)
●Query: ¬Criminal(West)
Resolution Proof
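The original slide shows the full refutation tree as a figure. As a small illustration of how such a refutation starts, here is the first step using the resolve sketch above: resolving the negated query against the Criminal clause leaves the four premise literals still to be refuted.

```python
# First resolution step for the query ¬Criminal(West) (uses resolve from above).
criminal_clause = [('Not', ('American', 'x')), ('Not', ('Weapon', 'y')),
                   ('Not', ('Sells', 'x', 'y', 'z')), ('Not', ('Hostile', 'z')),
                   ('Criminal', 'x')]
negated_query = [('Not', ('Criminal', 'West'))]

for resolvent in resolve(criminal_clause, negated_query):
    print(resolvent)
# [('Not', ('American', 'West')), ('Not', ('Weapon', 'y')),
#  ('Not', ('Sells', 'West', 'y', 'z')), ('Not', ('Hostile', 'z'))]
```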