Debugging Chomsky's Hierarchy -
Adequacy of Context-Free Grammar to Represent Language
By: Hussein Ghaly
Chomsky’s Hierarchy
A containment hierarchy (strictly nested sets) of classes of formal grammars.
Chomsky’s early work argued that regular grammars and context-free grammars are
inadequate for representing natural language.
Image: https://en.wikipedia.org/wiki/Chomsky_hierarchy
Table and definition: https://www.cs.wmich.edu/~bhardin/cs4850/ChomskyPresentation.pdf
Motivation
- For both Natural Language Processing and Language Acquisition, it is
important to have an adequate formal grammar that:
- Accurately represents the structure of the language, following Chomsky’s criterion of
generating all the grammatical sentences and none of the ungrammatical ones
- Can be represented computationally
- Is learnable from finite sentence input (the poverty-of-the-stimulus argument)
- Fits the spirit of minimalism, reducing the number of entities needed to represent
natural languages
- Therefore, a thorough investigation of Chomsky’s hierarchy is needed to
identify such an adequate formal grammar
- As a demonstration, a simple computational model based on the proposed
grammar also needs to be developed
Motivation - NLP Applications
- Deciding on the grammaticality (syntactic completion) of a sentence is an
important cue for turn-taking in human-robot dialogue (Skantze, 2017)
[Dialogue example: “I would like to ask about ...” followed by silence - the robot replies
“Sorry, I didn’t get that.” instead of waiting for the completion “... the show times.”]
Motivation - Computational Challenges
- Modern parsers are generally not built with rules to determine the grammaticality of
sentences, and hence can accept and parse ungrammatical input (output shown from the
Stanford Online Parser)
Motivation - Learnability Challenges
The context sensitivity of a language is also associated with that language being
unlearnable from input sentences alone, without an informant, according to Gold’s
definition of learnability (Gold, 1967)
Regular languages - Definition
- Can be represented by Finite State Automata
- A finite state language is a finite or infinite set of strings (sentences) of
symbols (words) generated by a finite set of rules (the grammar), where each
rule specifies the state of the system in which it can be applied, the symbol
which is generated, and the state of the system after the rule is applied.
(Chomsky, 1958)
Regular languages - Examples
- For example, a finite state grammar can generate strings such as:
baa, baaaa, baaaaa
- It can also generate grammatical word sequences, for example a simple sentence:
John saw the big red happy dog.
[Figure: finite state diagrams for the a/b string language and for the sentence, with
transitions labelled N, V, DET, ADJ and a loop for the stacked adjectives]
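As an illustrative sketch (not part of the original slides; the state names and transition table below are invented for this example), the b aⁿ string language above can be recognized by a three-state automaton:

```python
# Minimal sketch of a finite state automaton for the string language above:
# a "b" followed by one or more "a"s (baa, baaaa, baaaaa, ...).
# State names (q0, q1, q2) are arbitrary labels chosen for this example.

TRANSITIONS = {
    ("q0", "b"): "q1",  # read the initial "b"
    ("q1", "a"): "q2",  # read the first "a"
    ("q2", "a"): "q2",  # loop on any further "a"s
}
ACCEPTING = {"q2"}

def accepts(string: str) -> bool:
    state = "q0"
    for symbol in string:
        state = TRANSITIONS.get((state, symbol))
        if state is None:       # no transition defined: reject
            return False
    return state in ACCEPTING

for s in ["baa", "baaaa", "baaaaa", "ab", "b"]:
    print(s, accepts(s))        # the first three are accepted, the last two rejected
```

The same mechanism, with transitions labelled by word categories (N, V, DET, ADJ) and a loop on ADJ, accepts the simple sentence above.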
Regular languages - Inadequacies
- Regular/Finite State Grammar cannot represent embedding, mirror-image
constructions, and other phenomena in natural language (Chomsky, 1957)
Context-Free Grammar (CFG)
Also referred to as “Phrase Structure Grammar” (PSG).
A grammar made of rewrite rules (production rules):
X → Y
where X is a nonterminal symbol and Y is a string of terminal and/or nonterminal symbols.
For example:
Nonterminal rules: S → NP VP, VP → V NP
Terminal rules: V → ate, NP → John, NP → chocolate
These rules can generate the sentence: John ate chocolate
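As a minimal sketch (the dictionary encoding and the expand helper below are my own, not from the slides), these rewrite rules can be enumerated exhaustively in a few lines of Python:

```python
import itertools

# The toy grammar from this slide; each nonterminal maps to its possible right-hand sides.
RULES = {
    "S":  [["NP", "VP"]],
    "VP": [["V", "NP"]],
    "V":  [["ate"]],
    "NP": [["John"], ["chocolate"]],
}

def expand(symbol):
    """Yield every terminal string derivable from `symbol` (this grammar has no recursion)."""
    if symbol not in RULES:               # terminal symbol
        yield [symbol]
        return
    for rhs in RULES[symbol]:
        for parts in itertools.product(*(list(expand(s)) for s in rhs)):
            yield [word for part in parts for word in part]

for words in expand("S"):
    print(" ".join(words))   # output includes: "John ate chocolate"
```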
Inadequacy of CFG - Constructional Homonymity
Inadequacy of CFG - Auxiliaries
In order to account for the possible ways of forming verbs with auxiliaries
(is taking, has been taking, will be taken), the following rules were introduced:
Inadequacy of CFG - Verb Type and Transitivity
CFG is too limited to give a true picture of linguistic structure.
Context-sensitivity - Necessity of transformation
- In both Syntactic Structures and “Three Models for the Description of Language”, the
discussion of the inadequacy of CFG/phrase structure grammar is accompanied by the
presentation of transformational (context-sensitive) rules as the solution to this inadequacy.
- For example, the auxiliary formulation problem:
Context-sensitivity - Transformational Grammar
Therefore, Chomsky proceeded with a transformational (context-sensitive) grammar, where the
underlying sentences are all simple declarative sentences that are transformed into their surface
forms. For example, the sentence “the food was eaten by the man” is derived from “the man
ate the food” by applying transformations.
Context-sensitivity - Structural Ambiguity
Examples such as:
- The shooting of the hunters (ambiguous: the hunters shoot, or the hunters are being shot)
- The growling of lions (only: lions growl)
- The raising of flowers (only: flowers are being raised)
show that transformations give more explanatory power for the meaning.
CFG Adequacy Claims - Gazdar (1982)
- “Phrase structure grammar analyses can be at least as elegant and general
and no more prone to counterexamples than the alternative transformational
accounts of the same phenomena”
- Gazdar’s approach uses complex symbols, consisting of a component that
distinguishes between the X-bar and X (lexical) categories, and a component
which is a feature bundle
CFG Adequacy Claims - Pullum and Gazdar
- “Compiler design for context-free languages is a fairly well-explored problem,
but designing compilers for non-CFLs can be grossly more difficult”
- This is in line with the thesis of Fodor (1975): “language acquisition for a human
learner is nothing more or less than the construction of a program to compile
the natural language into the human machine code”
CFG Adequacy Claims - Verb Frames
In Chomsky’s example, the verb “fly” actually has multiple frames depending on its
sense, showing that each structure is associated with a different sense of the
verb. The same holds for the other structural ambiguities (shoot vs. raise and growl).
https://verbs.colorado.edu/verb-index/index/F.php
New Model - CFG with finite flexible rules and labels
Instead of the set of rules and labels introduced by Chomsky for the auxiliary
problem, let us use the following ones, which are still CFG rules, with no transformations
and no AUX label.
New Model - New Labels
Labels:
NP → the man, NP → the book
V0 → take, V → took, V → takes, VG → taking, VN → taken
BE → is, BEEN → been, BEING → being, BE0 → be
HAVE → has, MOD → will
Rules:
S → NP VP, VP → V NP
V → BE VG, V → HAVE VN, V → MOD V0, BE → MOD BE0
New Model - Generating Sentences
Given the rules S → NP VP and VP → V NP, we can derive the pattern NP V NP.
- Since the label V rewrites directly to terminals (takes, took), we can generate:
The man takes the book - The man took the book
- The label V also rewrites to sequences of other labels: HAVE VN, BE VG, MOD V0
The man has taken the book - The man is taking the book - The man will take the book
New Model - Generating Sentences
Given the rules S → NP VP and VP → V NP, we again have the pattern NP V NP.
Adding the rules V → BE VG and BE → MOD BE0, we can generate:
The man will be taking the book
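As a concrete check (my own encoding, not the Hatshepsut implementation; the expand helper is the same illustrative device as in the earlier CFG sketch), the labels and rules above can be enumerated exhaustively:

```python
import itertools

# Labels and rules from the "New Model" slides, encoded as a plain CFG.
RULES = {
    "S":   [["NP", "VP"]],
    "VP":  [["V", "NP"]],
    "V":   [["took"], ["takes"], ["BE", "VG"], ["HAVE", "VN"], ["MOD", "V0"]],
    "BE":  [["is"], ["MOD", "BE0"]],
    "V0":  [["take"]], "VG": [["taking"]], "VN": [["taken"]],
    "BE0": [["be"]],   "HAVE": [["has"]],  "MOD": [["will"]],
    "NP":  [["the", "man"], ["the", "book"]],
}

def expand(symbol):
    """Yield every terminal string derivable from `symbol`."""
    if symbol not in RULES:
        yield [symbol]
        return
    for rhs in RULES[symbol]:
        for parts in itertools.product(*(list(expand(s)) for s in rhs)):
            yield [w for part in parts for w in part]

sentences = {" ".join(words) for words in expand("S")}
print("the man is taking the book" in sentences)       # True
print("the man will be taking the book" in sentences)  # True
print("the man has taken the book" in sentences)       # True
```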
New Model - Passive Voice
An additional concern is how to generate passive-voice sentences using CFG
without generating ungrammatical sentences. One way of doing this:
- Introduce an additional sentence-level rule: S → NP VP_passive
- Introduce verb-phrase-level passive rules: VP_passive → V_passive,
VP_passive → V_passive BY NP
- Introduce a verb-level passive rule: V_passive → BE VN
- All other rules remain intact
New Model - Generating Sentences with Passive
We can generate passive sentences using the rules:
S → NP VP_passive, VP_passive → V_passive, V_passive → BE VN
The book is taken (BE → is)
The book will be taken (BE → MOD BE0)
The book has been taken (BE → HAVE BEEN)
New Model - problem with “is being”
For the sentence “The book is being taken”, it might be tempting to introduce the
rule:
BE → BE BEING
However, this would lead to generating ungrammatical sentences:
* The man is being taking the book
Therefore, different rules are needed.
New Model - problem with “is being”
We can instead introduce the rule:
BEING0 → BE BEING
and make sure this rule applies only in the passive:
V_passive → BEING0 VN (in addition to V_passive → BE VN)
The possible passive constructions generated will then include:
“The book is being read”, in addition to:
The book is read - The book has been read - The book will be read
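Continuing the same kind of sketch (again my own encoding, not from the slides; the rule BE → HAVE BEEN is an assumption based on the “has been taken” example above), the passive rules including BEING0 can be checked the same way:

```python
import itertools

# Passive fragment of the model; BE -> HAVE BEEN is assumed from the slide example.
RULES = {
    "S":          [["NP", "VP_passive"]],
    "VP_passive": [["V_passive"], ["V_passive", "BY", "NP"]],
    "V_passive":  [["BE", "VN"], ["BEING0", "VN"]],  # BEING0 appears only here,
    "BEING0":     [["BE", "BEING"]],                 # so it never combines with VG ("*is being taking")
    "BE":         [["is"], ["MOD", "BE0"], ["HAVE", "BEEN"]],
    "MOD": [["will"]], "BE0": [["be"]], "BEEN": [["been"]], "BEING": [["being"]],
    "VN": [["taken"]], "BY": [["by"]],
    "NP": [["the", "book"], ["the", "man"]],
}

def expand(symbol):
    """Yield every terminal string derivable from `symbol`."""
    if symbol not in RULES:
        yield [symbol]
        return
    for rhs in RULES[symbol]:
        for parts in itertools.product(*(list(expand(s)) for s in rhs)):
            yield [w for part in parts for w in part]

sentences = {" ".join(words) for words in expand("S")}
print("the book is taken" in sentences)             # True
print("the book has been taken" in sentences)       # True
print("the book is being taken" in sentences)       # True
print("the book is taken by the man" in sentences)  # True
```

Because BEING0 appears only on the right-hand side of V_passive, merging these rules back into the full model still does not derive *“The man is being taking the book”.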
New Model - outcome
- As demonstrated above, a finite set of context-free rules can represent
natural-language structures previously thought to be context-sensitive and to
require transformation rules
- These rules generate only grammatical sentences and do not generate
any ungrammatical ones
BONUS: Inadequacy of CFG - Swiss German (Shieber, 1985)
- Cross-serial semantic dependencies are used to prove the non-context-freeness of natural
language
New Model - Swiss German
- From the data provided, it appears that the critical factor for grammaticality is
the case of each noun (dative or accusative), which must match the verb that selects it
New Model - Swiss German
- The word order seems to be quite free
New Model - Swiss German
We can propose a finite set of rules for this construction (relative clause - RC):
NPa: accusative NP, NPd: dative NP
Vd: verb with a dative object, Va: verb with an accusative object
RC → NP NPd NPa Vd Va
RC → NP NPa NPd Vd Va
RC → NP NPd Vd NPa Va
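As a small, hedged illustration (category labels only; Shieber's actual Swiss German sentences are not reproduced on the slide), these RC patterns can be encoded and checked directly:

```python
# The three relative-clause patterns above, as flat sequences of category labels.
RC_PATTERNS = [
    ("NP", "NPd", "NPa", "Vd", "Va"),
    ("NP", "NPa", "NPd", "Vd", "Va"),
    ("NP", "NPd", "Vd", "NPa", "Va"),
]

def licensed(labels):
    """True if a sequence of category labels matches one of the listed RC patterns."""
    return tuple(labels) in RC_PATTERNS

print(licensed(["NP", "NPd", "NPa", "Vd", "Va"]))  # True: cross-serial order, cases match the verbs
print(licensed(["NP", "NPa", "NPa", "Vd", "Va"]))  # False: two accusatives, the dative verb is unmatched
```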
New Model
Alternatively, we can propose merge rules for the two verbs, so that the
composite structure is equivalent to a ditransitive verb with both a dative
and an accusative object, which can occur in either order.
An indicative example in High German, also with some freedom in word order:
Ich habe ihm das Buch gegeben
Ich habe das Buch ihm gegeben
I have given him (dative) the book (accusative)
Vcomposite → Vd Va, RC → NP NPa NPd Vcomposite, RC → NP NPd NPa Vcomposite
Implementation - Hatshepsut Parser
Building on this set of rules and labels, the following parser implementation was
developed: a shift-reduce parser that parses sentences according to a finite set of
labels and recursive rules: http://arbsq.net/hp
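The parser itself is available at the URL above. The snippet below is not the actual Hatshepsut code, only a generic, simplified sketch of the shift-reduce idea using the labels and rules of the new model (greedy reductions, no backtracking):

```python
# Simplified shift-reduce sketch: shift words onto a stack, then greedily reduce
# whenever the top of the stack matches the right-hand side of a rule.
# (No backtracking, so it only handles cases where greedy reduction is safe.)

LEXICON = {   # word-level rules (right-hand side tuple -> label)
    ("the", "man"): "NP", ("the", "book"): "NP",
    ("takes",): "V", ("took",): "V", ("take",): "V0",
    ("taking",): "VG", ("taken",): "VN",
    ("is",): "BE", ("has",): "HAVE", ("will",): "MOD", ("be",): "BE0",
}
RULES = {     # phrase-level rules
    ("BE", "VG"): "V", ("HAVE", "VN"): "V", ("MOD", "V0"): "V",
    ("MOD", "BE0"): "BE", ("V", "NP"): "VP", ("NP", "VP"): "S",
}

def reduce_all(stack, table):
    changed = True
    while changed:
        changed = False
        for rhs, lhs in table.items():
            n = len(rhs)
            if len(stack) >= n and tuple(stack[-n:]) == rhs:
                stack[-n:] = [lhs]   # replace the matched right-hand side with its label
                changed = True

def parse(sentence):
    stack = []
    for word in sentence.split():
        stack.append(word)           # shift
        reduce_all(stack, LEXICON)   # reduce words to labels ("the man" -> NP)
        reduce_all(stack, RULES)     # reduce labels to phrases
    return stack

print(parse("the man is taking the book"))       # ['S']  -> accepted as grammatical
print(parse("the man will be taking the book"))  # ['S']
print(parse("the man taking the book"))          # not reduced to ['S'] -> rejected
```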
Parser Testing
Conclusion
- Context-Free Grammar can be an appropriate formalism for representing
natural language, once some of the concerns regarding its adequacy have
been addressed
- This may have implications for learnability, given that we are
dealing with a finite number of labels and rules, which can be examined as
patterns by the learner
- CFG can be embedded in automatic parsing applications
Thank You!
Questions?
Hatshepsut Parser at: http://arbsq.net/hp
Email me: hmghaly@gmail.com
Inadequacy of CFG - Undecidability
Inadequacy of CFG - Ambiguity

Editor's Notes

• #5: Skantze, G. (2017). Towards a General, Continuous Model of Turn-taking in Spoken Dialogue using LSTM Recurrent Neural Networks. In Proceedings of SIGdial, Saarbrücken, Germany.