CS 163 - Introduction to Natural Language Processing (NLP)
Emphasis is on extraction of semantic content from text using symbolic methods. Students learn how to represent thought and knowledge and how to map language text into conceptual representations. A variety of NLP systems are examined that comprehend simple narrative and editorial text.
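To give a flavor of what "conceptual representation" means here, the toy sketch below encodes the sentence "John gave Mary a book." as a Conceptual Dependency-style frame built around the ATRANS primitive (abstract transfer of possession, a standard CD primitive). The Python dictionary encoding and the answer_who_received helper are purely illustrative, not a format the course prescribes.

# Illustrative sketch only: a toy Conceptual Dependency-style frame for the
# sentence "John gave Mary a book."  The slot names follow common CD usage;
# the dict encoding is one possible choice, not a required design.
cd_frame = {
    "primitive": "ATRANS",   # transfer of an abstract relationship (possession)
    "actor": "John",
    "object": "book",
    "from": "John",
    "to": "Mary",
    "tense": "past",
}

def answer_who_received(frame):
    """Answer 'Who received the object?' by reading the TO slot of the frame."""
    return frame["to"]

print(answer_who_received(cd_frame))   # -> Mary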
Prerequisites: CS 130 or CS 131 (CS 161 is highly recommended).
Topics:
1. Paradigms in NLP: Behaviorist, Chomskyan, AI, Empirical, Connectionist.
2. Semantic Networks for NLP; Procedural Semantics; Conceptual Dependency Theory.
3. Conceptual Analysis, Conceptual Generation, and Common Sense Inference.
4. Representing Stereotypic Knowledge with Scripts; Goal/Plan Analysis for Story Comprehension and Invention.
5. Question Analysis, Answer Retrieval; Thematic Analysis; Affect Processing.
6. Language Acquisition; Phrasal Lexicons; NLP in the Naive Mechanics Domain.
7. Episodic vs. Semantic Memory; Explanation vs. Similarity-Based Learning.
8. Belief Systems and Argumentation.
9. Legal & Moral Reasoning; Stream of Thought; Story Invention Revisited.
10. Introduction to Connectionism: Symbols vs. Neurons; Philosophy of Mind.
Textbook: There is no textbook. Lecture material is drawn from a wide variety of sources.
Grading: Based mainly on homework assignments and a course project in which students write a computer program that reads a few paragraphs of text (usually narrative), accesses lexical and semantic memory, and applies world knowledge and planning to construct a conceptual representation of the text's meaning. The program must also comprehend and answer questions about the conceptual content of the text.
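As a rough, hypothetical illustration of the project's flavor (not a required design), the sketch below hand-codes a tiny lexicon and a single restaurant script, maps story sentences to Conceptual Dependency primitives, and uses the script to infer events the text leaves unstated. The names LEXICON, RESTAURANT_SCRIPT, and comprehend are invented for this example; a real project would need far richer lexical entries, parsing, and inference.

# Hypothetical sketch only: a toy "story understander" in the spirit of the project.

# Lexical memory: surface words mapped to Conceptual Dependency primitives.
LEXICON = {
    "went": "PTRANS",      # transfer of physical location
    "ordered": "MTRANS",   # transfer of information (placing the order)
    "ate": "INGEST",       # taking something into an animate object
}

# World knowledge: a stereotyped event sequence (a script) for restaurants,
# used to fill in events the text leaves implicit.
RESTAURANT_SCRIPT = [
    "PTRANS diner to restaurant",
    "MTRANS order to waiter",
    "INGEST food",
    "ATRANS money to restaurant",
    "PTRANS diner out of restaurant",
]

def comprehend(sentences):
    """Map each sentence to CD primitives, then use the script to infer
    the unstated steps (e.g. that the diner actually ate)."""
    stated = [prim for s in sentences
              for word, prim in LEXICON.items() if word in s]
    inferred = [step for step in RESTAURANT_SCRIPT
                if step.split()[0] not in stated]
    return {"stated": stated, "inferred": inferred}

story = ["John went to a restaurant.", "He ordered lobster."]
memory = comprehend(story)
print(memory["inferred"])   # the script supplies "INGEST food", among others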
Offered: This course is offered rarely. Undergraduate students who want some experience related to NLP should take CS 161, especially when taught by Prof. Dyer (usually in the Winter quarter).