Prof. Michael G. Dyer

CS 263A - Language and Thought

Introduction to Natural Language Processing (NLP) and cognitive models that support NLP. Emphasis is on both semantic/logic and empirical/statistical approaches to natural language text.

Prerequisites: graduate standing and CS 161 (or consent of instructor).


1. Formal vs. natural language, brief linguistics background, Fillmore's cases, basic probability and information theory, NL corpora.

2. Collocations and N-gram models, Conceptual Dependency (CD) theory, conceptual analysis of NL text, conceptual generation, and common sense inference.

3. Word-sense disambiguation and lexical acquisition.

4. Script-based processing, plan/goal-based processing, semantic vs. episodic memory, question answering.

5. Markov models and POS tagging.

6. Story invention, thematic analysis, affect processing.

7. Probabilistic CFGs and probabilistic parsing.

8. Statistical alignment and machine translation.

9. Belief analysis and argumentation; morality, humor, and irony recognition.

10. Modeling stream of thought; episodic memory and story invention revisited; philosophy of mind and machine intentionality.

Text: C. D. Manning and H. Schütze (1999). Foundations of Statistical Natural Language Processing (FSNLP), MIT Press, Cambridge, MA. (FSNLP covers statistical/empirical NLP; semantic analysis is covered via readings from a wide variety of sources.)

Grading: Consists of Project I (approx. 30% of grade), Project II (approx. 60%), and a Project II presentation (approx. 10%). Project I: each student (individually or in a team of two) implements a small, rule-based story-understanding and question-answering (Q/A) system; its purpose is to gain an appreciation of the issues involved in semantic analysis of NL text. Project II: usually statistical/empirical in nature, but a student may choose instead to extend Project I.

Hours per week: Lectures, 4 units (meets twice weekly).

Offered: Every other year, usually in the spring quarter.