Baggett Lectures: Paul Smolensky (Johns Hopkins)

Time: 
Thursday, November 17, 2016 - 3:30 PM
Location: 
Maryland Room, Marie Mount Hall

This is the second in a series of three lectures by distinguished phonologist Paul Smolensky, Krieger-Eisenhower Professor of Cognitive Science at Johns Hopkins University, generously supported by Dave Baggett.

Overview of the lectures

A fundamental task of cognitive science is reconciling (i) the discrete, categorical character of mental states and knowledge — e.g., symbolic expressions governed by symbolic rules of grammar or logic — with (ii) the continuous, gradient character of neural states and processes. This year’s Baggett Lectures will present an approach to unifying discrete symbolic and continuous neural computation: Gradient Symbolic Computation (GSC). This unification leads to new grammatical theories as well as novel neural network architectures that realize these theories. The importance of reconciling symbolic and neural network computation now extends beyond basic science into applied Natural Language Processing, where the best-performing systems utilize neural networks, but it is not currently known how to construct networks that enable rapid instruction, human understanding of internal knowledge, and competence in a diversity of tasks — all properties that are characteristic of symbolic systems.
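As a rough illustration of the representational idea (the vectors and activation values below are arbitrary, and the sketch is not drawn from the lectures themselves), GSC builds on tensor-product representations: each symbol (a filler) and each structural position (a role) is assigned a vector, a constituent is the outer product of its filler and role vectors, and a structure is the sum of its constituents. A gradient symbol structure is one in which constituents may carry activations between 0 and 1, as in this toy Python sketch:

    import numpy as np

    # Toy sketch of the tensor-product encoding that GSC builds on.
    # A constituent is an activation-weighted outer product of a filler
    # vector and a role vector; a structure is the sum of its constituents.
    # Vectors, dimensions, and activations are arbitrary illustrations.

    rng = np.random.default_rng(0)
    dim = 8

    fillers = {s: rng.standard_normal(dim) for s in ["A", "B", "C"]}
    roles = {r: rng.standard_normal(dim) for r in ["pos1", "pos2"]}

    def encode(bindings):
        """Encode a (possibly gradient) symbol structure.

        bindings: iterable of (filler, role, activation) triples.
        Returns a dim x dim matrix: the sum of activation-weighted
        filler/role outer products.
        """
        state = np.zeros((dim, dim))
        for f, r, a in bindings:
            state += a * np.outer(fillers[f], roles[r])
        return state

    # Fully discrete structure: A in pos1, B in pos2 (all activations = 1).
    discrete = encode([("A", "pos1", 1.0), ("B", "pos2", 1.0)])

    # Gradient structure: pos2 holds a 0.7/0.3 blend of the symbols B and C.
    gradient = encode([("A", "pos1", 1.0), ("B", "pos2", 0.7), ("C", "pos2", 0.3)])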

Lecture 2: Gradient symbols in grammatical competence

The use of gradient symbol structures in theories of grammatical competence will be illustrated by partially present constituents in the base positions of syntactic wh-movement, partially present [voice] features on final consonants in certain final-devoicing languages, and, most extensively, partially present consonants in the underlying forms of French words participating in liaison — consonants that disappear in contexts where fully present consonants remain. The liaison case illustrates how gradient versions of multiple distinct structures posited by competing theories can be blended to form an account that covers a range of data that no single structure can explain on its own.
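
As a deliberately simplified illustration of the liaison case (the activation values and the threshold rule below are hypothetical, not Smolensky's analysis), a liaison consonant can be modeled as underlyingly present at less than full activation, so that it is pronounced only when the following context contributes enough additional support:

    # Hypothetical sketch: a liaison consonant is only partially present
    # underlyingly (activation < 1) and is pronounced only when the context
    # (e.g., a following vowel-initial word) adds enough support to push its
    # total activation over a threshold. Numbers are illustrative only.

    THRESHOLD = 1.0

    def surfaces(underlying_activation, contextual_support):
        """Return True if the consonant is pronounced."""
        return underlying_activation + contextual_support >= THRESHOLD

    # A fully present final consonant is pronounced even without support.
    print(surfaces(1.0, 0.0))   # True

    # A gradient liaison consonant (activation 0.5) surfaces only with
    # support from a following vowel-initial word ("petit ami"), and is
    # silent otherwise ("petit chat").
    print(surfaces(0.5, 0.6))   # True  -> consonant surfaces
    print(surfaces(0.5, 0.0))   # False -> consonant is silent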