
Baggett Lectures: Paul Smolensky (Johns Hopkins)

Time: Wednesday, November 16, 2016 - 3:30 PM
Location: Maryland Room, Marie Mount Hall

This is the first in a series of three lectures by distinguished phonologist Paul Smolensky, Krieger-Eisenhower Professor of Cognitive Science at Johns Hopkins University, generously supported by Dave Baggett.

Overview of the lectures

A fundamental task of cognitive science is reconciling (i) the discrete, categorical character of mental states and knowledge — e.g., symbolic expressions governed by symbolic rules of grammar or logic — with (ii) the continuous, gradient character of neural states and processes. This year’s Baggett Lectures will present an approach to unifying discrete symbolic and continuous neural computation: Gradient Symbolic Computation (GSC). This unification leads to new grammatical theories as well as novel neural network architectures that realize these theories. The importance of reconciling symbolic and neural network computation now extends beyond basic science into applied Natural Language Processing, where the best-performing systems utilize neural networks, but it is not currently known how to construct networks that enable rapid instruction, human understanding of internal knowledge, and competence in a diversity of tasks — all properties that are characteristic of symbolic systems.

Lecture 1: Unifying discrete linguistic computation with continuous neural computation

GSC’s novel neural architecture — capable of encoding and processing symbol structures — will be presented, and the new grammatical theories that emerge from it will be described and illustrated: theories in which grammars are evaluators of well-formedness, and grammatical structures are those that are maximally well-formed, or optimal. Gradient Symbol Structures will also be defined: structures (such as phonological strings or syntactic trees) in which each location hosts a blend of symbols, each present (or active) to a continuously variable degree.