Paul Smolensky

Krieger-Eisenhower Professor

Krieger 241B
Away Spring and Fall 2017
410-516-5331
smolensky@jhu.edu
Curriculum Vitae
Google Scholar Profile

Postdoc positions available

Bio: My research (see Research tab) focuses on integrating symbolic and neural network computation for modeling reasoning and, especially, grammar in the human mind/brain. The work is formal and computational, with emerging applications to neuroscience and applied natural language processing. My research has primarily addressed issues of representation and processing rather than learning. Principal contributions (see Publications tab) are to linguistic theory, the theory of vectorial neural network computation, and the philosophical foundations of cognitive science.

I am currently on leave from Johns Hopkins, working at Microsoft Research in Redmond, Washington (for a non-technical synopsis of some of my recent work there, see this link: Mind/Brain Networks). Prior to joining the faculty of the Cognitive Science Department at Johns Hopkins, I was a professor in the Computer Science Department and Institute of Cognitive Science at the University of Colorado Boulder. Before that, I was a postdoc at the Center for Cognitive Science at the University of California at San Diego, where I was a founding member of the Parallel Distributed Processing Research Group and worked with Dave Rumelhart, James McClelland, and Geoff Hinton. (I also contributed to the User-Centered System Design group led by Don Norman.) My degrees are an A.B. in Physics from Harvard and, from Indiana University, Bloomington, an M.S. in Physics and a Ph.D. in Mathematical Physics. (CV)

Goal

Unification of the sciences of mind & brain through integration of

  • compositional, structured, symbolic computation
    • at the core of many successful classical theories of the mind
      • in particular, the theory of language
    • a branch of discrete mathematics
  • dynamic, distributed, vectorial connectionist computation
    • at the core of the theory of neural networks, crucial for
      • computational models of the brain
      • emergentist models of the mind
      • contemporary machine learning and Artificial Intelligence
    • a branch of continuous mathematics

Current

The theory, and application to language, of Gradient Symbolic Computation, a new cognitive architecture in which a single computational system can simultaneously be described formally at two levels (minimal illustrative sketches of both levels follow this outline):

  • a higher ‘abstract mental’ level, where
    • data
      • consist of symbols that have partial degrees of presence — gradient activity levels
      • which blend together to form Gradient Symbol Structures (such as gradient trees)
    • processing
      • consists of algebraic operations on vectors and tensors
  • a lower ‘abstract neural’ level, where
    • data
      • consist of distributed activation vectors over many model neurons
      • which superimpose to implement Gradient Symbol Structures…
    • processing
      • consists of probabilistic spreading of activation (governed by stochastic differential equations)
      • through networks with numerically weighted interconnections
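
As a minimal sketch of the higher level, the following Python fragment encodes a Gradient Symbol Structure using tensor product representations: each symbol (filler) and each structural position (role) gets a vector, a filler/role binding is their outer product, and a structure is the weighted sum of its bindings. The vector dimension, the A/B filler inventory, the left/right roles, and the 0.3/0.7 blend are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Distributed vectors for fillers (symbols); the values are arbitrary.
fillers = {s: rng.normal(size=8) for s in ("A", "B")}

# Orthonormal role vectors (structural positions), so unbinding is exact.
q = np.linalg.qr(rng.normal(size=(8, 2)))[0]
roles = {"left": q[:, 0], "right": q[:, 1]}

def bind(f, r, activity=1.0):
    """Bind filler f to role r at a gradient activity level (outer product)."""
    return activity * np.outer(fillers[f], roles[r])

# A Gradient Symbol Structure: 'A' fully present in the left position;
# the right position holds a blend of 'A' (activity 0.3) and 'B' (0.7).
structure = bind("A", "left") + bind("A", "right", 0.3) + bind("B", "right", 0.7)

def unbind(structure, r):
    """Recover the (gradient) filler occupying role r."""
    return structure @ roles[r]

# The filler recovered from the right position is the blend 0.3*A + 0.7*B.
blend = 0.3 * fillers["A"] + 0.7 * fillers["B"]
assert np.allclose(unbind(structure, "right"), blend)
```

Gradient trees arise in the same way, with recursive role vectors (e.g., tensor products of child-position roles), so whole trees with partially present constituents live in a single vector space.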
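A correspondingly minimal sketch of the lower level: activation spreading as noisy ascent of a Harmony function, discretized as an Euler-Maruyama step of a stochastic differential equation. The network size, weight scaling, quartic bounding term, and cooling schedule here are illustrative assumptions, not taken from the published models.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16

# Symmetric weights, so the network has a Harmony function
# H(a) = a.W.a/2 - sum(a**4)/4 (the quartic term, an illustrative
# choice, keeps activations bounded).
W = rng.normal(size=(n, n)) / np.sqrt(n)
W = (W + W.T) / 2

a = rng.normal(size=n)   # initial activation state
dt, T = 0.01, 0.5        # step size and noise temperature (illustrative)

for _ in range(2000):
    grad = W @ a - a**3                       # gradient of Harmony
    noise = rng.normal(size=n) * np.sqrt(2 * T * dt)
    a = a + dt * grad + noise                 # Euler-Maruyama update
    T *= 0.999                                # cool the noise so the state settles

print(np.round(a, 2))
```

As the temperature falls, the state is driven toward a discrete, fully symbolic structure; at higher temperatures it remains a gradient blend (the optimization/quantization trade-off discussed in the 2014 Cognitive Science paper below).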

Publications

Displaying the 20 most recent publications. View the Google Scholar Profile for the complete publications list.

G Legendre, P Smolensky
A competition-based analysis of French anticausatives
Lingvisticæ Investigationes 40 (1), 25-42, 2017

Q Huang, P Smolensky, X He, L Deng, D Wu
A Neural-Symbolic Approach to Natural Language Tasks
arXiv preprint arXiv:1710.11475, 2017

Q Huang, P Smolensky, X He, L Deng, D Wu
Tensor Product Generation Networks
arXiv preprint arXiv:1709.09118, 2017

H Palangi, P Smolensky, X He, L Deng
Deep Learning of Grammatically-Interpretable Representations Through Question-Answering
arXiv preprint arXiv:1705.08432, 2017

C Manning, P Smolensky
Integrating Symbolic and Neural Computation
Annual Review of Linguistics 4 (1), 2017

MT Putnam, G Legendre, P Smolensky
How constrained is language mixing in bi- and uni-modal production?
Linguistic Approaches to Bilingualism 6 (6), 812-816, 2017

PW Cho, M Goldrick, P Smolensky
Incremental parsing in a continuous dynamical system: sentence processing in Gradient Symbolic Computation
Linguistics Vanguard 3 (1), 2017

X He, L Deng, J Gao, W Yih, M Lee, P Smolensky
Computational-model operation using multiple subject representations
US Patent App. 15/084,366, 2016

P Smolensky, M Lee, X He, W Yih, J Gao, L Deng
Basic reasoning with tensor product representations
arXiv preprint arXiv:1601.02745, 2016

P Smolensky, M Goldrick
Gradient symbolic representations in grammar: The case of French liaison
Ms. Johns Hopkins University and Northwestern University, 2016

G Legendre, P Smolensky, J Culbertson
Blocking effects at the lexicon/semantics interface and bi-directional optimization in French
Optimality-theoretic syntax, semantics, and pragmatics: From uni- to …, 2016

PW Cho, P Smolensky
Bifurcation analysis of a Gradient Symbolic Computation model of incremental processing
Proceedings of the 38th Annual Conference of the Cognitive Science Society …, 2016

M Lee, X He, W Yih, J Gao, L Deng, P Smolensky
Reasoning in vector space: An exploratory study of question answering
arXiv preprint arXiv:1511.06426, 2015

P Smolensky, M Goldrick, D Mathis
Optimization and quantization in gradient symbol systems: a framework for integrating the continuous and the discrete in cognition
Cognitive Science 38 (6), 1102-1138, 2014

J Culbertson, P Smolensky, C Wilson
Cognitive biases, linguistic universals, and constraint‐based grammar learning
Topics in cognitive science 5 (3), 392-424, 2013

J Culbertson, P Smolensky
A Bayesian Model of Biases in Artificial Language Learning: The Case of a Word‐Order Universal
Cognitive science 36 (8), 1468-1498, 2012

P Smolensky
Symbolic functions from neural computation
Phil. Trans. R. Soc. A 370 (1971), 3543-3569, 2012

J Culbertson, P Smolensky, G Legendre
Learning biases predict a word order universal
Cognition 122 (3), 306-329, 2012

J Culbertson, P Smolensky, G Legendre
Statistical learning constrained by syntactic biases in an artificial language learning task
Proceedings of the 36th Annual Boston University Conference on Language …, 2012

P Smolensky
Subsymbolic computation theory for the human intuitive processor
How the World Computes, 675-685, 2012

PhD Students

Below is a list of my primary and secondary PhD student advisees since 1995; each entry gives the advisee's name, current position, and dissertation title with graduating year. To view a complete list of my department's PhD alumni, please visit our Alumni Placement webpage.

Last edited 8/2016.

Primary Advisor

Deepti Ramadoss (co-advisor: L. Burzio)
Sr Data & Communications Specialist, Univ of Pittsburgh
The phonology and phonetics of tone perception. 2011.

Jennifer Culbertson (co-advisor: G. Legendre)
Chancellor's Fellow, Dept of Linguistics & English Language, Univ of Edinburgh
Learning biases, regularization, and the emergence of typological universals in syntax. 2010.

Rebecca Morley
Asst Prof, Dept of Linguistics, Ohio State Univ
Generalization, Lexical Statistics, and Typologically Rare Systems. 2008.

Sara Finley
Asst Prof, Dept of Psychology, Pacific Lutheran Univ
Formal and Cognitive Restrictions on Vowel Harmony. 2008.

Gaja Jarosz
Assoc Prof, Dept of Linguistics, Univ of Massachusetts Amherst
Rich Lexicons and Restrictive Grammars – Maximum Likelihood Learning in Optimality Theory. 2006.

Adam Buchwald (co-advisor: B. Rapp)
Assoc Prof, Dept of Communicative Sciences & Disorders, NYU
Sound structure representation, repair and well-formedness: Grammar in spoken language production. 2005.

Lisa Davidson
Assoc Prof, Dept of Linguistics, NYU
The atoms of phonological representation: Gestures, coordination and perceptual features in consonant cluster phonotactics. 2003.

John Hale
Assoc Prof, Dept of Linguistics, Cornell
Grammar, uncertainty, and sentence processing. 2003.

Bruce Tesar
Professor, Dept of Linguistics, Rutgers
Computational Optimality Theory. 1995. (Computer Science, Univ of Colorado.)

Secondary Advisor

Tamara Nicol Medina (primary advisor: B. Landau)
Asst Prof - Teaching, Dept of Psychology, Univ of Delaware
Learning which verbs allow object omission: Verb semantic selectivity and the implicit object construction. 2007.

Matthew Goldrick (primary advisor: B. Rapp)
Assoc Prof, Dept of Linguistics, Northwestern
Patterns in sound, patterns in mind: Phonological regularities in speech production. 2002.

Colin Wilson (primary advisor: L. Burzio)
Assoc Prof, Dept of Cognitive Science, JHU
Targeted Constraints: An Approach to Positional Neutralization in Optimality Theory. 2000.

Adamantios Gafos (primary advisor: L. Burzio)
Professor, Dept of Linguistics, Potsdam Univ
The Articulatory Basis of Locality in Phonology. 1996.

Presentations

Grammatical theory with Gradient Symbol Structures
January 12, 2016, Budapest; Research Institute for Linguistics, Hungarian Academy of Sciences

Four facts about Tensor Product Representations
December 12, 2015, Montreal; NIPS workshop Cognitive Computation: Integrating Neural and Symbolic Approaches

Gradient Symbols in Grammar
October 26, 2015; Mind, Technology and Society Talk Series, Cognitive and Information Sciences Department, University of California, Merced

Towards Understandable Neural Networks for High Level AI Tasks (short course on Tensor Product Representations)
Fall 2015; Microsoft Research Talks

Does the success of deep neural network language processing mean — finally — the end of theoretical linguistics?
July 31, 2015, Beijing; Invited talk, with Jennifer Culbertson. CoNLL (Conference on Computational Natural Language Learning; SIGNLL of ACL)

Symbolic roles in vectorial computation
July 14, 2014, Redmond WA; Microsoft Research Faculty Summit panel, Deep Learning for Text Processing