
The Harmonic Mind (Vol II)
January 2011, MIT Press
Role: co-author

Géraldine Legendre, co-author
Paul Smolensky
Krieger-Eisenhower Professor
Krieger 241B
Away Spring and Fall 2017
410-516-5331
smolensky@jhu.edu
Curriculum Vitae
Google Scholar Profile
Bio: My research (see Research tab) focuses on integrating symbolic and neural network computation for modeling reasoning and, especially, grammar in the human mind/brain. The work is formal and computational, with emerging applications to neuroscience and applied natural language processing. My research has primarily addressed issues of representation and processing rather than learning. Principal contributions (see Publications tab) are to linguistic theory, the theory of vectorial neural network computation, and the philosophical foundations of cognitive science.
I am currently on leave from Johns Hopkins, working at Microsoft Research in Redmond, Washington (for a non-technical synopsis of some of my recent work there, see this link: Mind/Brain Networks). Prior to joining the faculty of the Cognitive Science Department at Johns Hopkins, I was a professor in the Computer Science Department and Institute of Cognitive Science at the University of Colorado Boulder. Before that, I was a postdoc at the Center for Cognitive Science at the University of California at San Diego, where I was a founding member of the Parallel Distributed Processing Research Group and worked with Dave Rumelhart, James McClelland, and Geoff Hinton. (I also contributed to the User-Centered System Design group led by Don Norman.) My degrees are an A.B. in Physics from Harvard and, from Indiana University, Bloomington, an M.S. in Physics and a Ph.D. in Mathematical Physics. (CV)
Goal
Unification of the sciences of mind & brain through integration of
  compositional, structured, symbolic computation
    at the core of many successful classical theories of the mind
      in particular, the theory of language
    a branch of discrete mathematics
  dynamic, distributed, vectorial connectionist computation
    at the core of the theory of neural networks, crucial for
      computational models of the brain
      emergentist models of the mind
      contemporary machine learning and Artificial Intelligence
    a branch of continuous mathematics
Current
The theory, and application to language, of Gradient Symbolic Computation, a new cognitive architecture in which a single computational system can simultaneously be described formally at two levels:
  a higher ‘abstract mental’ level, where
    data
      consist of symbols that have partial degrees of presence — gradient activity levels
      which blend together to form Gradient Symbol Structures (such as gradient trees)
    processing
      is algebraic operations on vectors and tensors
  a lower ‘abstract neural’ level, where
    data
      consist of distributed activation vectors over many model neurons
      which superimpose to implement Gradient Symbol Structures…
    processing
      is probabilistic spreading of activation (governed by stochastic differential equations)
      through networks with numerically weighted interconnections
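The two-level picture above can be made concrete with a small sketch of tensor product representations, the vector encoding of symbol structures that Gradient Symbolic Computation builds on. Everything specific here (the symbols A and B, the two roles, the 2-dimensional vectors, the 0.8/0.2 activity levels) is an illustrative assumption, not the encoding of any particular published model:

```python
# Minimal sketch of a tensor product representation (TPR): fillers
# (symbols) bind to roles (positions) via the outer product, and the
# bindings superimpose by vector addition.
import numpy as np

# Filler vectors: distributed patterns standing for the symbols A and B.
fillers = {"A": np.array([1.0, 0.0]), "B": np.array([0.0, 1.0])}

# Role vectors: orthonormal patterns standing for structural positions.
roles = {"left": np.array([1.0, 0.0]), "right": np.array([0.0, 1.0])}

def bind(filler, role):
    """Bind a filler to a role with the outer (tensor) product."""
    return np.outer(filler, role)

def unbind(structure, role):
    """Recover the (possibly blended) filler occupying a role by
    contracting the structure with the role vector (roles orthonormal)."""
    return structure @ role

# A gradient symbol structure: the left position holds a *blend* of A
# (activity 0.8) and B (activity 0.2); the right position holds B fully.
structure = (0.8 * bind(fillers["A"], roles["left"])
             + 0.2 * bind(fillers["B"], roles["left"])
             + 1.0 * bind(fillers["B"], roles["right"]))

print(unbind(structure, roles["left"]))   # [0.8 0.2] -> 0.8*A + 0.2*B
print(unbind(structure, roles["right"]))  # [0. 1.]   -> pure B
```

Because bindings superimpose by addition, partial activity levels pass straight through to the recovered fillers: unbinding the left role returns the blend 0.8·A + 0.2·B rather than a single discrete symbol.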
Displaying the 20 most recent publications. View the Google Scholar Profile for the complete publication list.
G Legendre, P Smolensky
A competition-based analysis of French anticausatives
Lingvisticæ Investigationes 40 (1), 25-42, 2017
Q Huang, P Smolensky, X He, L Deng, D Wu
A Neural-Symbolic Approach to Natural Language Tasks
arXiv preprint arXiv:1710.11475, 2017
Q Huang, P Smolensky, X He, L Deng, D Wu
Tensor Product Generation Networks
arXiv preprint arXiv:1709.09118, 2017
H Palangi, P Smolensky, X He, L Deng
Deep Learning of Grammatically-Interpretable Representations Through Question-Answering
arXiv preprint arXiv:1705.08432, 2017
C Manning, P Smolensky
Integrating Symbolic and Neural Computation
Annual Review of Linguistics 4 (1), 2017
MT Putnam, G Legendre, P Smolensky
How constrained is language mixing in bi- and unimodal production?
Linguistic Approaches to Bilingualism 6 (6), 812-816, 2017
PW Cho, M Goldrick, P Smolensky
Incremental parsing in a continuous dynamical system: sentence processing in Gradient Symbolic Computation
Linguistics Vanguard 3 (1), 2017
X He, L Deng, J Gao, W Yih, M Lee, P Smolensky
Computationalmodel operation using multiple subject representations
US Patent App. 15/084,366, 2016
P Smolensky, M Lee, X He, W Yih, J Gao, L Deng
Basic reasoning with tensor product representations
arXiv preprint arXiv:1601.02745, 2016
P Smolensky, M Goldrick
Gradient symbolic representations in grammar: The case of French liaison
Ms. Johns Hopkins University and Northwestern University, 2016
G Legendre, P Smolensky, J Culbertson
Blocking effects at the lexicon/semantics interface and bidirectional optimization in French
Optimality-theoretic syntax, semantics, and pragmatics: From uni- to …, 2016
PW Cho, P Smolensky
Bifurcation analysis of a Gradient Symbolic Computation model of incremental processing
Proceedings of the 38th Annual Conference of the Cognitive Science Society …, 2016
M Lee, X He, W Yih, J Gao, L Deng, P Smolensky
Reasoning in vector space: An exploratory study of question answering
arXiv preprint arXiv:1511.06426, 2015
P Smolensky, M Goldrick, D Mathis
Optimization and quantization in gradient symbol systems: a framework for integrating the continuous and the discrete in cognition
Cognitive Science 38 (6), 1102-1138, 2014
J Culbertson, P Smolensky, C Wilson
Cognitive biases, linguistic universals, and constraint-based grammar learning
Topics in Cognitive Science 5 (3), 392-424, 2013
J Culbertson, P Smolensky
A Bayesian Model of Biases in Artificial Language Learning: The Case of a Word-Order Universal
Cognitive Science 36 (8), 1468-1498, 2012
P Smolensky
Symbolic functions from neural computation
Phil. Trans. R. Soc. A 370 (1971), 3543-3569, 2012
J Culbertson, P Smolensky, G Legendre
Learning biases predict a word order universal
Cognition 122 (3), 306-329, 2012
J Culbertson, P Smolensky, G Legendre
Statistical learning constrained by syntactic biases in an artificial language learning task
Proceedings of the 36th Annual Boston University Conference on Language …, 2012
P Smolensky
Subsymbolic computation theory for the human intuitive processor
How the World Computes, 675-685, 2012

The Harmonic Mind (Vol I)
January 2011, MIT Press
Role: co-author

Géraldine Legendre, co-author

Optimality Theory: Constraint Interaction in Generative Grammar
September 2004, Wiley-Blackwell
Role: co-author

Learnability in Optimality Theory
May 2000, MIT Press
Role: co-author

Mathematical Perspectives on Neural Networks
June 1996, Lawrence Erlbaum Publishers
Role: co-editor

Connectionism: Debates on Psychological Explanation (Vol 2)
May 1995, Blackwell Publishers
Role: contributor

Proceedings of the Connectionist Models Summer School 1993
November 1993, Lawrence Erlbaum Publishers
Role: co-editor

Il Connessionismo: Tra Simboli e Neuroni (Connectionism: Between Symbols and Neurons)
January 1992, Marietti/Cambridge University Press
Role: author
Below is a list of my primary and secondary PhD student advisees since 1995. To view a complete list of my department's PhD alumni, please visit our Alumni Placement webpage.
edited 8/2016
Primary Advisor
Name | Current Position | Dissertation. Graduating Year.

Deepti Ramadoss | Sr Data & Communications Specialist, Univ of Pittsburgh | The phonology and phonetics of tone perception. 2011.
Jennifer Culbertson | Chancellor's Fellow | Learning biases, regularization, and the emergence of typological universals in syntax. 2010.
| Asst Prof | Generalization, Lexical Statistics, and Typologically Rare Systems. 2008.
| Asst Prof | Formal and Cognitive Restrictions on Vowel Harmony. 2008.
| Assoc Prof | Rich Lexicons and Restrictive Grammars – Maximum Likelihood Learning in Optimality Theory. 2006.
Adam Buchwald | Assoc Prof | Sound structure representation, repair and well-formedness: Grammar in spoken language production. 2005.
| Assoc Prof | The atoms of phonological representation: Gestures, coordination and perceptual features in consonant cluster phonotactics. 2003.
| Assoc Prof | Grammar, uncertainty, and sentence processing. 2003.
| Professor | Computational Optimality Theory. 1995. Computer Science, Univ of Colorado.
Secondary Advisor
Name | Current Position | Dissertation. Graduating Year.

Tamara Nicol Medina | Asst Prof (Teaching) | Learning which verbs allow object omission: Verb semantic selectivity and the implicit object construction. 2007.
Matthew Goldrick | Assoc Prof | Patterns in sound, patterns in mind: Phonological regularities in speech production. 2002.
Colin Wilson | Assoc Prof | Targeted Constraints: An Approach to Positional Neutralization in Optimality Theory. 2000.
Adamantios Gafos | Professor | The Articulatory Basis of Locality in Phonology. 1996.
Grammatical theory with Gradient Symbol Structures
January 12, 2016, Budapest; Research Institute for Linguistics, Hungarian Academy of Sciences
Four facts about Tensor Product Representations
December 12, 2015, Montreal; NIPS workshop Cognitive Computation: Integrating Neural and Symbolic Approaches
Gradient Symbols in Grammar
October 26, 2015; Mind, Technology and Society Talk Series, Cognitive and Information Sciences Department, University of California, Merced
Towards Understandable Neural Networks for High Level AI Tasks (short course on Tensor Product Representations)
Fall, 2015; Microsoft Research Talks
Does the success of deep neural network language processing mean — finally — the end of theoretical linguistics?
July 31, 2015, Beijing; Invited talk, with Jennifer Culbertson. CoNLL (Conference on Computational Natural Language Learning; SIGNLL of ACL)
Symbolic roles in vectorial computation
July 14, 2014, Redmond WA; Microsoft Research Faculty Summit panel, Deep Learning for Text Processing