Article
Generative linguistics and neural networks at 60: foundation, friction, and fusion
Language (2019)
  • Joe Pater
Abstract
The birthdate of both generative linguistics and neural networks can be taken as 1957, the year of the publication of foundational work by both Noam Chomsky and Frank Rosenblatt. This paper traces the development of these two approaches to cognitive science, from their largely autonomous development over their first thirty years, through their collision in the 1980s around the past tense debate (Rumelhart and McClelland 1986, Pinker and Prince 1988), to their integration in much subsequent work up to the present. Although this integration has produced a considerable body of results, the continued gulf between these two lines of research is likely impeding progress in both: on learning in generative linguistics, and on the representation of language in neural modeling. The paper concludes with a brief argument that generative linguistics is unlikely to fulfill its promise of accounting for language learning if it continues to maintain its distance from neural and statistical approaches to learning.
Keywords
  • generative linguistics
  • connectionism
  • syntax
  • phonology
  • neural networks
  • deep learning
  • computational linguistics
Publication Date
March 2019
Citation Information
Joe Pater. "Generative linguistics and neural networks at 60: foundation, friction, and fusion" Language (2019)
Available at: http://works.bepress.com/joe_pater/35/