Contribution to Book
Learning Probabilities Over Underlying Representations
Proceedings of the Twelfth Meeting of the Special Interest Group on Computational Morphology and Phonology (SIGMORPHON2012), Montreal, Canada, June 7, 2012 (2012)
  • Joe Pater
  • Robert Staubs
  • Karen Jesney, University of Southern California
  • Brian Smith, University of Massachusetts Amherst
Abstract
We show that a class of cases that has previously been studied in terms of learning abstract phonological underlying representations (URs) can be handled by a learner that chooses URs from a contextually conditioned distribution over observed surface representations. We implement such a learner in a Maximum Entropy version of Optimality Theory, in which UR learning is an instance of semi-supervised learning. Our objective function incorporates a term aimed at ensuring generalization, which is independently required for phonotactic learning in Optimality Theory, and it does not have a bias toward single URs for morphemes. This learner is successful on a test language provided by Tesar (2006) as a challenge for UR learning. We also provide successful results on learning a toy case modeled on French vowel alternations, which have also been previously analyzed in terms of abstract URs. This case includes lexically conditioned variation, an aspect of the data that cannot be handled by abstract URs, showing that in this respect our approach is more general.
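To make the abstract's description concrete, the following is a minimal sketch, not the authors' implementation, of the kind of objective it describes: a Maximum Entropy OT grammar whose probability for an observed surface form marginalizes over a learned distribution on candidate URs, regularized by a Gaussian prior on constraint weights to encourage generalization. The function names, data structures, and the prior variance are illustrative assumptions.

```python
import numpy as np

def maxent_probs(violations, weights):
    """P(candidate | UR) in MaxEnt OT: exponentiated negative harmony, normalized."""
    harmonies = violations @ weights            # one harmony score per candidate
    scores = np.exp(-harmonies)
    return scores / scores.sum()

def neg_log_posterior(weights, ur_logits, tableaux, observed, sigma2=1.0):
    """Negative regularized log-likelihood for one morpheme in one context.

    tableaux[u] : violation matrix (candidates x constraints) for candidate UR u
    observed[u] : index of the observed surface candidate in tableau u
    ur_logits   : parameters of the distribution over candidate URs
    sigma2      : variance of the Gaussian prior (the generalization term)
    """
    p_ur = np.exp(ur_logits) / np.exp(ur_logits).sum()   # distribution over URs
    # Likelihood of the observed surface form, marginalizing over candidate URs.
    lik = sum(p_ur[u] * maxent_probs(tableaux[u], weights)[observed[u]]
              for u in range(len(tableaux)))
    prior = (weights ** 2).sum() / (2.0 * sigma2)         # Gaussian prior on weights
    return -np.log(lik) + prior
```

In a sketch like this, both the constraint weights and the UR distribution parameters would be optimized jointly (e.g., by gradient descent on the objective), which is one way to realize UR learning as semi-supervised learning: the surface forms are observed, the URs are latent.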
Publication Date
2012
Publisher
Association for Computational Linguistics
Citation Information
Joe Pater, Robert Staubs, Karen Jesney and Brian Smith. "Learning Probabilities Over Underlying Representations." Proceedings of the Twelfth Meeting of the Special Interest Group on Computational Morphology and Phonology (SIGMORPHON2012), Montreal, Canada, June 7, 2012 (2012), pp. 62-71.
Available at: http://works.bepress.com/joe_pater/17/