Contribution to Book
Learning Probabilities Over Underlying Representations
Proceedings of the Twelfth Meeting of the Special Interest Group on Computational Morphology and Phonology (SIGMORPHON2012), Montreal, Canada, June 7, 2012 (2012)
We show that a class of cases previously studied in terms of learning abstract phonological underlying representations (URs) can be handled by a learner that chooses URs from a contextually conditioned distribution over observed surface representations. We implement such a learner in a Maximum Entropy version of Optimality Theory, in which UR learning is an instance of semisupervised learning. Our objective function incorporates a term aimed at ensuring generalization, which is independently required for phonotactic learning in Optimality Theory, and does not have a bias toward single URs for morphemes. This learner succeeds on a test language provided by Tesar (2006) as a challenge for UR learning. We also provide successful results on learning a toy case modeled on French vowel alternations, which have also been previously analyzed in terms of abstract URs. This case includes lexically conditioned variation, an aspect of the data that cannot be handled by abstract URs, showing that in this respect our approach is more general.
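In a Maximum Entropy grammar of the kind the abstract refers to, each surface candidate receives a probability proportional to the exponential of its weighted constraint-violation score. The sketch below illustrates only that general candidate-probability computation, not the paper's actual learner or objective function; the constraint names, violation counts, and weights are invented for illustration.

```python
import math

def maxent_probs(candidates, weights):
    """Probability of each candidate under a Maximum Entropy grammar:
    P(c) proportional to exp(-sum_i w_i * violations_i(c))."""
    # Harmony of each candidate: negated weighted sum of violations.
    harmonies = [-sum(w * v for w, v in zip(weights, viols))
                 for _, viols in candidates]
    z = sum(math.exp(h) for h in harmonies)  # normalizing constant
    return {name: math.exp(h) / z
            for (name, _), h in zip(candidates, harmonies)}

# Hypothetical toy tableau: two surface candidates for one input,
# with violation counts for two constraints (all values illustrative).
candidates = [("pat", [0, 1]),   # one violation of the second constraint
              ("pad", [1, 0])]   # one violation of the first constraint
weights = [2.0, 0.5]             # illustrative constraint weights
probs = maxent_probs(candidates, weights)
```

With these weights, "pat" incurs the lighter penalty (harmony -0.5 vs. -2.0) and so receives the higher probability; learning in such a model amounts to adjusting the weights (and, in the paper's setting, the distribution over URs) to fit the observed surface forms.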
Publisher: Association for Computational Linguistics
Citation Information: Joe Pater, Robert Staubs, Karen Jesney, and Brian Smith. "Learning Probabilities Over Underlying Representations." Proceedings of the Twelfth Meeting of the Special Interest Group on Computational Morphology and Phonology (SIGMORPHON2012), Montreal, Canada, June 7, 2012 (2012), pp. 62-71
Available at: http://works.bepress.com/joe_pater/17/