Computational Linguistics | First and Second Language Acquisition | Syntax
linguistics, learnability, language acquisition
In this thesis, I propose a reconceptualization of the traditional syntactic parameter space of the Principles and Parameters framework (Chomsky, 1981). In lieu of binary parameter settings, parameter values exist on a gradient plane, where a learner's knowledge of their language is encoded in their confidence that a particular parametric target value, and thus the grammatical construction of an encountered sentence, is licensed by their target grammar. First, I discuss other learnability models in the classic parameter space, which lack psychological plausibility, theoretical consistency, or both. Then, I argue for the Gradient Parameter Space as an alternative to discrete binary parameters. Finally, I present findings from a preliminary implementation of a learner that operates in a gradient space, the Non-Defaults Learner (NDL). The findings suggest that the Gradient Parameter Space is a viable alternative to the traditional discrete binary parameter space, and that at least one learner in a gradient space, one that makes better use of the linguistic input available to it, is a viable alternative to default learners and classical triggering learners.
Howitt, Katherine, "Doing Away With Defaults: Motivation for a Gradient Parameter Space" (2020). CUNY Academic Works.