Dissertations, Theses, and Capstone Projects
Date of Degree
6-2020
Document Type
Thesis
Degree Name
M.A.
Program
Linguistics
Advisor
William Sakas
Subject Categories
Computational Linguistics | First and Second Language Acquisition | Syntax
Keywords
linguistics, learnability, language acquisition
Abstract
In this thesis, I propose a reconceptualization of the traditional syntactic parameter space of the principles and parameters framework (Chomsky, 1981). In lieu of binary parameter settings, parameter values exist on a gradient plane where a learner’s knowledge of their language is encoded in their confidence that a particular parametric target value, and thus the grammatical construction of an encountered sentence, is licensed by their target grammar. First, I discuss other learnability models in the classic parameter space, which lack psychological plausibility, theoretical consistency, or both. Then, I argue for the Gradient Parameter Space as an alternative to discrete binary parameters. Finally, I present findings from a preliminary implementation of a learner that operates in a gradient space, the Non-Defaults Learner (NDL). The findings suggest that the Gradient Parameter Space is a viable alternative to the traditional, discrete binary parameter space, and that at least one learner in a gradient space is a viable alternative to default learners and classical triggering learners, one that makes better use of the linguistic input available to the learner.
Recommended Citation
Howitt, Katherine, "Doing Away With Defaults: Motivation for a Gradient Parameter Space" (2020). CUNY Academic Works.
https://academicworks.cuny.edu/gc_etds/3796
Included in
Computational Linguistics Commons, First and Second Language Acquisition Commons, Syntax Commons