Dissertations, Theses, and Capstone Projects

Date of Degree

6-2023

Document Type

Dissertation

Degree Name

Ph.D.

Program

Linguistics

Advisor

Virginia Valian

Committee Members

Martin Chodorow

Kyle Gorman

Allyson Ettinger

Subject Categories

Artificial Intelligence and Robotics | Cognitive Psychology | Computational Linguistics | Morphology

Keywords

quasi-regularity, neural model, morphological inflection, grapheme-phoneme mapping

Abstract

Many aspects of language can be categorized as quasi-regular: the relationship between inputs and outputs is systematic but admits many exceptions. Common quasi-regular domains include morphological inflection and grapheme-phoneme mapping. How humans process quasi-regularity has been debated for decades. This thesis applies modern neural network models, specifically transformer models, to two quasi-regularity tasks, English past tense inflection and Chinese character naming, and investigates the extent to which the models' performance reflects human behavior. The results show that the transformers' performance resembles human behavior in many respects, including accuracy and answer variability. However, differences remain between the models' performance and human behavior: for example, humans are more likely to produce irregular forms for nonce English verbs and more likely to produce regular pinyin for unknown Chinese characters.
