Dissertations, Theses, and Capstone Projects
Date of Degree
9-2018
Document Type
Thesis
Degree Name
M.A.
Program
Linguistics
Advisor
William Sakas
Subject Categories
Computational Linguistics | Semantics and Pragmatics
Keywords
computational semantics, word vectors, semantic representation
Abstract
Semantic representation has a rich history spanning both complex linguistic theory and computational models. Though this history stretches back almost 50 years (Salton, 1971), the field has recently undergone an unexpected paradigm shift thanks to the work of Mikolov et al. (2013a, 2013b), which demonstrated that vector-space semantic models can capture large amounts of semantic information. To date, these semantic representations are computed at the word level, and finding a semantic representation of a phrase remains a much more difficult challenge. Mikolov et al. (2013a, 2013b) showed that their word vectors can be composed arithmetically to achieve reasonable representations of phrases, but this approach ignores syntactic information because the arithmetic composition functions (addition, elementwise multiplication, etc.) are commutative, causing the representations of the phrases "man bites dog" and "dog bites man" to be identical. This work introduces a way of computing word-level semantic representations alongside a parse-tree-based approach to composing those word vectors into a joint word-phrase semantic vector space. All associated code for this thesis was written in Python and can be found at https://github.com/liamge/Pytorch_ReNN.
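To make the commutativity problem concrete, here is a minimal Python sketch. It is not the thesis's implementation: the toy word vectors, the weight matrix W, and the compose function are illustrative assumptions, with compose following the general recursive-neural-network recipe of applying a nonlinearity to a linear map over concatenated child vectors.

import numpy as np

# Toy word vectors (random for illustration; real models learn these).
rng = np.random.default_rng(0)
vocab = {w: rng.standard_normal(4) for w in ["man", "bites", "dog"]}

def bag_of_vectors(words):
    # Additive composition: addition is commutative, so word order is lost.
    return sum(vocab[w] for w in words)

a = bag_of_vectors(["man", "bites", "dog"])
b = bag_of_vectors(["dog", "bites", "man"])
print(np.allclose(a, b))  # True: both sentences collapse to one vector

# Hypothetical recursive composition over a binary parse tree: each node's
# vector is a nonlinear function of its children's concatenation, so
# argument order matters and the two sentences come apart.
W = rng.standard_normal((4, 8))  # assumed composition weights

def compose(left, right):
    return np.tanh(W @ np.concatenate([left, right]))

# Right-branching parses: (man (bites dog)) vs. (dog (bites man))
a_tree = compose(vocab["man"], compose(vocab["bites"], vocab["dog"]))
b_tree = compose(vocab["dog"], compose(vocab["bites"], vocab["man"]))
print(np.allclose(a_tree, b_tree))  # False: word order is preserved

Because compose depends on the order and grouping of its arguments, the two sentences receive distinct vectors, which is the property the parse-tree-based approach exploits.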
Recommended Citation
Geron, Liam S., "Recursive Neural Networks for Semantic Sentence Representation" (2018). CUNY Academic Works.
https://academicworks.cuny.edu/gc_etds/2875
Code repository
https://github.com/liamge/Pytorch_ReNN