Date of Degree

6-2020

Document Type

Dissertation

Degree Name

Ph.D.

Program

Speech-Language-Hearing Sciences

Advisor

Richard Schwartz

Committee Members

Klara Marton

Michelle MacRoy-Higgins

Subject Categories

Speech Pathology and Audiology

Keywords

autism, visual perspective taking, mental rotation, language use, strategy use

Abstract

Individuals with autism spectrum disorders (ASD) have difficulty recognizing perspectives in their environment other than their own. Perspective-taking deficits affect language use and comprehension in discourse, as well as broader social and cognitive function. Despite extensive research on perspective-taking abilities in individuals with ASD, many contributing factors have not been fully examined. This study further examined the contributions of angular disparity, anthropomorphism of an observer, and language use to visual perspective taking.

Individuals with ASD demonstrate a strength in visual-spatial cognition and a weakness in visual social cognition. This study examined factors that may cause people with ASD to have difficulty taking another's visual perspective, including the angular disparity between the participant's and observer's perspectives and the anthropomorphic features of the observer. Participants included 15 children with autism spectrum disorders and 15 neurotypical children, who completed three experimental tasks. The first was a visual perspective taking task (VPT2), which examined the participants' ability to judge how a depicted observer perceived an object. In this task, the observer viewed the object from different angles, and some trials involved angular disparity between the participant's and the observer's viewpoints, producing opposing views of the same item. The anthropomorphic features of the observer were manipulated across three figures: a block figure, a cartoonish line drawing of a female, and a more naturalistic line drawing of a female. The second was a mental rotation task that required the participant to judge whether two three-dimensional figures were rotations or mirror images of one another. The third was a visual perspective taking language task, in which participants directed the examiner in completing an image under varying degrees of angular disparity between the participant's and examiner's viewpoints.

The VPT2 and mental rotation tasks were computerized and used eye tracking to record participants' fixations on images and eye-gaze patterns. Data analysis examined eye tracking, reaction time, and accuracy. Visual perspective taking reaction times were compared to standardized language scores and standardized nonverbal intelligence scores. For the third task, accuracy and type of language use were coded. Participants with ASD were less accurate on the visual perspective taking eye tracking task but performed with a similar degree of accuracy on the mental rotation and language tasks. In addition, participants with ASD fixated on the observer more than on the object when compared to neurotypical peers. All participants used similar language when directing another person on the visual perspective taking language task, which did not require looking at the other person. Based on these results, individuals with ASD appear to be less accurate and to use different strategies when completing VPT2 tasks, but they use similar language with a similar degree of accuracy when directing another person in a VPT2 task. This may be due to a variety of factors, such as the social qualities of the depicted observers, the need to take into account the depicted observer's location in space, and difficulty suppressing their own egocentric viewpoints.

Overall, participants with ASD demonstrated difficulty understanding the visual perspectives of depicted observers that was not attributable to mental rotation abilities. Although they demonstrated this difficulty, they were able to verbally direct another person in a VPT2 task as accurately as their neurotypical counterparts.
