Date of Award
Spring 5-2025
Document Type
Thesis
Degree Name
Master of Arts (MA)
Department/Program
Forensic Psychology
Language
English
First Advisor or Mentor
Mark R. Fondacaro
Second Reader
Philip Yanos
Third Advisor
Cynthia Calkins
Abstract
Gender bias is prevalent in personality disorder assessments, and while artificial intelligence has been posited as a solution to improve diagnostic objectivity and accuracy, the potential for such technologies to propagate human gender bias in mental health contexts remains underexplored. This study investigated the influence of gender bias on the diagnostic performance of ChatGPT-4o for personality disorders using three factorial research designs, which involved experimentally manipulating patient gender in a combined sample of 360 vignettes and case studies. Vignettes were synthesized through a novel artificial intelligence-assisted methodology established for this research, and case studies were identified from the literature. Significant gender bias was detected in the diagnoses formulated by ChatGPT-4o for prototypic vignettes, with women being underdiagnosed with antisocial personality disorder (ϕ = .38, p < .001) and overdiagnosed with borderline personality disorder (ϕ = .25, p = .002). When vignettes were ambiguous for two disorders, women were more likely than men to be diagnosed with borderline personality disorder (OR = 4.64, p = .005). For the case studies, no evidence of gender bias was found in depictions of antisocial or borderline personality disorder. Comparisons of vignettes and case studies revealed that diagnostic accuracy was superior for vignettes when patients were men (OR = 6.80, p = .015); however, this vignette accuracy advantage was reduced for women (OR = 0.10, p = .015), and diagnostic errors were greater for vignettes than for case studies when patients were women. These findings provide timely evidence on the potential of artificial intelligence to perpetuate bias in the context of mental health assessment, underscoring the need for further research on this topic. Recommendations are outlined for users of artificial intelligence, developers of these technologies, and researchers in the field.
Recommended Citation
Colclough, Zoe, "Exploring the Biasing Effects of Gender on Personality Disorder Diagnoses Formulated by Artificial Intelligence" (2025). CUNY Academic Works.
https://academicworks.cuny.edu/jj_etds/347

Comments
Supplemental materials can be accessed using the following link: https://www.dropbox.com/scl/fo/0kvyr114acvs2mn16g967/AJM7VNy5-igmPCVrq0PXBhs?rlkey=dvmoptclbv944sy6zqtzt0jb4&st=92wuk9wu&dl=0