Dissertations, Theses, and Capstone Projects

Date of Degree

9-2018

Document Type

Dissertation

Degree Name

Ph.D.

Program

Psychology

Advisor

Daniel Fienup

Committee Members

Alexandra Logue

Emily Jones

Kenneth Reeve

Jason Vladescu

Subject Categories

Applied Behavior Analysis

Keywords

Grading criterion, passing criteria, question type, question novelty

Abstract

Students in higher education perform better on exams when they complete frequent quizzes on the assigned reading material, but little research has investigated the different ways quizzes can be administered and how these variations affect quiz and exam performance. One variable that influences how quizzes are administered is the grading criterion. Under a standard practice grading criterion, a student receives a quiz score based on the number of correctly answered questions. Under a passing criterion, a student must obtain a certain score to earn full credit for the quiz. Previous research has found that students, particularly those who are at risk of failing, do significantly better on exams under a low-passing criterion than under a higher-passing criterion. Currently, no research compares the effects on exam scores of quizzes with a standard practice criterion versus quizzes with a passing criterion. The present study compared a low-passing criterion and a standard practice criterion for quizzes and their effects on exam scores. Furthermore, we manipulated the type of exam question and whether the question was replicated from a previous quiz, across both the low-passing criterion and standard practice conditions. This study replicated previous research demonstrating that students performed better on low-passing criterion exam questions. Additionally, students performed better on comprehension questions and on replicated questions. Future research should conduct a parametric analysis of passing criteria to determine the optimal criterion for exam performance.
