Year of Award
2024
Document Type
Thesis
Degree Type
Master of Arts (MA)
Degree Name
Experimental Psychology
Department or School/College
Department of Psychology
Committee Chair
Rachel L. Severson
Committee Members
Daniel Denis, Michael Kavanaugh
Subject Categories
Child Psychology | Developmental Psychology | Social Psychology | Theory and Philosophy
Abstract
A prevailing theory in moral psychology, known as the Theory of Dyadic Morality, proposes that we evaluate moral transgressions primarily by weighing the mental states and intentions of both the perpetrator and the victim of a given transgression. Previous research has found that adults and children similarly refer to mental states and intentions when making moral evaluations as third-party observers of a moral transgression. How might this framework apply to social robots, which arguably lack the mental states and intentions we can effortlessly identify in humans? The present study examined how children judge a social robot as culpable for its actions after it physically harms a human partner, when the robot’s source of movement is manipulated. Participants (N = 48; 3- and 5-year-olds) were randomly assigned to a robot condition (autonomous or remote-controlled). Participants first viewed a video depicting a robotic puppy engaging in a series of behaviors during an interaction with a human partner. Depending on condition, the robot’s behaviors appeared to be either self-initiated (autonomous) or produced by a handheld remote control operated by the robot’s owner (remote-controlled). After viewing the robot’s behavioral repertoire, participants responded to randomly presented questions indicating their attributions of psychological, perceptual, social, and moral characteristics to robot dogs and to humans more generally. Next, participants viewed a second video in which the human partner was bitten by the robotic puppy following a brief period of play. Following this second video, participants answered a fixed set of questions addressing intentionality, acceptability, culpability, and punishment in response to the moral transgression depicted in the video. Results indicated minimal differences between conditions with respect to attributions and culpability judgments; however, 3-year-olds were more likely than 5-year-olds to attribute psychological and social traits to the robot. Children’s perceptual attributions to robots contributed significantly to their judgments of who was ultimately culpable for harming the human partner.
Recommended Citation
Macko, Thomas M., "A New Frontier of Responsibility: Children's Attributions of Moral Responsibility to Autonomous and Controlled Robots" (2024). Graduate Student Theses, Dissertations, & Professional Papers. 12424.
https://scholarworks.umt.edu/etd/12424
© Copyright 2024 Thomas M. Macko