Abstract: Zoomorphic robots could serve as accessible and practical alternatives for users unable or unwilling to keep pets. However, their affective interactions are often simplistic and short-lived, limiting their potential for domestic adoption. To facilitate more dynamic and nuanced affective interactions and relationships between users and zoomorphic robots, we present AZRA, a novel augmented reality (AR) framework that extends the affective capabilities of these robots without physical modification. To demonstrate AZRA, we augment a zoomorphic robot, Petit Qoobo, with novel emotional displays (face, light, sound, thought bubbles) and interaction modalities (voice, touch, proximity, gaze). Additionally, AZRA features a computational model of emotion that calculates the robot's emotional responses, daily moods, evolving personality and needs. We highlight how AZRA can be used for rapid participatory prototyping and for enhancing existing robots, then discuss implications for future zoomorphic robot development.
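The abstract does not specify the internals of AZRA's computational model of emotion; as a purely illustrative sketch, the following Python snippet shows one common way such components can fit together, with an event's valence filtered through a slowly decaying daily mood and a personality-dependent baseline. All class names, parameters, and values here are assumptions for illustration, not AZRA's actual implementation.

```python
import dataclasses


@dataclasses.dataclass
class EmotionModel:
    """Toy valence model: a personality baseline biases a slowly decaying mood."""
    mood: float = 0.0         # current daily mood in [-1, 1]
    baseline: float = 0.0     # personality-dependent resting point of the mood
    decay: float = 0.1        # how quickly mood returns toward the baseline
    sensitivity: float = 0.5  # how strongly events shift the mood

    def react(self, event_valence: float) -> float:
        """Return an immediate emotional response and update the mood."""
        response = max(-1.0, min(1.0, event_valence + self.mood))
        # Mood drifts toward the event, then relaxes back toward the baseline.
        self.mood += self.sensitivity * (event_valence - self.mood)
        self.mood += self.decay * (self.baseline - self.mood)
        return response


robot = EmotionModel(baseline=0.2)  # a mildly cheerful personality
print(robot.react(+0.8))            # petting: strongly positive response
print(robot.react(-0.5))            # loud noise: response dampened by good mood
```

In a framework like the one described, the returned response value could drive the robot's face, light, and sound displays, while the persistent mood and baseline would account for daily moods and evolving personality.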
Abstract: Zoomorphic robots have the potential to offer companionship and well-being benefits as accessible, low-maintenance alternatives to pet ownership. Many such robots, however, feature limited emotional expression, restricting their potential for rich affective relationships with everyday domestic users. Additionally, exploring this design space through hardware prototyping is obstructed by physical and logistical constraints. We leveraged virtual reality (VR) rapid prototyping with passive haptic interaction to conduct a broad mixed-methods evaluation of emotion expression modalities and participatory prototyping of multimodal expressions. We found differences in recognisability, effectiveness and user empathy between modalities, while highlighting the importance of facial expressions and the benefits of combining animal-like and unambiguous modalities. We use our findings to inform promising directions for affective zoomorphic robot design and potential implementations via hardware modification or augmented reality, then discuss how VR prototyping makes this field more accessible to designers and researchers.
Abstract: Physical interactions with socially assistive robots (SARs) positively affect user wellbeing. However, haptic experiences when touching a SAR are typically limited to perceiving the robot's movements or shell texture, while other modalities that could enhance the touch experience with the robot, such as vibrotactile stimulation, remain under-explored. In this exploratory qualitative study, we investigate the potential of enhancing human interaction with the PARO robot through vibrotactile heartbeats, with the goal of regulating subjective wellbeing during stressful situations. We conducted in-depth one-on-one interviews with 30 participants, who watched three horror movie clips alone, with PARO, and with a PARO that displayed a vibrotactile heartbeat. Our findings show that PARO's presence and its interactive capabilities can help users regulate emotions through attentional redeployment from a stressor toward the robot. The vibrotactile heartbeat further reinforced PARO's physical and social presence, enhancing the socio-emotional support provided by the robot and its perceived life-likeness. We discuss the impact of individual differences in user experience and implications for the future design of life-like vibrotactile stimulation for SARs.
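The abstract does not describe how the vibrotactile heartbeat was generated; as an illustrative assumption only, a life-like "lub-dub" pattern is often approximated as two pulses of different strength per cardiac cycle. The sketch below encodes such a pattern in Python; the timings, amplitudes, and the `actuator.set_intensity` call are hypothetical placeholders, not PARO's actual hardware interface.

```python
import time


def heartbeat_pattern(bpm: float = 60.0):
    """Yield (amplitude, duration_s) segments approximating a 'lub-dub' beat.

    Timings and amplitudes are illustrative guesses, not measured values.
    """
    period = 60.0 / bpm
    lub, gap, dub = 0.12, 0.08, 0.10        # segment durations in seconds
    rest = max(period - (lub + gap + dub), 0.0)
    while True:
        yield (1.0, lub)   # stronger first pulse ("lub")
        yield (0.0, gap)   # short silent gap between pulses
        yield (0.7, dub)   # weaker second pulse ("dub")
        yield (0.0, rest)  # remainder of the cardiac cycle


def play(actuator, bpm: float, beats: int) -> None:
    """Drive a hypothetical vibration actuator with the heartbeat pattern."""
    segments = heartbeat_pattern(bpm)
    for _ in range(beats * 4):              # four segments per beat
        amplitude, duration = next(segments)
        actuator.set_intensity(amplitude)   # assumed actuator API
        time.sleep(duration)
    actuator.set_intensity(0.0)             # stop vibrating when done
```

A slower bpm value would correspond to a calmer, more reassuring heartbeat, which is one plausible design lever for the kind of life-like vibrotactile stimulation the study discusses.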