HR-ACT (Human–Robot Action) Database: Communicative and noncommunicative action videos featuring a human and a humanoid robot


Pekçetin T. N., Aşkın G., Evsen Ş., Karaduman T. D., Barinal B., Tunç J., et al.

Behavior Research Methods, vol. 58, no. 1, 2026 (SSCI, Scopus)

  • Publication Type: Article
  • Volume: 58 Issue: 1
  • Publication Date: 2026
  • DOI: 10.3758/s13428-025-02910-0
  • Journal Name: Behavior Research Methods
  • Journal Indexes: Social Sciences Citation Index (SSCI), Scopus, BIOSIS, MEDLINE, PsycINFO
  • Keywords: Action perception, Communicative actions, Human–robot interaction, Normative data, Social robotics
  • Middle East Technical University Affiliated: Yes

Abstract

We present the HR-ACT (Human–Robot Action) Database, a comprehensive collection of 80 standardized videos featuring matched communicative and noncommunicative actions performed by both a humanoid robot (Pepper) and a human actor. We describe the creation of 40 action exemplars per agent, each executed with matched manner, timing, and number of repetitions across the two agents. The database includes detailed normative data collected from 438 participants, providing metrics on action identification, confidence ratings, communicativeness ratings, meaning clusters, and H values (an entropy-based measure reflecting response homogeneity). We provide researchers with controlled yet naturalistic stimuli in multiple formats: videos, image frames, and raw animation files (.qanim). These materials support diverse research applications in human–robot interaction, cognitive psychology, and neuroscience. The database enables systematic investigation of action perception across human and robotic agents, while the inclusion of raw animation files allows researchers using Pepper robots to implement these actions for real-time experiments. The full set of stimuli, along with comprehensive normative data and documentation, is publicly available at https://osf.io/8vsxq/.
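The H value mentioned in the abstract is conventionally the Shannon-entropy name-agreement statistic (Snodgrass & Vanderwart, 1980): H = Σᵢ pᵢ log₂(1/pᵢ), where pᵢ is the proportion of participants producing response i, so H = 0 indicates perfect agreement and higher values indicate more heterogeneous responses. The sketch below is an illustrative computation of that standard statistic, not the database authors' exact pipeline (their clustering of free responses into meaning clusters precedes this step):

```python
from collections import Counter
from math import log2

def h_value(responses):
    """Entropy-based response-agreement statistic:
    H = sum_i p_i * log2(1/p_i), where p_i is the proportion of
    participants giving response (or meaning cluster) i.
    H = 0 means all participants agreed on one response."""
    counts = Counter(responses)
    n = len(responses)
    return sum((c / n) * log2(n / c) for c in counts.values())

# Perfect agreement: H = 0 bits
print(h_value(["waving"] * 10))                    # 0.0
# Two equally frequent responses: H = 1 bit
print(h_value(["waving"] * 5 + ["greeting"] * 5))  # 1.0
```

In practice, responses would first be normalized into meaning clusters (e.g., "wave" and "waving hello" counted as one cluster) before computing H, since raw free-text labels inflate apparent heterogeneity.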