HR-ACT (Human–Robot Action) Database: Communicative and noncommunicative action videos featuring a human and a humanoid robot


Pekçetin T. N., Aşkın G., Evsen Ş., Karaduman T. D., Barinal B., Tunç J., et al.

Behavior Research Methods, vol. 58, no. 1, 2026 (SSCI, Scopus)

  • Publication Type: Article / Full Article
  • Volume: 58 Issue: 1
  • Publication Date: 2026
  • DOI: 10.3758/s13428-025-02910-0
  • Journal Name: Behavior Research Methods
  • Journal Indexes: Social Sciences Citation Index (SSCI), Scopus, BIOSIS, MEDLINE, PsycINFO
  • Keywords: Action perception, Communicative actions, Human–robot interaction, Normative data, Social robotics
  • Middle East Technical University (Orta Doğu Teknik Üniversitesi) Affiliated: Yes

Abstract

We present the HR-ACT (Human–Robot Action) Database, a comprehensive collection of 80 standardized videos featuring matched communicative and noncommunicative actions performed by both a humanoid robot (Pepper) and a human actor. We describe the creation of 40 action exemplars per agent, with actions matched in manner of execution, timing, and number of repetitions. The database includes detailed normative data collected from 438 participants, providing metrics on action identification, confidence ratings, communicativeness ratings, meaning clusters, and H values (an entropy-based measure reflecting response homogeneity). We provide researchers with controlled yet naturalistic stimuli in multiple formats: videos, image frames, and raw animation files (.qanim). These materials support diverse research applications in human–robot interaction, cognitive psychology, and neuroscience. The database enables systematic investigation of action perception across human and robotic agents, while the inclusion of raw animation files allows researchers using Pepper robots to implement these actions for real-time experiments. The full set of stimuli, along with comprehensive normative data and documentation, is publicly available at https://osf.io/8vsxq/.
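The abstract describes H values as an entropy-based measure of response homogeneity. The sketch below illustrates one common formulation used in naming-agreement norms (Shannon entropy over the distribution of distinct responses, H = Σ pᵢ · log₂(1/pᵢ)); the exact computation used for the database may differ, and the function name and example responses are hypothetical.

```python
import math
from collections import Counter

def h_value(responses):
    """Entropy-based H value over action-identification responses (a sketch).

    H = sum_i p_i * log2(1 / p_i), where p_i is the proportion of
    participants producing the i-th distinct response. H = 0 means every
    participant gave the same label (maximal homogeneity); larger H values
    indicate more heterogeneous responses.
    """
    counts = Counter(responses)          # frequency of each distinct label
    n = len(responses)                   # total number of responses
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# Hypothetical example: 8 of 10 participants label a video "waving",
# 2 label it "greeting".
print(h_value(["waving"] * 8 + ["greeting"] * 2))  # ≈ 0.72 bits
```

Under this formulation, an action video identified with a single label by all raters yields H = 0, so lower H values correspond to higher agreement on what the action depicts.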