HPRNet: Hierarchical point regression for whole-body human pose estimation



SAMET N., AKBAŞ E.

Image and Vision Computing, vol. 115, 2021 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 115
  • Publication Date: 2021
  • DOI: 10.1016/j.imavis.2021.104285
  • Journal Name: Image and Vision Computing
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, Applied Science & Technology Source, Biotechnology Research Abstracts, Computer & Applied Sciences, INSPEC
  • Keywords: Whole-body human pose estimation, Multi-person pose estimation, Facial landmark detection, Bottom-up human pose estimation, Hand keypoint estimation
  • Middle East Technical University Affiliated: Yes

Abstract

In this paper, we present a new bottom-up, one-stage method for whole-body pose estimation, which we call “hierarchical point regression,” or HPRNet for short. In standard body pose estimation, the locations of ~17 major joints on the human body are estimated. In whole-body pose estimation, by contrast, the locations of fine-grained keypoints (68 on the face, 21 on each hand and 3 on each foot) are estimated as well, which creates a scale variance problem that needs to be addressed. To handle the scale variance among different body parts, we build a hierarchical point representation of body parts and jointly regress them. The relative locations of the fine-grained keypoints in each part (e.g. the face) are regressed with reference to the center of that part, whose location itself is estimated relative to the person center. In addition, unlike existing two-stage methods, our method predicts the whole-body pose in constant time, independent of the number of people in an image. On the COCO-WholeBody dataset, HPRNet significantly outperforms all previous bottom-up methods on keypoint detection for all whole-body parts (i.e. body, foot, face and hand); it also achieves state-of-the-art results on face (75.4 AP) and hand (50.4 AP) keypoint detection. Code and models are available at https://github.com/nerminsamet/HPRNet.git.
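The hierarchical regression idea above can be sketched in a few lines: a part center (e.g. the face center) is expressed as an offset from the person center, and each fine-grained keypoint as an offset from its part center, so absolute locations are recovered by summing along the hierarchy. This is a minimal illustrative sketch, not code from the HPRNet repository; all names and values are hypothetical.

```python
import numpy as np

def decode_hierarchical(person_center, part_offset, keypoint_offsets):
    """Resolve absolute keypoint locations from hierarchical offsets.

    person_center:    (2,) absolute (x, y) of the person center
    part_offset:      (2,) offset of the part center from the person center
    keypoint_offsets: (K, 2) offsets of K keypoints from the part center
    """
    # Part center is regressed relative to the person center.
    part_center = person_center + part_offset
    # Fine-grained keypoints are regressed relative to the part center;
    # broadcasting adds the (2,) center to each of the K offset rows.
    keypoints = part_center + keypoint_offsets
    return part_center, keypoints

# Hypothetical example: a face center 40 px above the person center,
# with two face keypoints close to the face center.
person = np.array([100.0, 200.0])
face_center, face_kps = decode_hierarchical(
    person,
    part_offset=np.array([0.0, -40.0]),
    keypoint_offsets=np.array([[-5.0, -3.0],
                               [5.0, -3.0]]),
)
print(face_center)  # [100. 160.]
print(face_kps)     # [[ 95. 157.] [105. 157.]]
```

Because each part's keypoints are predicted at the scale of that part rather than the whole person, small parts such as the face and hands are not drowned out by the much larger torso and limb offsets.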