3D Perceptual Soundfield Reconstruction via Virtual Microphone Synthesis


Erdem E., Cvetkovic Z., Hacihabiboglu H.

IEEE/ACM Transactions on Audio, Speech, and Language Processing, vol. 31, pp. 1305-1317, 2023 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 31
  • Publication Date: 2023
  • DOI Number: 10.1109/TASLP.2023.3260703
  • Journal Name: IEEE/ACM Transactions on Audio, Speech, and Language Processing
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Compendex, INSPEC, RILM Abstracts of Music Literature
  • Page Numbers: pp. 1305-1317
  • Keywords: Sound field reconstruction, sound field extrapolation, spherical harmonics, spherical microphone arrays, reproduction
  • Middle East Technical University Affiliated: Yes

Abstract

Perceptual soundfield reconstruction (PSR) is a multichannel audio recording and reproduction framework based on time-intensity panning in the horizontal plane. Practical limitations of PSR are that the optimal directivity patterns required by the system cannot be obtained trivially and precisely in practice, and that it is confined to the horizontal plane. This paper extends horizontal PSR to three dimensions and proposes a virtual microphone synthesis approach that obtains the PSR directivity patterns via sound field extrapolation. The proposed 3D extension and virtual microphone synthesis are evaluated using numerical simulations and a subjective localisation test. Comparisons with second-order Ambisonics rendering indicate that subjects localise sources rendered using 3D PSR more accurately and with higher certainty, particularly at an off-centre listening position for the low-channel-count reproduction system employed.
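
As an informal illustration of what synthesising a virtual microphone in the spherical-harmonic domain involves, the sketch below fits a truncated spherical-harmonic expansion to a hypothetical second-order target directivity pattern by least squares over a grid of directions. The target pattern, grid resolution, and expansion order are illustrative assumptions only; this is not the extrapolation-based method described in the paper.

```python
# Minimal sketch: approximate a target virtual-microphone directivity pattern
# with a truncated spherical-harmonic expansion fitted by least squares.
# The target pattern and sampling grid are hypothetical, chosen for illustration.
import numpy as np
from scipy.special import sph_harm

ORDER = 2  # second-order expansion, matching the second-order comparison in the paper

# Sample directions on the sphere: azimuth in [0, 2*pi), colatitude in [0, pi].
az = np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False)
col = np.linspace(0.0, np.pi, 36)
AZ, COL = np.meshgrid(az, col)
az_f, col_f = AZ.ravel(), COL.ravel()

# Hypothetical target: a second-order pattern steered towards azimuth 0, elevation 0,
# standing in for an "optimal" PSR directivity pattern.
cos_angle = np.sin(col_f) * np.cos(az_f)  # cosine of the angle from the +x axis
target = 0.25 + 0.5 * cos_angle + 0.25 * (2.0 * cos_angle**2 - 1.0)

# Matrix of complex spherical harmonics Y_n^m evaluated at every sampled direction.
Y = np.column_stack([
    sph_harm(m, n, az_f, col_f)
    for n in range(ORDER + 1)
    for m in range(-n, n + 1)
])

# Least-squares fit of the expansion coefficients to the target pattern.
coeffs, *_ = np.linalg.lstsq(Y, target.astype(complex), rcond=None)

# The synthesised directivity is the (real-valued) truncated expansion.
synthesised = np.real(Y @ coeffs)
print("max approximation error:", np.max(np.abs(synthesised - target)))
```

Because the hypothetical target here is itself of second order, the fit is essentially exact; patterns of higher directional order, or patterns extrapolated from array measurements as in the paper, would generally incur a truncation error that grows with the mismatch between the pattern and the chosen expansion order.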