Digital Signal Processing: A Review Journal, vol. 168, 2026 (SCI-Expanded, Scopus)
In this paper, we propose an algorithm for downlink (DL) channel covariance matrix (CCM) estimation in frequency division duplexing (FDD) massive multiple-input multiple-output (MIMO) communication systems whose base stations (BSs) are equipped with a uniform linear array (ULA) antenna structure. We consider a setting where the uplink (UL) CCM is mapped to the DL CCM by an interpolator function. We first present a theoretical error analysis of learning a nonlinear embedding by constructing an analytical mapping, which points to the importance of the Lipschitz regularity of the mapping for achieving high estimation performance. Then, building on this theoretical foundation, we propose a representation learning algorithm as a solution to the estimation problem, where Gaussian radial basis function (RBF) kernel interpolators are chosen to map UL CCMs to their DL counterparts. The proposed algorithm optimizes an objective function that fits a regression model between the DL CCM and UL CCM samples in the training dataset and preserves the local geometric structure of the data in the UL CCM space, while explicitly controlling the Lipschitz continuity of the mapping function in light of our theoretical findings. Simulation results show that the proposed algorithm surpasses benchmark methods with respect to three different error metrics.
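As an illustrative sketch, the core idea of a Gaussian RBF kernel interpolator mapping (vectorized) UL CCMs to DL CCMs can be realized via kernel ridge regression. The snippet below is a simplified, hypothetical implementation on toy data; function names, the ridge regularizer, and the kernel width `sigma` are assumptions for illustration, and it omits the paper's local-geometry-preservation and explicit Lipschitz-regularization terms.

```python
import numpy as np

def rbf_kernel(X, Y, sigma):
    # Gaussian RBF kernel matrix: K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))

def fit_rbf_interpolator(X_ul, Y_dl, sigma=1.0, ridge=1e-8):
    # Kernel ridge regression: solve (K + ridge*I) C = Y_dl for coefficients C.
    # The ridge term (an assumption here) keeps the solve well conditioned.
    K = rbf_kernel(X_ul, X_ul, sigma)
    C = np.linalg.solve(K + ridge * np.eye(len(X_ul)), Y_dl)
    # Interpolator: new UL features -> predicted DL features
    return lambda X_new: rbf_kernel(X_new, X_ul, sigma) @ C

# Toy usage with random stand-ins for vectorized UL/DL CCM features
rng = np.random.default_rng(0)
X_ul = rng.standard_normal((20, 4))   # 20 training UL feature vectors
Y_dl = rng.standard_normal((20, 6))   # corresponding DL feature vectors
interp = fit_rbf_interpolator(X_ul, Y_dl, sigma=1.0)
Y_pred = interp(X_ul)                 # near-interpolates the training set
```

Note that a small kernel width makes the kernel matrix well conditioned but can reduce generalization; in the paper's framework this trade-off is tied to the Lipschitz constant of the learned mapping, which the proposed objective regulates explicitly.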