© 2022 IEEE.

In this work, we study the problem of learning time-vertex dictionaries for the modeling and estimation of time-varying graph signals. We consider a setting with a collection of partially observed time-varying graph signals, and propose a solution for the estimation of the missing signal observations by learning time-vertex dictionaries from the available observations. We adopt a time-vertex dictionary model defined through a set of joint time-vertex spectral kernels, each of which captures a different spectral component of the signals in their joint time-vertex spectrum. The kernel parameters are optimized along with the representations of the signals so as to be coherent with the available signal observations. Experimental results show that the proposed method yields promising estimation performance in comparison with non-adaptive graph dictionary models and baseline classical graph regression methods.
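To make the notion of a joint time-vertex spectral kernel concrete, the following is a minimal sketch of how such a kernel acts on a time-varying graph signal: a graph Fourier transform over the vertex axis, a DFT over the time axis, and a pointwise kernel in the joint spectrum. The Gaussian kernel shape and its parameters (`mu`, `nu`, `s`) are illustrative assumptions, not the learned kernels of the paper.

```python
import numpy as np

# Path graph on N vertices: combinatorial Laplacian L = D - A.
N, T = 5, 8
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Graph Fourier basis: eigendecomposition of the symmetric Laplacian.
lam, U = np.linalg.eigh(L)

# Angular DFT frequencies for the time axis.
omega = 2 * np.pi * np.fft.fftfreq(T)

# Joint time-vertex spectrum of a random signal X (N x T):
# GFT over vertices, DFT over time.
rng = np.random.default_rng(0)
X = rng.standard_normal((N, T))
X_hat = np.fft.fft(U.T @ X, axis=1)

# An illustrative joint spectral kernel g(lambda, omega): a Gaussian
# centered at (mu, nu). In the paper, such kernel parameters are
# optimized from the observed data rather than fixed by hand.
mu, nu, s = 1.0, 0.0, 0.5
G = np.exp(-((lam[:, None] - mu) ** 2
             + (omega[None, :] - nu) ** 2) / (2 * s ** 2))

# Filter in the joint spectrum, then return to the vertex-time domain.
Y = U @ np.fft.ifft(G * X_hat, axis=1).real
print(Y.shape)  # (5, 8)
```

A dictionary in this model would stack several such kernels, each isolating a different region of the joint spectrum.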