Soil moisture datasets vary greatly with respect to their time series variability and signal-to-noise characteristics. Minimizing differences in signal variances is particularly important in data assimilation to optimize the accuracy of the analysis obtained after merging model and observation datasets. Strategies that reduce these differences are typically based on rescaling the observation time series to match the model. As a result, the impact of the relative accuracy of the model reference dataset is often neglected. In this study, the impacts of the relative accuracies of model- and observation-based soil moisture time series for seasonal and subseasonal (anomaly) components, respectively, on optimal model-observation integration are investigated. Experiments are performed using both well-controlled synthetic and real data test beds. Investigated experiments are based on rescaling observations to a model using strategies with decreasing aggressiveness: 1) using the seasonality of the model directly while matching the variance of the observed anomaly component, 2) rescaling the seasonality and the anomaly components separately, and 3) rescaling the entire time series as one piece or separately for each monthly climatology. All experiments use a simple antecedent precipitation index model and assimilate observations via a Kalman filtering approach. Synthetic and real data assimilation results demonstrate that rescaling observations more aggressively to the model is favorable when the model is more skillful than the observations; however, rescaling observations more aggressively to the model can degrade the Kalman filter analysis if the observations are relatively more accurate.
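The two operations at the core of the abstract, variance-matching rescaling of an observation series against a model reference and a Kalman filter analysis step, can be sketched as follows. This is a minimal illustrative example, not the study's implementation: the synthetic model and observation series, their noise levels, and the scalar (single-state) Kalman update are all assumptions made for demonstration.

```python
import numpy as np

# Hypothetical synthetic soil moisture series: a seasonal cycle plus noise.
# Amplitudes, means, and noise levels are illustrative only.
rng = np.random.default_rng(0)
t = np.arange(365)
model = 0.25 + 0.10 * np.sin(2 * np.pi * t / 365) + 0.02 * rng.standard_normal(t.size)
obs = 0.30 + 0.15 * np.sin(2 * np.pi * t / 365) + 0.05 * rng.standard_normal(t.size)

def rescale_to_reference(series, reference):
    """Match the mean and variance of `series` to those of `reference`.

    This is the basic variance-matching step; rescaling seasonality and
    anomaly components separately would apply it to each component.
    """
    return (series - series.mean()) / series.std() * reference.std() + reference.mean()

obs_rescaled = rescale_to_reference(obs, model)

def kalman_update(forecast, forecast_var, observation, obs_var):
    """Scalar Kalman filter analysis: weight forecast and observation
    by their relative error variances."""
    gain = forecast_var / (forecast_var + obs_var)
    analysis = forecast + gain * (observation - forecast)
    analysis_var = (1.0 - gain) * forecast_var
    return analysis, analysis_var

# Merge the rescaled observations with the model forecast.
analysis, analysis_var = kalman_update(model, 0.02**2, obs_rescaled, 0.05**2)
```

After rescaling, the observation series shares the model's mean and variance, so the Kalman gain alone controls how strongly the analysis is drawn toward the observations; the abstract's point is that this pull toward the model reference only helps when the model is in fact the more skillful dataset.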