Variable selection in linear-circular regression models


ÇAMLI O., KALAYLIOĞLU AKYILDIZ Z. I., SenGupta A.

JOURNAL OF APPLIED STATISTICS, 2022 (SCI-Expanded)

  • Publication Type: Article
  • Publication Date: 2022
  • DOI Number: 10.1080/02664763.2022.2110860
  • Journal Name: JOURNAL OF APPLIED STATISTICS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, ABI/INFORM, Aerospace Database, Business Source Elite, Business Source Premier, CAB Abstracts, Veterinary Science Database, zbMATH
  • Keywords: Regularization, Bayesian lasso, Laplace distribution, circular regression, dimension reduction, tuning parameter selection, Bayesian analysis, shrinkage, lasso, distributions
  • Middle East Technical University Affiliated: Yes

Abstract

Applications of circular regression models are ubiquitous in many disciplines, particularly in meteorology, biology and geology. In circular regression models, the variable selection problem remains a notable open question. In this paper, we address variable selection in linear-circular regression models, where a univariate linear dependent variable and a mixed set of circular and linear independent variables constitute the data set. We consider the Bayesian lasso, a popular choice for variable selection in classical linear regression models. We show that the Bayesian lasso in linear-circular regression models does not produce robust inference, as the coefficient estimates are sensitive to the choice of hyper-prior setting for the tuning parameter. To address this problem, we propose a robustified Bayesian lasso based on an empirical Bayes (EB) type methodology that constructs a hyper-prior for the tuning parameter within Gibbs sampling. This hyper-prior construction is computationally more feasible than hyper-priors based on correlation measures. We show in a comprehensive simulation study that the Bayesian lasso with the EB-GS hyper-prior leads to more robust inference. Overall, the method offers an efficient Bayesian lasso for variable selection in linear-circular regression while reducing model complexity.
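As background for the abstract, the sketch below gives the standard scale-mixture-of-normals hierarchy underlying the Bayesian lasso (Park and Casella's formulation); it is not the paper's exact linear-circular specification. The Gamma hyper-prior on the tuning parameter is the component whose fixed setting drives the sensitivity discussed above and is what the proposed EB-GS construction is designed to robustify. Representing each circular covariate through its cosine and sine components is likewise only one common convention, assumed here for illustration.

\begin{aligned}
y_i \mid \mathbf{x}_i, \boldsymbol{\beta}, \sigma^2 &\sim \mathcal{N}\!\left(\mathbf{x}_i^{\top}\boldsymbol{\beta},\, \sigma^2\right), \qquad \mathbf{x}_i = \bigl(z_{i1},\dots,z_{iq},\, \cos\theta_{i1}, \sin\theta_{i1}, \dots\bigr)^{\top},\\
\beta_j \mid \sigma^2, \tau_j^2 &\sim \mathcal{N}\!\left(0,\, \sigma^2 \tau_j^2\right), \qquad j = 1,\dots,p,\\
\tau_j^2 \mid \lambda &\sim \operatorname{Exp}\!\left(\lambda^2/2\right),\\
\lambda^2 &\sim \operatorname{Gamma}(a, b).
\end{aligned}

Marginalizing over the \tau_j^2 yields independent Laplace (double-exponential) priors on the \beta_j, which induces the lasso-type shrinkage, and all full conditionals are available in closed form, so the hierarchy is sampled by Gibbs. The sensitivity reported in the abstract concerns the fixed choice of (a, b); the EB-GS approach described above instead constructs the hyper-prior for the tuning parameter empirically during Gibbs sampling rather than fixing (a, b) in advance.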