Camli O, Kalaylioglu Z, SenGupta A. Variable selection in linear-circular regression models. J Appl Stat 2022;50:3337-3361. [PMID: 37969893; PMCID: PMC10637216; DOI: 10.1080/02664763.2022.2110860]
[Received: 11/11/2021] [Accepted: 08/01/2022]
Abstract
Applications of circular regression models are ubiquitous in many disciplines, particularly meteorology, biology and geology. In circular regression models, the variable selection problem remains a notable open question. In this paper, we address variable selection in linear-circular regression models, where the data consist of a univariate linear dependent variable and a mixed set of circular and linear independent variables. We consider the Bayesian lasso, a popular choice for variable selection in classical linear regression models. We show that the Bayesian lasso in linear-circular regression models does not produce robust inference, as the coefficient estimates are sensitive to the choice of hyper-prior for the tuning parameter. To remedy this problem, we propose a robustified Bayesian lasso based on an empirical Bayes (EB) methodology that constructs a hyper-prior for the tuning parameter within Gibbs sampling. This hyper-prior construction is computationally more feasible than hyper-priors based on correlation measures. We show in a comprehensive simulation study that the Bayesian lasso with the EB-GS hyper-prior leads to more robust inference. Overall, the method offers an efficient Bayesian lasso for variable selection in linear-circular regression while reducing model complexity.
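To illustrate the kind of machinery the abstract describes, the sketch below implements a standard Bayesian lasso Gibbs sampler for an ordinary linear model, with a Gamma hyper-prior on the tuning parameter lambda^2 (following the classical Park-and-Casella formulation). This is a generic illustration only: it covers neither the paper's linear-circular model (no circular covariates appear here) nor its specific empirical Bayes hyper-prior construction. All function names and the toy data are invented for the example.

```python
import numpy as np

# Generic Bayesian lasso via Gibbs sampling (Park & Casella style),
# with a Gamma(r, delta) hyper-prior on lambda^2.
# NOTE: an illustrative sketch of the standard linear-model case,
# not the paper's linear-circular model or its EB-GS hyper-prior.

def bayesian_lasso_gibbs(X, y, n_iter=2000, burn=500, r=1.0, delta=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    beta = np.zeros(p)
    tau2 = np.ones(p)        # per-coefficient local scale parameters
    sigma2, lam2 = 1.0, 1.0  # noise variance and (squared) tuning parameter
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 A^{-1}),  A = X'X + diag(1/tau2)
        A_inv = np.linalg.inv(XtX + np.diag(1.0 / tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # 1/tau2_j | rest ~ InverseGaussian(sqrt(lam2*sigma2/beta_j^2), lam2)
        mu = np.sqrt(lam2 * sigma2 / np.maximum(beta**2, 1e-12))
        tau2 = 1.0 / rng.wald(mu, lam2)
        # sigma2 | rest ~ Inv-Gamma((n-1+p)/2, ||y-Xb||^2/2 + b' D^{-1} b / 2)
        resid = y - X @ beta
        scale = 0.5 * (resid @ resid + np.sum(beta**2 / tau2))
        sigma2 = scale / rng.gamma(0.5 * (n - 1 + p))  # Inv-Gamma via 1/Gamma
        # lam2 | rest ~ Gamma(p + r, rate = sum(tau2)/2 + delta)
        lam2 = rng.gamma(p + r, 1.0 / (0.5 * np.sum(tau2) + delta))
        if it >= burn:
            draws.append(beta)
    return np.array(draws)

# Toy data: only two of five predictors are active.
rng = np.random.default_rng(42)
n, p = 100, 5
X = rng.standard_normal((n, p))
beta_true = np.array([3.0, 0.0, 0.0, 1.5, 0.0])
y = X @ beta_true + rng.standard_normal(n)

post = bayesian_lasso_gibbs(X, y)
print(np.round(post.mean(axis=0), 2))  # posterior means shrink the inactive coefficients toward 0
```

Placing a hyper-prior on lambda^2 (rather than fixing it) lets the data inform the amount of shrinkage; the paper's point is that in the linear-circular setting the resulting inference is sensitive to that hyper-prior's setting, motivating their EB-based construction.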