Zhu C, Ji X, Chen C, Zhou R, Wei L, Zhang X. Improved linear classifier model with Nyström. PLoS One 2018;13:e0206798. [PMID: 30395624; PMCID: PMC6218068; DOI: 10.1371/journal.pone.0206798]
Abstract
Most data sets consist of interleaved samples from multiple classes. Because such samples cannot be separated correctly by a linear hyperplane, we call these data sets nonlinearly separable and the corresponding classifiers nonlinear classifiers. Traditional nonlinear classifiers adopt kernel functions to generate kernel matrices and obtain optimal classifier parameters by solving these matrices, but computing and storing kernel matrices incurs high computational and space complexity. Since INMKMHKS adopts the Nyström approximation technique and NysCK transforms nonlinearly separable data into linearly separable data, both of which reduce these complexities, we combine their ideas to develop an improved NysCK (INysCK). Moreover, we extend INysCK to multi-view applications and propose a multi-view INysCK (MINysCK). Experiments validate the effectiveness of both methods in terms of accuracy, convergence, Rademacher complexity, etc.
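To make the complexity argument concrete, here is a minimal sketch of the Nyström low-rank kernel approximation the abstract refers to. It is not the paper's implementation: the RBF kernel, the landmark count `m`, and the function names are illustrative assumptions. The idea is that an n x n kernel matrix K is approximated as K ≈ C W⁻¹ Cᵀ from an n x m cross-kernel C and an m x m landmark kernel W, so a linear classifier trained on the feature map Z = C W^(-1/2) behaves approximately like a kernel classifier at O(n·m) cost instead of O(n²).

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF kernel matrix between the rows of A and the rows of B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def nystrom_features(X, m=100, gamma=1.0, seed=0):
    """Map X (n x d) to m-dimensional features Z such that Z @ Z.T
    approximates the full n x n kernel matrix, avoiding its storage."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(X.shape[0], size=m, replace=False)  # sample landmark points
    L = X[idx]
    W = rbf_kernel(L, L, gamma)   # m x m landmark kernel
    C = rbf_kernel(X, L, gamma)   # n x m cross kernel
    # Form W^(-1/2) via eigendecomposition; Z = C W^(-1/2) gives Z Z^T = C W^-1 C^T.
    vals, vecs = np.linalg.eigh(W)
    vals = np.maximum(vals, 1e-12)  # guard against tiny/negative eigenvalues
    return C @ vecs @ np.diag(vals**-0.5)

# Usage sketch: fit any linear model on Z in place of a kernel machine on K.
# X = np.random.randn(5000, 20); Z = nystrom_features(X, m=200)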