Joint self-representation and subspace learning for unsupervised feature selection

Ruili Wang, Ming Zong

Research output: Journal Publication › Article › peer-review

8 Citations (Scopus)

Abstract

This paper proposes a novel unsupervised feature selection method that jointly performs self-representation and subspace learning. In this method, we adopt the idea of self-representation and use all the features to represent each feature. Frobenius-norm regularization is used for feature selection because it can alleviate over-fitting. Locality Preserving Projection (LPP) is used as a regularization term because it preserves the local adjacency relations among the data during the feature-space transformation. Further, a low-rank constraint is introduced to uncover the effective low-dimensional structure of the data and reduce redundancy. Experimental results on real-world datasets verify that the proposed method selects the most discriminative features and outperforms state-of-the-art unsupervised feature selection methods in terms of classification accuracy, standard deviation, and coefficient of variation.
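The abstract only outlines the approach; the paper's full objective (with the LPP and low-rank terms) is not reproduced on this page. As a minimal sketch of the core idea alone, assuming the standard self-representation formulation min_W ||X - XW||_F^2 + lam ||W||_F^2 with features ranked by the row norms of W, one could score features as follows. The function and parameter names (self_representation_feature_scores, lam) are illustrative, not from the paper.

```python
import numpy as np

def self_representation_feature_scores(X, lam=1.0):
    """Score features via a simple self-representation model.

    Minimizes ||X - X W||_F^2 + lam * ||W||_F^2 over the d x d
    coefficient matrix W (ridge regression of each feature on all
    features), then scores feature i by the l2 norm of row i of W:
    features that contribute strongly to reconstructing the others
    receive higher scores.
    """
    n, d = X.shape
    G = X.T @ X                                   # d x d Gram matrix of the features
    W = np.linalg.solve(G + lam * np.eye(d), G)   # closed-form ridge solution
    return np.linalg.norm(W, axis=1)              # per-feature importance scores

# Usage: rank the features of a random data matrix (100 samples, 20 features).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
scores = self_representation_feature_scores(X, lam=0.1)
top_k = np.argsort(scores)[::-1][:5]              # indices of the 5 highest-scoring features
print(top_k)
```

This sketch omits the paper's LPP regularizer and low-rank constraint, which would be added as extra terms on W in the objective above.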

Original language: English
Pages (from-to): 1745-1758
Number of pages: 14
Journal: World Wide Web
Volume: 21
Issue number: 6
Publication status: Published - 1 Nov 2018
Externally published: Yes

Keywords

  • Self-representation
  • Subspace learning
  • Unsupervised feature selection

ASJC Scopus subject areas

  • Software
  • Hardware and Architecture
  • Computer Networks and Communications
