
Kernel principal subspace Mahalanobis distances for outlier detection

Title: Kernel principal subspace Mahalanobis distances for outlier detection
Publication Type: Conference Paper
Year of Publication: 2011
Authors: Li C, Georgiopoulos M, Anagnostopoulos GC
Conference Name: Neural Networks (IJCNN), The 2011 International Joint Conference on
Date Published: July
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Conference Location: San Jose, California, USA
Keywords: Image reconstruction, Kernel, kernel principal component analysis, Mahalanobis distance, Manifolds, orthogonal subspace complement, outlier detection, principal component analysis, reconstruction error, Single photon emission computed tomography, Support vector machines, Training
Abstract

Over the last few years, Kernel Principal Component Analysis (KPCA) has found several applications in outlier detection. A relatively recent method uses KPCA to compute the reconstruction error (RE) of previously unseen samples and, via thresholding, to identify atypical samples. In this paper, we propose an alternative method that performs the same task but instead considers Mahalanobis distances in the orthogonal complement of the subspace used to compute the reconstruction error. To illustrate its merits, we provide qualitative and quantitative results on both artificial and real datasets and show that, for several outlier detection tasks, it is competitive with, if not superior to, the original RE-based variant and the One-Class SVM detection approach.
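The following is a minimal sketch of the two kinds of KPCA-based outlier scores the abstract contrasts, not the authors' implementation: (a) the reconstruction error of a test point with respect to the leading q-dimensional principal subspace in feature space, and (b) a Mahalanobis-style distance over the complementary (non-retained) components, weighted by their inverse eigenvalues. The RBF kernel, the parameters q, gamma, and the regularizer eps are assumptions made for the example.

import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kpca_outlier_scores(X_train, X_test, q=5, gamma=0.1, eps=1e-9):
    """Illustrative KPCA outlier scores: reconstruction error (RE) and a
    Mahalanobis-style distance on the orthogonal-complement components."""
    n = X_train.shape[0]
    K = rbf_kernel(X_train, X_train, gamma=gamma)

    # Center the training Gram matrix in feature space.
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

    # Eigendecomposition, sorted by descending eigenvalue.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = np.clip(eigvals[order], 0.0, None), eigvecs[:, order]

    # Keep numerically nonzero components; scale so feature-space directions
    # have unit norm.
    keep = eigvals > eps
    alphas = eigvecs[:, keep] / np.sqrt(eigvals[keep])
    lambdas = eigvals[keep] / n  # per-sample variance along each component

    # Center the test kernel rows consistently with the training centering.
    K_test = rbf_kernel(X_test, X_train, gamma=gamma)
    m = X_test.shape[0]
    one_mn = np.full((m, n), 1.0 / n)
    K_test_c = K_test - one_mn @ K - K_test @ one_n + one_mn @ K @ one_n

    # Projections of each test point onto all retained components.
    proj = K_test_c @ alphas                       # shape (m, n_kept)

    # (a) Reconstruction error: squared feature-space norm of the centered
    # test image minus the energy captured by the leading q components.
    k_xx = np.ones(m)                              # RBF kernel: k(x, x) = 1
    total_sq = k_xx - 2.0 * K_test.mean(axis=1) + K.mean()
    re_score = total_sq - (proj[:, :q] ** 2).sum(axis=1)

    # (b) Mahalanobis-style score over the complementary components: residual
    # projections weighted by the inverse of their (regularized) variances.
    resid = proj[:, q:]
    maha_score = (resid ** 2 / (lambdas[q:] + eps)).sum(axis=1)
    return re_score, maha_score

Either score can then be thresholded to flag atypical samples; for instance, re, maha = kpca_outlier_scores(X_train, X_test, q=5, gamma=0.5) followed by a percentile-based cutoff on the training scores.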

Notes

Acceptance rate 75% (468/620).

DOI: 10.1109/IJCNN.2011.6033548
