Kernel principal subspace Mahalanobis distances for outlier detection

Title: Kernel principal subspace Mahalanobis distances for outlier detection
Publication Type: Conference Paper
Year of Publication: 2011
Authors: Li C, Georgiopoulos M, Anagnostopoulos GC
Conference Name: Neural Networks (IJCNN), The 2011 International Joint Conference on
Date Published: July
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Conference Location: San Jose, California, USA
Keywords: Image reconstruction, Kernel, kernel principal component analysis, Mahalanobis distance, Manifolds, orthogonal subspace complement, outlier detection, principal component analysis, reconstruction error, Single photon emission computed tomography, Support vector machines, Training

Over the last few years, Kernel Principal Component Analysis (KPCA) has found several applications in outlier detection. A relatively recent method uses KPCA to compute the reconstruction error (RE) of previously unseen samples and, via thresholding, to identify atypical samples. In this paper, we propose an alternative method that performs the same task but instead considers Mahalanobis distances in the orthogonal complement of the subspace used to compute the reconstruction error. To illustrate its merits, we provide qualitative and quantitative results on both artificial and real datasets, and we show that it is competitive with, if not superior to, the original RE-based variant and the One-Class SVM approach on several outlier detection tasks.
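The contrast between the two scores can be sketched in code. The snippet below is a minimal NumPy illustration, not the paper's implementation: it fits KPCA with an RBF kernel, then for each test point computes (a) the feature-space reconstruction error with respect to the top-q principal subspace and (b) a Mahalanobis-style score that weights the projections onto the *discarded* components by their inverse eigenvalues, in the spirit of the orthogonal-complement idea. The kernel choice, eigenvalue scaling, and regularization floor are assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel matrix between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kpca(X, gamma=1.0, q=2):
    # Eigendecompose the doubly centered Gram matrix of the training set.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    lam, A = np.linalg.eigh(Kc)            # ascending eigenvalues
    lam, A = lam[::-1], A[:, ::-1]         # reorder to descending
    return dict(X=X, gamma=gamma, q=q, K=K, lam=lam, A=A, n=n)

def scores(model, Z):
    """Return (reconstruction error, complement Mahalanobis-style score)."""
    X, gamma, q = model["X"], model["gamma"], model["q"]
    K, lam, A, n = model["K"], model["lam"], model["A"], model["n"]
    Kz = rbf_kernel(Z, X, gamma)
    # Center test kernel rows consistently with the training centering.
    Kzc = Kz - Kz.mean(axis=1, keepdims=True) - K.mean(axis=0) + K.mean()
    # Projections onto all feature-space PCs: alpha_i = A[:, i] / sqrt(lam_i).
    pos = lam > 1e-12
    P = Kzc @ A[:, pos] / np.sqrt(lam[pos])
    # Centered self-similarity k_c(z, z); for an RBF kernel k(z, z) = 1.
    kzz = 1.0 - 2.0 * Kz.mean(axis=1) + K.mean()
    # (a) Reconstruction error w.r.t. the top-q principal subspace.
    re = kzz - (P[:, :q] ** 2).sum(axis=1)
    # (b) Mahalanobis-style score in the orthogonal complement: squared
    # projections weighted by inverse component variances (lam_i / n),
    # with a small floor as a regularization assumption.
    var = np.maximum(lam[pos][q:] / n, 1e-6)
    maha = (P[:, q:] ** 2 / var).sum(axis=1)
    return re, maha
```

For a tight training cluster, a far-away test point has near-zero kernel values against the training set, so both its reconstruction error and its complement score come out large relative to those of inlier test points; thresholding either score flags it as atypical.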


Acceptance rate: 75% (468/620).

