Kernel principal subspace Mahalanobis distances for outlier detection
|Title|Kernel principal subspace Mahalanobis distances for outlier detection|
|Publication Type|Conference Paper|
|Year of Publication|2011|
|Authors|Li C, Georgiopoulos M, Anagnostopoulos GC|
|Conference Name|The 2011 International Joint Conference on Neural Networks (IJCNN)|
|Publisher|Institute of Electrical and Electronics Engineers (IEEE)|
|Conference Location|San Jose, California, USA|
|Keywords|Image reconstruction, Kernel, kernel principal component analysis, Mahalanobis distance, Manifolds, orthogonal subspace complement, outlier detection, principal component analysis, reconstruction error, Single photon emission computed tomography, Support vector machines, Training|
Over the last few years, Kernel Principal Component Analysis (KPCA) has found several applications in outlier detection. A relatively recent method uses KPCA to compute the reconstruction error (RE) of previously unseen samples and, via thresholding, to identify atypical samples. In this paper we propose an alternative method that performs the same task but instead considers Mahalanobis distances in the orthogonal complement of the subspace used to compute the reconstruction error. To illustrate its merits, we provide qualitative and quantitative results on both artificial and real datasets, and we show that it is competitive with, if not superior to, the original RE-based variant and the One-Class SVM detection approach on several outlier detection tasks.
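For orientation, the RE-based baseline that the paper compares against can be sketched roughly as follows. This is an illustrative reconstruction using scikit-learn's `KernelPCA` with its approximate pre-image map (`fit_inverse_transform=True`), not the authors' implementation; the synthetic dataset, RBF kernel parameters, component count, and the 95th-percentile threshold are all arbitrary assumptions made for the example.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
# Synthetic data: a 2-D Gaussian inlier cloud for training, and a test set
# of 20 inliers followed by 5 distant outliers (illustrative only)
X_train = rng.normal(0.0, 1.0, size=(200, 2))
X_test = np.vstack([rng.normal(0.0, 1.0, size=(20, 2)),
                    rng.normal(8.0, 1.0, size=(5, 2))])

# KPCA with an approximate pre-image map, so the reconstruction error can be
# measured in input space (kernel and n_components chosen arbitrarily here)
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.5,
                 fit_inverse_transform=True)
kpca.fit(X_train)

def reconstruction_error(model, X):
    """Squared input-space distance between each sample and its KPCA pre-image."""
    X_hat = model.inverse_transform(model.transform(X))
    return np.sum((X - X_hat) ** 2, axis=1)

# Flag a sample as atypical when its RE exceeds a high quantile of the
# training REs (a simple way to set the threshold; the choice is ours)
threshold = np.quantile(reconstruction_error(kpca, X_train), 0.95)
is_outlier = reconstruction_error(kpca, X_test) > threshold
print(is_outlier)
```

The paper's proposed variant replaces the RE score with Mahalanobis distances computed in the orthogonal complement of the principal subspace, which amounts to weighting the projections onto the discarded components by their corresponding eigenvalues rather than summing squared residuals uniformly.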
Acceptance rate: 75% (468/620).