Typically, problems arise when performing pattern recognition tasks in high-dimensional spaces (the so-called "curse of dimensionality"). Significant improvements can be achieved by first mapping the data into a lower-dimensional sub-space. Applying dimensionality reduction to a vector <math>x=[x_{1},x_{2},...,x_{N}]^{T}</math> in an N-dimensional space yields another vector <math>y=[y_{1},y_{2},...,y_{K}]^{T}</math> in a K-dimensional space, where <math>K \ll N</math>. Dimensionality reduction techniques based on linear transformations have been very popular for determining the intrinsic dimensionality of the manifold as well as for extracting its principal directions (i.e., basis vectors).
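In the linear case, this mapping is typically expressed as <math>y=W^{T}x</math>, where <math>W</math> is an <math>N \times K</math> projection matrix whose columns are the basis vectors spanning the lower-dimensional sub-space.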
The most prominent method in this category is Principal Component Analysis (see the module on Face Recognition Using PCA). PCA determines the basis vectors by finding the directions of maximum variance in the data, and it is optimal in the sense that it minimizes the error between the original image and the one reconstructed from its low-dimensional representation. This is equivalent to retaining as much as possible of the variation present in the original data. Its success has triggered significant research in pattern recognition, and many powerful dimensionality reduction techniques (e.g., Probabilistic PCA, Linear Discriminant Analysis (LDA), Independent Component Analysis (ICA), Local Feature Analysis (LFA), Kernel PCA) have been proposed for finding appropriate low-dimensional pattern representations<ref name="FaceR">W. Zhao, R. Chellappa, P.J. Phillips, and A. Rosenfeld, "Face recognition: A literature survey," ACM Computing Surveys, vol. 35, no. 4, pp. 399–458, 2003.</ref>.
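As an illustration of these ideas (not part of the original module), the sketch below computes the principal directions by eigendecomposition of the data covariance matrix and projects the centered samples onto the top-<math>K</math> of them. The function name <code>pca_project</code> and the random input data are purely for demonstration.

<syntaxhighlight lang="python">
import numpy as np

def pca_project(X, K):
    """Project the rows of X (M x N) onto the top-K principal directions.

    Returns the per-feature mean, the N x K basis matrix W, and the
    M x K low-dimensional representations Y.
    """
    mean = X.mean(axis=0)                    # per-feature mean
    Xc = X - mean                            # center the data
    cov = np.cov(Xc, rowvar=False)           # N x N covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # symmetric eigendecomposition (ascending order)
    order = np.argsort(eigvals)[::-1][:K]    # indices of the K largest eigenvalues
    W = eigvecs[:, order]                    # N x K basis: directions of maximum variance
    Y = Xc @ W                               # K-dimensional representation of each sample
    return mean, W, Y

# Demonstration on synthetic data: 100 samples in a 50-dimensional space.
X = np.random.rand(100, 50)
mean, W, Y = pca_project(X, K=5)

# Reconstruction from the low-dimensional representation; PCA minimizes
# the mean squared error between X and X_hat for the chosen K.
X_hat = Y @ W.T + mean
</syntaxhighlight>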