Dimensionality Reduction Methods: A Comparative Analysis of PCA, PPCA, and KPCA
DOI: https://doi.org/10.15359/ru.30-1.7

Keywords: Dimensionality Reduction, Point Clouds, Preimage Problem

Abstract
Dimensionality reduction methods are algorithms that map a data set into a subspace of the original space with fewer dimensions, allowing the data to be described at a lower cost. Because of their importance, they are widely used in machine learning processes. This article presents a comparative analysis of the PCA, PPCA, and KPCA dimensionality reduction methods. A reconstruction experiment on worm-shaped data was performed using structures of landmarks located on the body contour, applying each method with different numbers of principal components. The results showed that all of the methods can be regarded as alternative processes. Nevertheless, thanks to its potential for analysis in the feature space and the method presented for computing its preimage, KPCA offers a better approach for recognition and pattern-extraction processes.
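To make the comparison concrete, the following is a minimal, hypothetical sketch (not the code used in the article) of how PCA and KPCA reductions and reconstructions could be compared in Python with scikit-learn. The landmark matrix is synthetic, the n_components and gamma values are arbitrary, and scikit-learn's fit_inverse_transform option approximates the KPCA preimage with a learned inverse map rather than the preimage method presented in the article.

import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)

# Synthetic stand-in for the landmark data: each row is one shape,
# flattened as (x1, y1, ..., xm, ym) for m contour landmarks.
n_shapes, n_landmarks = 100, 20
t = np.linspace(0, 2 * np.pi, n_landmarks)
base_contour = np.column_stack([np.cos(t), np.sin(t)]).ravel()
shapes = base_contour + 0.05 * rng.standard_normal((n_shapes, 2 * n_landmarks))

n_components = 5  # number of principal components retained (arbitrary here)

# Linear PCA: project onto the leading components and reconstruct directly.
pca = PCA(n_components=n_components)
pca_codes = pca.fit_transform(shapes)
pca_reconstruction = pca.inverse_transform(pca_codes)

# Kernel PCA: the projection lives in feature space, so reconstruction
# requires solving the preimage problem; here scikit-learn approximates
# it by learning an inverse mapping (fit_inverse_transform=True).
kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=1.0,
                 fit_inverse_transform=True)
kpca_codes = kpca.fit_transform(shapes)
kpca_reconstruction = kpca.inverse_transform(kpca_codes)

# Compare the reconstruction errors of the two methods.
err_pca = np.mean((shapes - pca_reconstruction) ** 2)
err_kpca = np.mean((shapes - kpca_reconstruction) ** 2)
print(f"PCA reconstruction MSE:  {err_pca:.5f}")
print(f"KPCA reconstruction MSE: {err_kpca:.5f}")

Varying n_components in such a sketch reproduces, in spirit, the kind of reconstruction-versus-components comparison the abstract describes, although the article's actual experiment and preimage computation may differ.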
License
Authors who publish with this journal agree to the following terms:
1. Authors grant the journal the right of first publication of the work, licensed under a Creative Commons Attribution License that allows others to share the work with acknowledgment of its authorship and initial publication in this journal.
2. Authors may enter into separate, additional agreements for the non-exclusive distribution of the version of the work published in the journal (e.g., depositing it in an institutional repository or publishing it in a book), with acknowledgment of its initial publication in this journal.
3. The authors declare that they hold all permissions to use the resources provided in the paper (images, tables, among others) and assume full responsibility for any damages to third parties.
4. The opinions expressed in the paper are the exclusive responsibility of the authors and do not necessarily represent the opinion of the editors or the Universidad Nacional.
Uniciencia Journal and all of its publications are licensed under a Creative Commons Atribución-NoComercial-SinDerivadas (Attribution-NonCommercial-NoDerivatives) 4.0 Unported license.
There is neither a fee for access nor an Article Processing Charge (APC).