© The Institution of Engineering and Technology
Supervised locally linear embedding (SLLE) has been proposed for classification tasks. SLLE makes full use of label information and selects neighbours only within the same class. However, SLLE uses the least squares (LS) method to solve a set of linear equations for the linear representation coefficients, which involves inverting a matrix. If that matrix is singular, the linear system has no solution. Additionally, if the neighbourhood size is not appropriate, neighbours lying farther along the manifold may be selected. To remedy these problems, this study presents SLLE based on orthogonal matching pursuit (SLLE-OMP), which introduces OMP into SLLE. In SLLE-OMP, LS is replaced by OMP, and OMP can reselect new neighbours from the old ones. Experimental results on several real-world datasets show that SLLE-OMP achieves better classification performance than SLLE.
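The core idea above — greedily reselecting neighbours while computing reconstruction weights — can be sketched as a plain orthogonal matching pursuit step. This is a minimal illustration, not the paper's exact algorithm: the function name, the fixed selection count, and the use of a pseudo-inverse (which remains defined even when the LS subproblem is ill-conditioned) are all assumptions for clarity.

```python
import numpy as np

def omp_weights(x, neighbours, n_select):
    """Hypothetical OMP sketch: reselect n_select neighbours of sample x
    from a candidate set and compute its reconstruction weights.
    `neighbours` is a (d, k) matrix whose columns are candidate
    same-class neighbours."""
    residual = x.copy()
    selected = []
    for _ in range(n_select):
        # Pick the candidate most correlated with the current residual.
        scores = np.abs(neighbours.T @ residual)
        scores[selected] = -np.inf  # never pick the same column twice
        selected.append(int(np.argmax(scores)))
        # Least-squares fit on the selected columns via the pseudo-inverse,
        # avoiding the explicit matrix inverse that fails when singular.
        sub = neighbours[:, selected]
        w = np.linalg.pinv(sub) @ x
        residual = x - sub @ w
    weights = np.zeros(neighbours.shape[1])
    weights[selected] = w
    return weights, selected
```

In this sketch, candidates that are far along the manifold correlate weakly with the residual and are simply never selected, which is the intuition behind letting OMP reselect neighbours rather than trusting the initial k-nearest-neighbour set.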