© The Institution of Engineering and Technology
The authors consider the general problem of graph clustering. Graph clustering operates on a graph-based data structure, and the entries of the solution vectors are only allowed to take non-negative discrete values. Finding the optimal solution is NP-hard, so relaxations are usually considered. Spectral clustering retains the orthogonality rigorously but ignores the non-negativity and discreteness of the solution. Symmetric non-negative matrix factorisation retains the non-negativity rigorously but struggles to reach orthogonality. In this study, they propose a novel method named congruent approximate graph clustering (CAC), which retains the non-negativity rigorously and reaches the orthogonality properly by congruency approximation. Furthermore, the solution obtained by CAC is sparse, approximating the ideal discrete solution. Experimental results on several real image benchmark datasets indicate that CAC achieves encouraging results compared with state-of-the-art methods.
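To illustrate the symmetric non-negative matrix factorisation (SymNMF) relaxation that the abstract contrasts with spectral clustering, here is a minimal sketch, not the paper's CAC algorithm: it factorises a similarity matrix A as A ≈ HHᵀ with H ≥ 0 via multiplicative updates, so non-negativity is kept rigorously while orthogonality of H is only encouraged, not enforced. The point coordinates, kernel width, damping factor, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated groups of 2-D points (made-up demo data).
X = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.5],
              [10.0, 10.0], [10.5, 10.0], [10.0, 10.5]])

# Gaussian similarity graph: A is symmetric and non-negative.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
A = np.exp(-d2 / 2.0)

k = 2  # number of clusters
H = rng.uniform(0.1, 1.0, size=(A.shape[0], k))  # non-negative init

def reconstruction_error(A, H):
    return np.linalg.norm(A - H @ H.T)

err_before = reconstruction_error(A, H)
for _ in range(300):
    # Damped multiplicative update for min ||A - H H^T||_F^2, H >= 0.
    # Entries of H stay non-negative by construction; no step can
    # make them negative, unlike the eigenvectors used in spectral
    # clustering, which are orthogonal but may take negative values.
    H *= 0.5 + 0.5 * (A @ H) / np.maximum(H @ (H.T @ H), 1e-12)
err_after = reconstruction_error(A, H)

# Cluster assignment: each point goes to its dominant factor.
labels = H.argmax(axis=1)
```

By contrast, spectral clustering would take the top-k eigenvectors of a graph Laplacian built from A, which are exactly orthogonal but generally contain negative entries, so a rounding step (e.g. k-means) is still needed to recover discrete labels.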
http://iet.metastore.ingenta.com/content/journals/10.1049/iet-cvi.2014.0131