Two-phase clustering algorithm with density exploring distance measure

Here, the authors propose a novel two-phase clustering algorithm with a density exploring distance (DED) measure. In the first phase, the fast global K-means clustering algorithm is used to obtain the cluster number and the prototypes. The prototypes of all these clusters, together with representatives of the points belonging to them, are then taken as the input data set of the second phase. In the second phase, all the prototypes are clustered according to a DED measure, which makes data points located in the same structure highly similar to one another. In experimental studies, the authors test the proposed algorithm on seven artificial and seven UCI data sets. The results demonstrate that the proposed algorithm is flexible with respect to different data distributions and, compared with the competing algorithms, is better at clustering data sets with complex non-convex distributions.
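The sketch below illustrates one plausible reading of this two-phase pipeline in Python. It is a minimal sketch, not the authors' implementation: plain K-means stands in for fast global K-means, the cluster representatives are omitted, the DED is assumed to be a shortest-path distance over edges whose length grows exponentially with Euclidean distance (the density-sensitive form used in related work on density-sensitive metrics), and agglomerative clustering on the precomputed DED matrix stands in for the paper's second-phase grouping. The base `rho`, the prototype count, and the function names are illustrative choices.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.cluster import KMeans, AgglomerativeClustering  # sklearn >= 1.2


def density_exploring_distance(X, rho=2.0):
    """Pairwise DED (assumed form): shortest-path length over a complete graph
    whose edge lengths grow exponentially with Euclidean distance, so paths
    running through dense regions are cheaper than long direct jumps."""
    diff = X[:, None, :] - X[None, :, :]
    euclid = np.sqrt((diff ** 2).sum(axis=-1))
    edge_len = rho ** euclid - 1.0            # density-sensitive segment length
    return shortest_path(edge_len, method="FW", directed=False)


def two_phase_clustering(X, n_prototypes=30, n_clusters=3, rho=2.0):
    # Phase 1: over-partition the data into many small prototypes
    # (plain K-means used here in place of fast global K-means).
    km = KMeans(n_clusters=n_prototypes, n_init=10, random_state=0).fit(X)
    prototypes, point_to_proto = km.cluster_centers_, km.labels_

    # Phase 2: group the prototypes with the DED measure so that prototypes
    # lying on the same structure end up in the same final cluster.
    ded = density_exploring_distance(prototypes, rho=rho)
    merger = AgglomerativeClustering(
        n_clusters=n_clusters, metric="precomputed", linkage="average"
    )
    proto_labels = merger.fit_predict(ded)

    # Every original point inherits the final label of its prototype.
    return proto_labels[point_to_proto]


if __name__ == "__main__":
    # Two concentric rings: a non-convex structure that plain K-means
    # cannot separate but a density-based path distance can.
    rng = np.random.default_rng(0)
    angles = rng.uniform(0, 2 * np.pi, 400)
    radii = np.where(rng.random(400) < 0.5, 1.0, 4.0)
    X = np.c_[radii * np.cos(angles), radii * np.sin(angles)]
    X += 0.1 * rng.standard_normal(X.shape)
    labels = two_phase_clustering(X, n_prototypes=40, n_clusters=2, rho=8.0)
    print(np.bincount(labels))
```

With this construction, two prototypes connected by a chain of nearby prototypes receive a small DED even when their straight-line distance is large, which is what allows the second phase to recover elongated or non-convex clusters from the over-partitioned first-phase result.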
