Nearest-neighbour ensembles in lasso feature subspaces


IET Computer Vision

The least absolute shrinkage and selection operator (lasso) is a promising feature selection technique; however, it has traditionally received little attention in research on ensemble classification methods. In this study, the authors propose a robust classification algorithm based on an ensemble of classifiers built in lasso feature subspaces. The algorithm consists of two stages: the first is a lasso-based multiple feature subset selection cycle, which seeks a number of relevant and diverse feature subspaces; the second is an ensemble-based decision system that aims to preserve classification performance under abrupt changes in the representation space. Experimental results on the two-class textured image segmentation problem demonstrate the effectiveness of the proposed classification method.
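The two-stage scheme described above can be illustrated in code. The following is a minimal sketch, not the authors' exact algorithm: it assumes scikit-learn, generates synthetic data in place of texture features, derives diverse feature subspaces from the non-zero coefficients of lasso fits on bootstrap resamples with varying penalties, and combines one nearest-neighbour classifier per subspace by majority vote.

```python
# Hedged sketch of a lasso-subspace nearest-neighbour ensemble.
# Assumptions: scikit-learn available; synthetic data stands in for
# texture features; penalty values and k are illustrative choices.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import Lasso
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=300, n_features=40,
                           n_informative=10, random_state=0)

def lasso_subspace(X, y, alpha):
    """Indices of features kept by the lasso (non-zero coefficients)."""
    coef = Lasso(alpha=alpha, max_iter=10000).fit(X, y).coef_
    idx = np.flatnonzero(coef)
    return idx if idx.size else np.arange(X.shape[1])  # fallback: all features

# Stage 1: relevant yet diverse subspaces via bootstrap resampling
# combined with different regularisation strengths.
subspaces = []
for alpha in (0.01, 0.02, 0.05):
    boot = rng.choice(len(X), size=len(X), replace=True)
    subspaces.append(lasso_subspace(X[boot], y[boot], alpha))

# Stage 2: one nearest-neighbour classifier per subspace, majority vote.
members = [KNeighborsClassifier(n_neighbors=5).fit(X[:, s], y)
           for s in subspaces]
votes = np.stack([m.predict(X[:, s]) for m, s in zip(members, subspaces)])
pred = (votes.mean(axis=0) > 0.5).astype(int)
print("training accuracy:", (pred == y).mean())
```

Because each member sees a different lasso-selected subspace, a corrupted or drifting feature degrades only some members, and the vote tends to preserve the decision, which is the robustness property the second stage targets.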

Inspec keywords: image texture; image segmentation

Other keywords: lasso feature subspaces; nearest-neighbour ensembles; textured image segmentation problem; representation space; least absolute shrinkage and selection operator

Subjects: Computer vision and image processing techniques; Optical, image and video signal processing

http://iet.metastore.ingenta.com/content/journals/10.1049/iet-cvi.2009.0056