Software online bug detection: applying a new kernel method

This study presents a new online bug detection approach for safety-critical software systems. The novelty of the proposed approach is the use of a support vector machine (SVM) with a customised kernel function to detect bugs early, before they can cause the program to fail. The new kernel function is built on a novel sequence-matching technique that measures the similarity between passing and failing executions, each represented as a sequence of program predicates. The SVM constructs a hyperplane that optimally divides the program execution space into two regions, one of failing and one of passing executions; this hyperplane can then be applied to detect symptoms of failure during program execution. Experiments with the Rhythmbox and SPEC2000 test programs demonstrate the ability of the proposed method to detect bugs early with small overhead on program execution time. Moreover, the proposed approach revealed 83 out of 132 bugs (i.e. 63%) in the Siemens test suite while requiring only 10% of the code to be examined manually to locate the origins of failure, the most promising result compared with the latest approaches to early bug detection.
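
To illustrate the general idea, the sketch below trains an SVM on a precomputed sequence kernel over predicate sequences and then classifies a new (possibly partial) execution. This is only a minimal illustration, not the authors' implementation: the longest-common-subsequence similarity, the toy predicate sequences and labels, and the use of scikit-learn's SVC (a LIBSVM-based wrapper) are assumptions standing in for the customised sequence-matching kernel described above.

    # Minimal sketch of an SVM with a precomputed sequence kernel.
    # NOTE: the LCS-based similarity and the toy data below are illustrative
    # assumptions; the paper's exact sequence-matching kernel is not given here.
    import numpy as np
    from sklearn.svm import SVC

    def lcs_length(a, b):
        # Longest common subsequence length of two predicate sequences.
        m, n = len(a), len(b)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if a[i - 1] == b[j - 1]:
                    dp[i][j] = dp[i - 1][j - 1] + 1
                else:
                    dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
        return dp[m][n]

    def seq_kernel(X, Y):
        # Gram matrix of normalised LCS similarities between two sets of sequences.
        K = np.zeros((len(X), len(Y)))
        for i, x in enumerate(X):
            for j, y in enumerate(Y):
                K[i, j] = lcs_length(x, y) / max(len(x), len(y), 1)
        return K

    # Toy training executions: each is a sequence of predicate identifiers,
    # labelled +1 (passing) or -1 (failing). Values are made up.
    train_seqs = [
        ["p1", "p2", "p3", "p4"],   # passing
        ["p1", "p2", "p4"],         # passing
        ["p1", "p5", "p6", "p6"],   # failing
        ["p5", "p6", "p6", "p7"],   # failing
    ]
    train_labels = [1, 1, -1, -1]

    clf = SVC(kernel="precomputed")
    clf.fit(seq_kernel(train_seqs, train_seqs), train_labels)

    # Online use: classify the predicate sequence observed so far; a -1
    # prediction flags a failure symptom before the run completes.
    partial_run = [["p1", "p5", "p6"]]
    print(clf.predict(seq_kernel(partial_run, train_seqs)))

In an online setting the trained classifier would be queried periodically with the predicate sequence observed so far, so that a prediction falling on the failing side of the hyperplane can raise an alarm before the program actually fails.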
