Value of ranked voting methods for estimation by analogy

One long-standing issue in estimation by analogy (EBA) is finding the closest analogies. Prior studies revealed that existing similarity measures are easily influenced by extreme values and irrelevant features. Instead of identifying the closest projects based on aggregated similarity degrees, the authors propose to use ranked voting methods that rank projects per feature and then aggregate those ranks over all features using voting count rules. The project(s) with the highest score are the winners and form the new estimate for the target project. This also allows the preferred number of analogies to be determined automatically for each target project, since the winner set may contain more than a single winner. An empirical evaluation with a jack-knifing procedure has been carried out, in which nine datasets from two repositories (PROMISE and ISBSG) were used for benchmarking. The proposed models are compared with well-known estimation methods: regular K-based EBA, stepwise regression, ordinary least-squares regression and categorical regression trees. The performance figures of the proposed models were promising. The use of voting methods presents several useful advantages: (i) saving time in finding an appropriate K number of analogies for each individual project, (ii) no need for project pruning and (iii) no data standardisation is required.
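
The abstract describes the selection step only at a high level. The sketch below illustrates one way a Borda-style count, a common ranked voting rule of the kind the paper draws on, could rank historical projects per feature and aggregate the ranks to pick analogies. It is a minimal illustration under stated assumptions, not the authors' implementation: the function name borda_vote_analogies, the toy feature values and efforts, and the use of the mean effort of the winner set as the estimate are all assumptions made for the example.

```python
import numpy as np

def borda_vote_analogies(target, candidates, efforts):
    """Illustrative Borda-style analogy selection (assumed details, not the paper's exact method).

    target:     1-D array of feature values for the project to estimate
    candidates: 2-D array, one row per historical project
    efforts:    1-D array of known efforts for the historical projects
    """
    n_projects, n_features = candidates.shape
    scores = np.zeros(n_projects)

    for j in range(n_features):
        # Rank candidates by closeness to the target on this feature alone
        distances = np.abs(candidates[:, j] - target[j])
        order = np.argsort(distances)  # closest first
        # Borda count: the closest project receives n_projects - 1 points,
        # the next n_projects - 2, and so on
        for rank, idx in enumerate(order):
            scores[idx] += n_projects - 1 - rank

    # The winner set may contain several projects tied on the top score
    winners = np.flatnonzero(scores == scores.max())
    # One simple way to form the estimate: mean effort of the winning analogies (assumption)
    return efforts[winners].mean(), winners

# Hypothetical example: three features, five historical projects
target = np.array([120.0, 4.0, 2.0])
candidates = np.array([
    [100.0, 5.0, 1.0],
    [130.0, 4.0, 2.0],
    [300.0, 9.0, 6.0],
    [115.0, 3.0, 2.0],
    [500.0, 2.0, 8.0],
])
efforts = np.array([1200.0, 1500.0, 4100.0, 1350.0, 6000.0])

estimate, winners = borda_vote_analogies(target, candidates, efforts)
print(f"Winning analogies: {winners}, estimated effort: {estimate:.0f}")
```

Because the ranking is done independently per feature, no data standardisation is needed, and ties on the top score naturally yield a winner set with more than one analogy, which is the mechanism the abstract refers to for choosing the number of analogies automatically.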
