     Research Journal of Applied Sciences, Engineering and Technology


An Automated Assessment System for Evaluation of Students' Answers Using Novel Similarity Measures

Madhumitha Ramamurthy and Ilango Krishnamurthi
Department of CSE, Sri Krishna College of Engineering and Technology, Coimbatore-641008, Tamil Nadu, India
Research Journal of Applied Sciences, Engineering and Technology  2016  12(3): 258-263
http://dx.doi.org/10.19026/rjaset.12.2332  |  © The Author(s) 2016
Received: July 2, 2015  |  Accepted: August 2, 2015  |  Published: February 5, 2016

Abstract

Automating human behaviour with machines is one of the important research activities currently in progress in Artificial Intelligence. This paper proposes an automated assessment system that uses two novel similarity measures to evaluate students' short and long answers, and compares them with the cosine similarity measure and the n-gram similarity measure. The proposed system evaluates answers to information-recall and comprehension questions in Bloom's taxonomy. The comparison shows that the proposed system, using the two novel similarity measures, outperforms the n-gram and cosine similarity measures for both information-recall and comprehension questions. The system-generated scores are also compared with human scores, and the system scores correlate with the human scores under Pearson and Spearman's correlation.
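
The abstract refers to two standard baselines, cosine similarity and n-gram similarity, and to Pearson and Spearman correlation between system and human scores. The Python sketch below is not the authors' implementation; the tokenisation, function names and toy scores are illustrative assumptions. It only shows how a bag-of-words cosine baseline and the two correlation coefficients could be computed.

    # Minimal sketch, assuming a simple whitespace tokenisation:
    # a bag-of-words cosine similarity baseline between a student answer
    # and a reference answer, plus Pearson/Spearman correlation of
    # system scores against human scores. All data here is hypothetical.
    import math
    from collections import Counter
    from scipy.stats import pearsonr, spearmanr

    def cosine_similarity(student_answer: str, reference_answer: str) -> float:
        """Cosine similarity of term-frequency (bag-of-words) vectors."""
        a = Counter(student_answer.lower().split())
        b = Counter(reference_answer.lower().split())
        dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
        norm_a = math.sqrt(sum(v * v for v in a.values()))
        norm_b = math.sqrt(sum(v * v for v in b.values()))
        return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

    print(cosine_similarity(
        "photosynthesis converts light energy into chemical energy",
        "plants convert light energy into chemical energy by photosynthesis"))

    # Agreement between system-generated and human-assigned marks
    # (hypothetical values, mirroring the evaluation described above).
    system_scores = [4.5, 3.0, 2.5, 5.0, 1.0]
    human_scores  = [5.0, 3.5, 2.0, 4.5, 1.5]
    print("Pearson r:", pearsonr(system_scores, human_scores)[0])
    print("Spearman rho:", spearmanr(system_scores, human_scores)[0])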

Keywords:

Artificial intelligence, assessment, education, sentence similarity, similarity, WordNet


References

  1. Alfonseca, E., R.M. Carro, M. Freire, A. Ortigosa, D. Pérez and P. Rodríguez, 2005. Authoring of adaptive computer assisted assessment of free-text answers. Educ. Technol. Soc., 8(3): 53-65.
  2. Bachman, L.F., N. Carr, G. Kamei, M. Kim, M.J. Pan and C. Salvador, 2002. A reliable approach to automatic assessment of short answer free responses. Proceeding of the 19th International Conference on Computational Linguistics. Taipei, Taiwan, 2: 1-4.
  3. Burstein, J., C. Leacock and R. Swartz, 2001. Automatic evaluation of essays and short answers. In: Danson, M. (Ed.), Proceeding of the 6th International Computer Assisted Conference. Loughborough, UK.
  4. Hu, X. and H. Xia, 2010. Automated assessment system for subjective questions based on LSI. Proceeding of the 3rd International Symposium Intelligent Information Technology and Security Informatics (IITSI, 2010), pp: 250-254.
  5. Kanejiya, D., A. Kumar and S. Prasad, 2003. Automatic evaluation of students’ answers using syntactically enhanced LSA. Proceeding of the Workshop on Building Educational Applications using Natural Language Processing (HLT-NAACL-EDUC’ 03), 2: 53-60.
  6. Kerr, D., H. Mousavi and M. Iseli, 2013. Automatic short essay scoring using natural language processing to extract semantic information in the form of propositions. CRESST Report 831.
  7. Kumaran, V.S. and A. Sankar, 2013. An automated assessment of students’ learning in e-learning using concept map and ontology mapping. In: Wang, J.F. and R. Lau (Eds.), ICWL, 2013. LNCS 8167, Springer-Verlag, Berlin, Heidelberg, pp: 274-283.
  8. Madhumitha, R. and K. Ilango, 2015. Parts of speech based sentence similarity computation measures. Int. J. Appl. Eng. Res., 10(21): 20176-20184.
  9. Papineni, K., S. Roukos, T. Ward and W.J. Zhu, 2001. BLEU: A method for automatic evaluation of machine translation. IBM Research Report RC22176 (W0109-022).
  10. Pérez, D., O. Postolache, E. Alfonseca, D. Cristea and P. Rodríguez, 2005. About the effects of using Anaphora Resolution in assessing free text student answers. Proceeding of the International Conference Recent Advances in Natural Language Processing (RANLP, 2005), pp: 380-386.
  11. Questions Skills, n.d. 6 Categories of Questions. Karen Teacher Working Group. Retrieved from: http://ktwg.org/ktwg_texts.html.
  12. Saxena, S. and P.R. Gupta, 2009. Automatic assessment of short text answers from computer science domain through pattern based information extraction. Proceeding of the ASCNT, 2009. CDAC, Noida, India, pp: 109-118.
  13. Siddiqi, R. and C.J. Harrison, 2008. On the Automated Assessment of Short Free-text Responses. Retrieved from: http://www.iaea.info/documents/paper_2b711df83.pdf.
  14. Whittington, D. and H. Hunt, 1999. Approaches to the computerized assessment of free text responses. Proceeding of the 3rd CAA Conference. Loughborough University, Loughborough.
  15. Yigal, A., P. Don, F. Marshall, H. Marissa and O. Susan, 2008. Automated scoring of short-answer open-ended GRE subject test items. GRE Board Research Report No. 04-02, ETS RR-08-20, ETS, Princeton, NJ.

Competing interests

The authors have no competing interests.

Open Access Policy

This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Copyright

© The Author(s) 2016.

ISSN (Online):  2040-7467
ISSN (Print):   2040-7459