
Developing a Framework for Online Practice Examination and Automated Score Generation


Authors

S. M. Saniul Islam Sani, Rezaul Karim, and Mohammad Shamsul Arefin
Department of Computer Science & Engineering, Chittagong University of Engineering & Technology, Chittagong-4349, Bangladesh

Abstract

Examination is the process by which the ability and quality of examinees can be measured, and it is necessary for ensuring that quality. An online examination system allows participants to appear for an examination regardless of their location by connecting to the examination site over the Internet from a desktop computer, laptop, or smartphone. Automated score generation is the process by which answer scripts are evaluated automatically to produce scores. Although many online examination systems exist, their main drawback is that they cannot compute scores accurately, especially for text-based answers. Moreover, most of them are monolingual, so examinees can take an examination in only one particular language. Considering these facts, in this paper we present a framework that can administer Multiple Choice Question (MCQ) and written examinations in two different languages, English and Bangla. We develop a database in which the questions and reference answers are stored. Questions from the database are displayed on a web page, with answer options for the MCQ questions and text boxes for the written questions. To generate scores for the written questions, we perform several types of analysis on the submitted answers; for the MCQ questions, we simply compare the user's answers with the answers stored in the database. We conducted several experiments to check the accuracy of the scores generated by our system and found that it produces 100% accurate scores for MCQ questions and more than 90% accurate scores for text-based questions.
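The two scoring paths described above can be illustrated with a short sketch. The Python snippet below is not the authors' implementation: the exact-match check mirrors the stated comparison between the stored answer and the user's answer for MCQ questions, while the word-overlap heuristic is only an assumed placeholder for the unspecified "several types of analysis" applied to written answers; all function and parameter names are hypothetical.

    import re

    def score_mcq(stored_answer, user_answer):
        # MCQ scoring as described in the abstract: a direct comparison
        # between the database answer and the user's submitted answer.
        return 1 if user_answer.strip().lower() == stored_answer.strip().lower() else 0

    def score_written(reference_answer, user_answer, max_marks=10):
        # Assumed placeholder for the written-answer analysis: award marks
        # in proportion to how many reference-answer words the user covered.
        # \w+ matches Unicode word characters, so Bangla text tokenizes too.
        tokenize = lambda text: set(re.findall(r"\w+", text.lower(), re.UNICODE))
        reference_words = tokenize(reference_answer)
        if not reference_words:
            return 0.0
        overlap = len(reference_words & tokenize(user_answer))
        return max_marks * overlap / len(reference_words)

    print(score_mcq("b", "B"))  # 1 (correct option)
    print(score_written("examination measures the ability of examinees",
                        "examination measures ability"))  # 5.0 of 10 marks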

Keywords

Multiple Choice Questions, Automated Scoring, Answer Analysis, Experimental Analysis.




