This repository has been archived by the owner on May 6, 2024. It is now read-only.


AIMER

Artificial Intelligence Mark Evaluator & Recognizer


LEARN.md

The ./digital_portfolio folder contains all the required technical and general explanations; it also includes a short video demo of AIMER in action.

You can also view just the technical write-ups online here.


Background

AIMER is an OCR (Optical Character Recognition) answer-sheet grader. It was made and submitted primarily for the "Performance Task" of the College Board's official exam for its Advanced Placement Computer Science Principles ("AP CSP") course. It won a local high school programming competition that it was nominated for, and it was also evaluated by the College Board in July 2022, contributing toward a full 5/5 examination score; it has been published under the GPL-3.0 license in this repository since then.


A score report card of the AP Computer Science Principles' exam displaying a 5/5 score. A high school programming competition winning certificate.
Performance Tasks (e.g., AIMER) contribute 30% of the total score (up to 1.5 points out of 5), while the remaining 70% is based on the answers to 70 multiple-choice questions---all of which were, ironically, graded by a program similar to AIMER.



The name stands for "Artificial Intelligence Mark Evaluator & Recognizer," and although the "Artificial Intelligence" part is a bit of an exaggeration, it nonetheless employs computer vision through OpenCV to achieve its goal.


Running

To get it running, you first need python3 installed. Then, run config.bat to install the two dependencies, OpenCV and NumPy.
Lastly, simply connect a USB camera to your device and double-click the main.py file.
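Since config.bat is a Windows batch file, the equivalent manual setup on macOS/Linux would look something like the following (assuming the standard PyPI package names for the two dependencies):

```shell
# Install the two dependencies manually, then launch the grader.
pip install opencv-python numpy
python3 main.py
```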

Notice that the best results are achieved in an evenly lit environment---not overly bright and not too dark---on a background that is a different color from the answer paper and has nothing other than said paper on it.

Also, you may specify the number of questions and choices inside grader.py by modifying the NUM_QUESTIONS and NUM_CHOICES global constants (in the demo below there were 4 choices and 5 questions).
You can also change the ANSWER_KEYS global list to your liking; in our case the correct (key) answers were:

  1. 1 (A)
  2. 3 (C)
  3. 4 (D)
  4. 2 (B)
  5. 1 (A)

And the student's answers were:

  1. 1 (A)
  2. 3 (C)
  3. 2 (B)
  4. 2 (B)
  5. 4 (D)

So, the resulting score was 60%, as the student got 3 out of 5 questions correct.
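The scoring step above boils down to comparing the two lists element-wise. A sketch of how the constants mentioned earlier might drive it (the `score` helper is hypothetical; only the constant names and the demo's values come from this document):

```python
# Global constants as described above, filled with the demo's values.
NUM_QUESTIONS = 5
NUM_CHOICES = 4
ANSWER_KEYS = [1, 3, 4, 2, 1]  # correct choices: A, C, D, B, A

def score(student_answers, keys=ANSWER_KEYS):
    """Return the percentage of questions answered correctly."""
    correct = sum(1 for s, k in zip(student_answers, keys) if s == k)
    return 100 * correct / len(keys)

# The demo's student answers: A, C, B, B, D -> 3 of 5 correct.
print(score([1, 3, 2, 2, 4]))  # -> 60.0
```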


Demo

demo.mp4

A sample paper with 5 questions and 4 choices. Correct answers were 1, 3, 4, 2, and 1.