Capabilities for Multiple Choice Question Distractor Generation and Elementary Mathematical Problem Solving by Recurrent Neural Networks
Open Access
Author:
Bakes, Riley
Area of Honors:
Engineering Science
Degree:
Bachelor of Science
Document Type:
Thesis
Thesis Supervisors:
C. Lee Giles, Thesis Supervisor
Lucas Jay Passmore, Thesis Honors Advisor
Judith Todd Copley, Faculty Reader
Keywords:
recurrent neural networks; natural language processing; mathematical language processing; education assistive tools; encoder decoder attentional GRU; encoder decoder LSTM; random forest; machine learning
Abstract:
In this thesis, working within significant computational resource limitations, we investigate the ability of encoder-decoder recurrent neural networks to solve a diverse array of elementary math problems, motivated by the potential of machine learning as a cost-effective educational assistive tool. We quantitatively measure the performance of recurrent models on a given question type using a test set of unseen problems, scored with both a binary and a partial-credit system. We also investigate how performance scales with increased training data and with attention mechanisms, and discuss which kinds of math the models performed best on. Based on our findings, we propose the use of encoder-decoder recurrent neural networks for generating mathematical multiple choice question distractors, and we introduce a computationally inexpensive decoding scheme, called character offsetting, that qualitatively shows promise for doing so on several question types. Character offsetting freezes the decoder's hidden state and the top-k probabilities of its initial output given an encoder's input, then performs k basic greedy decodings, each seeded with one of the frozen top-k tokens as the start of the decoded sequence.
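The following is a minimal sketch of the character offsetting procedure as described above. It assumes a hypothetical decoder_step(hidden, token) callable that returns a next-token probability vector and an updated hidden state; that function name, its signature, and the token conventions are illustrative assumptions, not details taken from the thesis.

import numpy as np

def character_offset_decode(decoder_step, encoder_hidden, start_token,
                            k=3, max_len=50, eos_token=0):
    # Run the decoder once on the encoder's final hidden state.
    probs, hidden = decoder_step(encoder_hidden, start_token)

    # Freeze the hidden state and the top-k initial output probabilities.
    top_k = np.argsort(probs)[-k:][::-1]

    candidates = []
    for first_tok in top_k:
        # Seed each decoding with one of the frozen top-k first tokens,
        # reusing the same frozen hidden state for all k branches.
        seq, h, tok = [int(first_tok)], hidden, int(first_tok)
        for _ in range(max_len - 1):
            probs, h = decoder_step(h, tok)
            tok = int(np.argmax(probs))  # basic greedy step
            if tok == eos_token:
                break
            seq.append(tok)
        candidates.append(seq)
    return candidates

Because the k decodings share the encoder context and diverge only at the first output token, the procedure is roughly k times the cost of a single greedy decode, which is what makes it attractive for producing a small set of plausible distractor answers.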