Library for processing MOOC data dumps. Currently limited to Coursera data.
## Published Findings

Papers published using this code on our MOOC corpus are available for download in this repository: https://github.com/WING-NUS/lib4moocdata/tree/master/coursera/docs
If you use this code for your own research, please let us know by email or via GitHub issues, and cite us:
- Chandrasekaran, M. K., Kan, M.-Y., Ragupathi, K., Tan, B. C. Y. 2015. “Learning instructor intervention from MOOC forums”. In Proceedings of the 8th International Conference on Educational Data Mining, Madrid, Spain. pp. 218-225. International Education Data Mining Society.
- Chandrasekaran, M. K., Epp, C. D., Kan, M.-Y., Litman, D. 2017. “Using Discourse Signals for Robust Instructor Intervention Prediction”. In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI-17), San Francisco, USA. pp. 3415-3421. AAAI.
## Coursera data export

To use this library you need to procure the data dumps of the MOOCs you own from Coursera. Coursera exports data from its MOOCs after completion for use by the university hosting them on its platform. These data dumps are .sql exports from MySQL databases. A typical data export consists of the following .sql files:
A .txt file with clickstream data is also provided. We do not yet process it in this library.
To replicate the results published in our papers, it is sufficient to import files (1), (2) and (3).
## How to run this code?

Step-by-step instructions on running experiments to replicate our EDM 2015 and AAAI 2017 papers are accessible here.
## Prerequisites

To use the library to process and analyse your data, you will first need to install the MySQL database and ingest the .sql files into it.
Command to ingest .sql files using the MySQL command line interface (CLI):

```sql
mysql> source <path to .sql file>/<name of the .sql file>
```
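For example, a typical ingest session might look like the following (the database name `ml001` and file path are illustrative placeholders, not real names from a Coursera export):

```sql
-- Create and select a database for one course iteration, then ingest its dump.
mysql> CREATE DATABASE ml001;
mysql> USE ml001;
mysql> SOURCE /path/to/export/ml-001.sql;
```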
Note that Coursera supplies an SQL export for every course. This means the DDL statements across the files from different courses will be redundant. More importantly, there is no field for course code in any of the tables. So, you have to either:

i) create a separate MySQL database for each course dump (one per course iteration), or
ii) add a `coursecode` field to every table and issue UPDATE statements to populate the `coursecode` field after running the *.sql import.
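A minimal sketch of option (ii), assuming a hypothetical table name `forum_posts` and course code `ml-001` (substitute the actual table names and course identifiers from your export):

```sql
-- Add a coursecode column and tag all rows imported from this dump.
-- Repeat both statements for every table in the course export.
ALTER TABLE forum_posts ADD COLUMN coursecode VARCHAR(64);
UPDATE forum_posts SET coursecode = 'ml-001';
```

Option (i) avoids this extra work but requires switching databases when querying across courses; option (ii) lets you keep all course iterations in one database.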
## Installation

The scripts require you to have installed Perl 5 and some dependent Perl packages.
### For Linux and Mac users
Linux and Mac users should already have Perl installed as part of the OS. You can check this with the command `perl -v` in your terminal.
CPAN has tools to easily install Perl modules; please see this step-by-step tutorial: http://www.cpan.org/modules/INSTALL.html

### Dependent Perl Modules (Packages) to install
The packages to install are:
- Lingua::EN::Bigram (note: fails to install on Linux CentOS 6)
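A minimal sketch of installing the listed module from the command line using the `cpan` client bundled with Perl (module availability on your platform may vary, as noted above):

```shell
# Install the module from CPAN.
cpan Lingua::EN::Bigram

# Verify the module loads and print its version.
perl -MLingua::EN::Bigram -e 'print $Lingua::EN::Bigram::VERSION, "\n"'
```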