pyKT is a Python library built upon PyTorch to train deep learning based knowledge tracing (DLKT) models. The library provides a standardized set of integrated data preprocessing procedures for more than 7 popular datasets across different domains, 5 detailed prediction scenarios, and more than 10 frequently compared DLKT approaches for transparent and extensive experiments. More details about pyKT can be found on our website and docs.
Use the following commands to install pyKT:

Create a conda environment.

```shell
conda create --name=pykt python=3.7.5
source activate pykt
pip install -U pykt-toolkit -i https://pypi.python.org/simple
```
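After installation, you can quickly confirm that the package is importable; note that `pykt-toolkit` is imported under the name `pykt`. The check below is a minimal sketch and does not exercise any training functionality:

```python
# Minimal installation check: the pykt-toolkit package is imported as `pykt`.
import pykt

# Print where the package was installed, as a simple sanity check.
print(pykt.__file__)
```

Data preprocessing and model training are driven by the scripts in the `examples` directory of the pyKT repository (e.g., `data_preprocess.py`); see our docs for the full workflow.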
The hyperparameter tuning results of our experiments for all the DLKT models on various datasets can be found at https://drive.google.com/drive/folders/1MWYXj73Ke3zC6bm3enu1gxQQKAHb37hz?usp=drive_link.

The implementations of the DLKT models in pyKT refer to the following repositories:
- https://github.com/hcnoh/knowledge-tracing-collection-pytorch
- https://github.com/arshadshk/SAKT-pytorch
- https://github.com/shalini1194/SAKT
- https://github.com/arshadshk/SAINT-pytorch
- https://github.com/Shivanandmn/SAINT_plus-Knowledge-Tracing-
- https://github.com/arghosh/AKT
- https://github.com/JSLBen/Knowledge-Query-Network-for-Knowledge-Tracing
- https://github.com/xiaopengguo/ATKT
- https://github.com/jhljx/GKT
- https://github.com/THUwangcy/HawkesKT
- https://github.com/ApexEDM/iekt
- https://github.com/Badstu/CAKT_othermodels/blob/0c28d870c0d5cf52cc2da79225e372be47b5ea83/SKVMN/model.py
- https://github.com/bigdata-ustc/EduKTM
- https://github.com/shalini1194/RKT
- https://github.com/shshen-closer/DIMKT
- https://github.com/skewondr/FoLiBi
- https://github.com/yxonic/DTransformer
- https://github.com/lilstrawberry/ReKT
pyKT provides the following DLKT models:

- DKT: Deep knowledge tracing (see the minimal architecture sketch after this list)
- DKT+: Addressing two problems in deep knowledge tracing via prediction-consistent regularization
- DKT-Forget: Augmenting knowledge tracing by considering forgetting behavior
- KQN: Knowledge query network for knowledge tracing: How knowledge interacts with skills
- DKVMN: Dynamic key-value memory networks for knowledge tracing
- ATKT: Enhancing Knowledge Tracing via Adversarial Training
- GKT: Graph-based knowledge tracing: modeling student proficiency using graph neural network
- SAKT: A self-attentive model for knowledge tracing
- SAINT: Towards an appropriate query, key, and value computation for knowledge tracing
- AKT: Context-aware attentive knowledge tracing
- HawkesKT: Temporal Cross-Effects in Knowledge Tracing
- IEKT: Tracing Knowledge State with Individual Cognition and Acquisition Estimation
- SKVMN: Knowledge Tracing with Sequential Key-Value Memory Networks
- LPKT: Learning Process-consistent Knowledge Tracing
- QIKT: Improving Interpretability of Deep Sequential Knowledge Tracing Models with Question-centric Cognitive Representations
- RKT: Relation-aware Self-attention for Knowledge Tracing
- DIMKT: Assessing Student's Dynamic Knowledge State by Exploring the Question Difficulty Effect
- ATDKT: Enhancing Deep Knowledge Tracing with Auxiliary Tasks
- simpleKT: A Simple but Tough-to-beat Baseline for Knowledge Tracing
- SparseKT: Towards Robust Knowledge Tracing Models via K-sparse Attention
- FoLiBiKT: Forgetting-aware Linear Bias for Attentive Knowledge Tracing
- DTransformer: Tracing Knowledge Instead of Patterns: Stable Knowledge Tracing with Diagnostic Transformer
- stableKT: Enhancing Length Generalization for Attention Based Knowledge Tracing Models with Linear Biases
- extraKT: Extending Context Window of Attention Based Knowledge Tracing Models via Length Extrapolation
- ReKT: Revisiting Knowledge Tracing: A Simple and Powerful Model
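To make the model list above concrete, here is a minimal, self-contained sketch of the classic DKT architecture in PyTorch: an LSTM reads a student's (skill, response) interaction sequence and predicts the probability of answering each skill correctly at the next step. This sketch follows the original DKT paper rather than pyKT's internal implementation; the interaction-embedding input encoding (a common implementation choice) and all hyperparameters are illustrative assumptions.

```python
# A minimal DKT sketch, not pyKT's internal implementation.
import torch
import torch.nn as nn

class DKT(nn.Module):
    def __init__(self, num_skills: int, emb_size: int = 100, hidden_size: int = 100):
        super().__init__()
        self.num_skills = num_skills
        # Each interaction is a (skill, response) pair, encoded as one of
        # 2 * num_skills tokens: skill_id + num_skills * response.
        self.interaction_emb = nn.Embedding(2 * num_skills, emb_size)
        self.lstm = nn.LSTM(emb_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, num_skills)  # one logit per skill

    def forward(self, skills: torch.Tensor, responses: torch.Tensor) -> torch.Tensor:
        # skills, responses: (batch, seq_len) integer tensors
        x = self.interaction_emb(skills + self.num_skills * responses)
        h, _ = self.lstm(x)
        # Probability of answering each skill correctly at the next step.
        return torch.sigmoid(self.out(h))

# Usage: predict next-step correctness for a toy batch.
model = DKT(num_skills=50)
skills = torch.randint(0, 50, (2, 10))
responses = torch.randint(0, 2, (2, 10))
preds = model(skills, responses)  # shape: (2, 10, 50)
```

During training, the prediction at step t is typically matched against the response at step t+1 for the skill actually attempted, using a binary cross-entropy loss.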
We now have a paper you can cite for our pyKT library:
```bibtex
@inproceedings{liupykt2022,
  title={pyKT: A Python Library to Benchmark Deep Learning based Knowledge Tracing Models},
  author={Liu, Zitao and Liu, Qiongqiong and Chen, Jiahao and Huang, Shuyan and Tang, Jiliang and Luo, Weiqi},
  booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2022}
}
```