A machine learning project that tests how well GPT-2 can learn Python syntax.
A live demo is available (click "Restart Space" if it has timed out due to inactivity).
The model is a GPT-2 Large fine-tuned on roughly 40 MB of Python code scraped from GitHub, trained for approximately 16 GPU hours on 4x NVIDIA GTX 1080 Ti cards. The results show that the model successfully learns Python syntax and, given more training time, compute, and data, should be able to provide accurate code autocompletion.
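As a rough sketch of how such a fine-tuned model could be queried for completions with the Hugging Face `transformers` library (the checkpoint path, prompt, and generation settings below are illustrative assumptions, not taken from this repo):

```python
# Minimal sketch: autocompleting Python code with a fine-tuned GPT-2 model.
# "path/to/fine-tuned-gpt2-large" is a hypothetical checkpoint path; replace
# it with the actual fine-tuned weights.

from transformers import GPT2LMHeadModel, GPT2TokenizerFast

checkpoint = "path/to/fine-tuned-gpt2-large"  # hypothetical
tokenizer = GPT2TokenizerFast.from_pretrained(checkpoint)
model = GPT2LMHeadModel.from_pretrained(checkpoint)

# A partial Python snippet the model should continue.
prompt = "def fibonacci(n):\n"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation of the code prompt.
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    top_p=0.95,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```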
A notebook tutorial that goes over the finer details of this project can be found here.
License: MIT