sergiosolorzano/qmnist-unity-sentis-nn


Description

This Unity app runs inference on a convolutional neural network (CNN), trained in a Jupyter notebook, to recognize a user's handwritten digits.



Setup

The project includes a Jupyter notebook in the Train-MNIST directory with a CNN trained to recognize handwritten digits on the QMNIST dataset. Training runs on the GPU if CUDA is available, otherwise on the CPU. Using PyTorch's ONNX export module, the trained CNN is converted to ONNX format, which Unity's Sentis neural network package can use for inference. I exported the trained model into Unity3D to predict the user's handwritten digits.
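On the Unity side, the exported .onnx file dropped into the project's Assets folder is imported by Sentis as a ModelAsset that a script can load at runtime. The following is a minimal sketch of that loading step, assuming the Sentis 1.x API (class and method names may differ in other Sentis versions); the class and field names are illustrative, not the repository's actual script:

```csharp
using UnityEngine;
using Unity.Sentis;

// Illustrative sketch: loads the ONNX model exported from the training notebook.
public class DigitModelLoader : MonoBehaviour
{
    // Assign the imported .onnx asset in the Inspector.
    [SerializeField] ModelAsset modelAsset;

    IWorker worker;

    void Start()
    {
        // Deserialize the asset into a runtime Model and create a worker to execute it.
        Model runtimeModel = ModelLoader.Load(modelAsset);
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, runtimeModel);
    }

    void OnDestroy()
    {
        // Workers hold native/GPU resources and should be disposed explicitly.
        worker?.Dispose();
    }
}
```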

Capturing the user's handwriting and running model inference on Sentis

The user's hand movements are tracked in world space with a Line Renderer GameObject as the user writes on screen. The writing is rendered to a camera with a Render Texture. We read the pixel data from the Render Texture into a Texture2D, convert the Texture2D into a tensor, and pass that tensor to Sentis as the model's input for inference.

The model's predicted value is finally displayed to the user.
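A sketch of that capture-to-prediction path is below, assuming the Sentis 1.x API and a QMNIST-style model that expects a 28x28 single-channel input; the class, field, and method names are illustrative rather than the repository's actual scripts:

```csharp
using UnityEngine;
using Unity.Sentis;

// Illustrative sketch: reads the user's drawing from the Render Texture and runs one inference pass.
public class DigitPredictor : MonoBehaviour
{
    [SerializeField] ModelAsset modelAsset;
    [SerializeField] RenderTexture drawingTexture; // the camera's Render Texture

    IWorker worker;

    void Start()
    {
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, ModelLoader.Load(modelAsset));
    }

    // Returns the digit with the highest score for the current drawing.
    public int Predict()
    {
        // Copy the Render Texture into a CPU-readable Texture2D, as described above.
        RenderTexture.active = drawingTexture;
        var tex = new Texture2D(drawingTexture.width, drawingTexture.height, TextureFormat.RGBA32, false);
        tex.ReadPixels(new Rect(0, 0, drawingTexture.width, drawingTexture.height), 0, 0);
        tex.Apply();
        RenderTexture.active = null;

        // Convert the Texture2D into a 28x28 single-channel tensor (width, height, channels).
        using TensorFloat input = TextureConverter.ToTensor(tex, 28, 28, 1);
        Destroy(tex);

        worker.Execute(input);

        // PeekOutput returns a tensor still owned by the worker; make it readable on the CPU.
        TensorFloat output = worker.PeekOutput() as TensorFloat;
        output.MakeReadable();
        float[] scores = output.ToReadOnlyArray();

        // Argmax over the 10 class scores gives the predicted digit.
        int best = 0;
        for (int i = 1; i < scores.Length; i++)
            if (scores[i] > scores[best]) best = i;
        return best;
    }

    void OnDestroy() => worker?.Dispose();
}
```

The predicted digit returned by a method like this is what the app displays back to the user.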

Demo video: handwritten_digit_recogniztion_mnist_ai_model_in_unity.Original.mp4

Inference and frame rate considerations

The app is set up so the user presses a button to run inference while gameplay pauses, because running the whole execution in a single frame would cause low or stuttering frame rates during gameplay. Unity Sentis provides a way to schedule the network a few layers at a time and peek at its output, balancing the frames gameplay needs against the resources inference uses; a sketch follows below. You can read more here.
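The sketch below illustrates that layer-by-layer approach as a coroutine. It assumes the Sentis 1.x API; the name of the layer-by-layer scheduling call has changed across releases (e.g. StartManualSchedule in Barracuda/early versions, ScheduleIterable in later Sentis versions), so ExecuteLayerByLayer here is an assumption that may need adjusting for your Sentis version, and the class and field names are illustrative:

```csharp
using System.Collections;
using UnityEngine;
using Unity.Sentis;

// Illustrative sketch: spreads model execution over several frames so gameplay keeps its frame rate.
public class FrameBudgetedInference : MonoBehaviour
{
    [SerializeField] ModelAsset modelAsset;
    [SerializeField] int layersPerFrame = 5; // tune to trade prediction latency against frame time

    IWorker worker;

    void Start()
    {
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, ModelLoader.Load(modelAsset));
    }

    // Start this as a coroutine when the user presses the predict button.
    public IEnumerator RunInference(TensorFloat input)
    {
        // Enumerator that advances the network one layer per MoveNext call
        // (method name assumed for Sentis 1.x; see the note above).
        IEnumerator schedule = worker.ExecuteLayerByLayer(input);

        int layersThisFrame = 0;
        while (schedule.MoveNext())
        {
            if (++layersThisFrame >= layersPerFrame)
            {
                layersThisFrame = 0;
                yield return null; // give the frame back to gameplay, continue next frame
            }
        }

        // All layers have executed; read the result as in the single-shot case.
        TensorFloat output = worker.PeekOutput() as TensorFloat;
        output.MakeReadable();
        Debug.Log("Scores: " + string.Join(", ", output.ToReadOnlyArray()));
    }

    void OnDestroy() => worker?.Dispose();
}
```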

Acknowledgements

Thanks to Unity for the Sentis quick start samples.

If you find this helpful you can buy me a coffee :)
