These are machine learning examples I have created or adapted to demonstrate my understanding of the field. - Hamilton
- I created a CNN (Convolutional Neural Network) model to classify Street View House Numbers (SVHN)
- This was the capstone project for the Coursera Getting Started with TensorFlow 2 class
- Current students of the Coursera Getting Started with TensorFlow 2 class should not look at this example
- CNN_SVHN_TF2_Capstone_Project_by_Hamilton_2020_12_3.pdf - .ipynb
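The kind of model this project builds can be sketched in a few lines of Keras. This is a minimal illustration for 32x32 RGB SVHN digits, not the capstone's exact architecture; the layer sizes are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_svhn_cnn(input_shape=(32, 32, 3), num_classes=10):
    """A small illustrative CNN for SVHN digit classification."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),                       # regularization before the classifier head
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_svhn_cnn()
```

From here, `model.fit(x_train, y_train)` on the SVHN arrays trains the classifier.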
- I created this model, which translates from English to German
- This was the capstone project for the Coursera Customizing Your Models with TensorFlow 2 class
- This project taught us Encoder/Decoder seq2seq architectures using LSTMs (Long Short-Term Memory networks)
- This project was for learning purposes only, not production use
- Current students of the Customizing Your Models with TensorFlow 2 class should not look at this example
- Neural_Translation_Model_Capstone_Project_by_Hamilton_2021_1_12.pdf - .ipynb
- Translates Russian to English
- Implements a Transformer model, including attention, in Keras/TensorFlow, along with the BERT subword tokenizer
- Transformers are the state of the art for Natural Language Processing in Machine Learning
- Russian_Transformer_Model_for_Language_Translation_v2.pdf - .ipynb
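The attention mechanism at the heart of a Transformer is scaled dot-product attention. A minimal TensorFlow sketch of that one operation (shapes and the mask convention here are the common textbook ones, not necessarily the notebook's exact code):

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    """q, k, v: (batch, seq_len, depth). mask: 1 where positions are hidden."""
    scores = tf.matmul(q, k, transpose_b=True)   # (batch, seq_q, seq_k) similarity
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scores = scores / tf.math.sqrt(dk)           # scale to keep softmax gradients stable
    if mask is not None:
        scores += mask * -1e9                    # drive masked positions to ~0 weight
    weights = tf.nn.softmax(scores, axis=-1)     # attention distribution over keys
    return tf.matmul(weights, v), weights

q = k = v = tf.random.normal((1, 4, 8))          # self-attention toy input
out, weights = scaled_dot_product_attention(q, k, v)
```

Multi-head attention runs several of these in parallel over learned linear projections of q, k, and v.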
- Translates the same 5 English strings (all accurately) as my Capstone project above from English to German
- Translates using the HuggingFace pipeline, and with slightly lower level calls to the T5 and MarianMT models
- Language_Translation_Using_the_T5_Model_And_HuggingFace_Framework.ipynb
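The pipeline call is only a few lines. A sketch using the small T5 checkpoint (`t5-small` is assumed here to keep the download modest; the notebook also exercises MarianMT):

```python
from transformers import pipeline

# English -> German translation via the HuggingFace pipeline API.
translator = pipeline("translation_en_to_de", model="t5-small")
result = translator("Machine learning is fun.", max_length=40)
text = result[0]["translation_text"]
print(text)
```

The lower-level route tokenizes manually, calls `model.generate(...)`, and decodes the output ids, which is what the pipeline does under the hood.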
- This example downloads the BERT (Bidirectional Encoder Representations from Transformers) model from tfhub.dev
- It also serves the model for inference via TensorFlow Serving
- Trained the BERT model on the IMDB Movie Review dataset to make positive and negative sentiment classification predictions
- Sentiment_Analysis_Fine_Tuning_a_BERT_model_on_IMDB.ipynb
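For a quick feel of the end result, here is an inference-only sketch using the HuggingFace sentiment pipeline (with its default SST-2 checkpoint). This stands in for the notebook's heavier TF Hub fine-tuning run; it is not the same model.

```python
from transformers import pipeline

# Default checkpoint is a DistilBERT fine-tuned for binary sentiment.
classifier = pipeline("sentiment-analysis")
result = classifier("This movie was an absolute delight from start to finish.")
label, score = result[0]["label"], result[0]["score"]
print(label, score)
```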
- I adapted these short examples using the HuggingFace API
- Question_Answering_Models_BERT_Roberta_Electra_Pretrained_on_Squad2.ipynb
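A minimal question-answering call with one of the SQuAD2-pretrained checkpoints the notebook compares (`deepset/roberta-base-squad2`); the question and context strings are just a toy illustration.

```python
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/roberta-base-squad2")
result = qa(
    question="What dataset were the models pretrained on?",
    context="These question answering models were pretrained on SQuAD2.",
)
print(result["answer"], result["score"])
```

The model returns a span from the context plus a confidence score; SQuAD2-trained models can also decline to answer when no span fits.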
- This example downloads the BERT (Bidirectional Encoder Representations from Transformers) model from tfhub.dev
- It fine-tunes the model on one (any one) of the GLUE (General Language Understanding Evaluation) datasets
- More info about the GLUE datasets can be found at: https://gluebenchmark.com
- This example is copied from: https://www.tensorflow.org/tutorials/text/solve_glue_tasks_using_bert_on_tpu
- BERT_Glue_E2E.ipynb
- Segments dogs and cats out of images taken from the Oxford-IIIT Pet Dataset
- Image_Segmentation_Using_U-Net.pdf - .ipynb
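The defining feature of U-Net is the encoder/decoder shape with skip connections that concatenate encoder features into the decoder. A deliberately tiny Keras sketch (depths and filter counts are illustrative; 3 output classes matches the pet/background/border labels of the Oxford-IIIT Pet masks):

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_tiny_unet(input_shape=(128, 128, 3), num_classes=3):
    inputs = layers.Input(shape=input_shape)
    # Encoder: downsample while growing the channel count.
    c1 = layers.Conv2D(16, 3, activation="relu", padding="same")(inputs)
    p1 = layers.MaxPooling2D()(c1)
    c2 = layers.Conv2D(32, 3, activation="relu", padding="same")(p1)
    p2 = layers.MaxPooling2D()(c2)
    # Bottleneck.
    b = layers.Conv2D(64, 3, activation="relu", padding="same")(p2)
    # Decoder: upsample and concatenate the matching encoder features (skips).
    u1 = layers.Concatenate()([layers.UpSampling2D()(b), c2])
    c3 = layers.Conv2D(32, 3, activation="relu", padding="same")(u1)
    u2 = layers.Concatenate()([layers.UpSampling2D()(c3), c1])
    c4 = layers.Conv2D(16, 3, activation="relu", padding="same")(u2)
    # Per-pixel class probabilities.
    outputs = layers.Conv2D(num_classes, 1, activation="softmax")(c4)
    return Model(inputs, outputs)

unet = build_tiny_unet()
```

The output has the same spatial size as the input, with one probability per class per pixel.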
- Compares Mask R-CNN with ResNet V2 and EfficientDet D7 for Object Detection
- Uses TensorFlow 2 with TF Hub
- Object_Detection_Inference_Using_TF2_and_TFHub.pdf - .ipynb
I set up this demo, with a modestly improved UI, on Google Cloud using Docker
See a Blenderbot example conversation
Try it out at: https://blenderbot90m-wg5fqcbcta-uw.a.run.app (it can take 30 seconds for Google Cloud to load the Docker image and start up)
Note: this demo uses the small, 90-million-parameter model. The much larger 2.7-billion- and 9.4-billion-parameter models produce better conversations but require more expensive hardware.
Here's the paper on BlenderBot, developed by the Facebook AI team: https://arxiv.org/pdf/2004.13637.pdf
For comparison, here's the Mitsuku bot (Kuki) from Pandorabots, which I did not find as good: https://chat.kuki.ai/