Example 📓 Jupyter notebooks that demonstrate how to build, train, and deploy machine learning models using 🧠 Amazon SageMaker.
Updated Aug 15, 2024 · Jupyter Notebook
Example notebooks for working with SageMaker Studio Lab. Sign up for an account at the link below!
Jupyter notebooks demonstrating setup and use of the R-ArcGIS bridge. The repo includes datasets required to run the Jupyter notebooks.
The notebook walks through the steps to reproduce the results of the publication "Is Space-Time Attention All You Need for Video Understanding?"
A trio of Google Colab notebooks (.ipynb) for training a GPT-2 (127M) model from scratch using gpt-2-simple (useful for other/non-English languages).
A basic machine learning model, built in a Python Jupyter notebook, that classifies tweets into two categories: racist/sexist and non-racist/sexist.
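A classifier like the one described can be sketched with scikit-learn: TF-IDF features feeding a logistic regression. The inline tweets and the label convention (1 = racist/sexist, 0 = non-racist/sexist) below are illustrative assumptions, not the repository's actual data.

```python
# Hedged sketch of a tweet classifier: TF-IDF features + logistic regression.
# The toy dataset and label convention here are assumptions for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tweets = [
    "all of them are inferior and stupid",  # toy offensive example
    "women cannot do this job",             # toy offensive example
    "what a lovely sunny day",              # toy neutral example
    "just finished a great book",           # toy neutral example
]
labels = [1, 1, 0, 0]  # 1 = racist/sexist, 0 = non-racist/sexist

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(tweets, labels)

preds = clf.predict(["enjoying a lovely book today", "they are inferior"])
print(preds)
```

A real notebook would swap the toy list for a labeled tweet dataset and evaluate on a held-out split.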
This repository includes a Jupyter notebook with instructions to train/fine-tune a Tesseract OCR model. It can be used as-is for better slashed-zero recognition (especially with the Consolas font).
This repository contains mini projects in machine learning, provided as notebook files.
@yamilesquivel: drawings and effects applied over images.
A series of Jupyter notebooks that walk you through the fundamentals of Machine Learning and Deep Learning in Python using Scikit-Learn, Keras and TensorFlow 2.
Train a Basic Model on the CIFAR-10 Dataset 🎨🖥️. Uses the CIFAR-10 dataset of 60,000 32x32 color images in 10 classes; demonstrates loading with torchvision and training with pretrained models such as ResNet18, AlexNet, VGG16, DenseNet161, and Inception. A notebook is available for experimentation.
Jupyter notebook for creating a Research Object in ROHub that defines a checklist for master's students.
I gave a training on ML and DL; this repository contains the notes from that training as Jupyter notebooks.
This notebook demonstrates a neural network implemented in pure NumPy, without TensorFlow or PyTorch. Trained on the MNIST dataset, it uses an input layer (784 neurons), two hidden layers (132 and 40 neurons), and an output layer (10 neurons) with sigmoid activation.
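The forward pass of the architecture described can be sketched in a few lines of NumPy. This assumes sigmoid activation at every layer and small random initial weights; the actual notebook's initialization and training loop may differ.

```python
import numpy as np

# Sketch of the described architecture: 784 -> 132 -> 40 -> 10,
# sigmoid at every layer (assumption based on the description).
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
sizes = [784, 132, 40, 10]
# One (out, in) weight matrix and one bias vector per layer.
weights = [rng.standard_normal((m, n)) * 0.01
           for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros((m, 1)) for m in sizes[1:]]

def forward(x):
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

x = rng.standard_normal((784, 1))  # stands in for a flattened 28x28 image
y = forward(x)
print(y.shape)  # (10, 1)
```

Training would add backpropagation of the loss gradient through these same matrices; only the inference path is shown here.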