MNIST classification using Multi-Layer Perceptron (MLP) with 2 hidden layers. Some weight-initializers and batch-normalization are implemented.
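The following is a minimal illustrative sketch (not this repository's actual code) of the idea described above: a 2-hidden-layer MLP on MNIST with explicit weight initializers and batch normalization. It assumes tf.keras; the layer sizes and chosen initializers are arbitrary for illustration.

```python
# Illustrative sketch only (assumes tf.keras; the repo may use a different framework).
# 2-hidden-layer MLP for MNIST with explicit weight initializers and batch normalization.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, kernel_initializer="glorot_uniform", input_shape=(784,)),
    tf.keras.layers.BatchNormalization(),   # normalize pre-activations of hidden layer 1
    tf.keras.layers.Activation("relu"),
    tf.keras.layers.Dense(128, kernel_initializer="he_normal"),
    tf.keras.layers.BatchNormalization(),   # normalize pre-activations of hidden layer 2
    tf.keras.layers.Activation("relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=128, validation_data=(x_test, y_test))
```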
A Neural Network Implementation in Java
Xavier is a small object-oriented XML library for Lazarus and Delphi
A Simple JavaScript Betting Game to Play With Your Friends.
JavaScript tool to manage your incomes and expenses.
Sample showing how the reciprocal of the input dimension (input_dim for an FC layer; prod(tensor.shape[1:]) for a CNN) could be used to initialize NN weights, rather than the sqrt(input_dim_reciprocal) hinted at by Xavier :dragon: (see the sketch below)
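Below is a hedged NumPy sketch of the comparison the description hints at: scaling random weights by 1/fan_in versus the Xavier-style sqrt(1/fan_in). The layer shape and batch size are assumptions chosen only to show the effect on output scale.

```python
# Illustrative sketch only: contrasts 1/fan_in weight scaling with the usual
# Xavier-style sqrt(1/fan_in) scaling. Layer shape and batch size are assumptions.
import numpy as np

def fan_in(shape):
    # FC layer: fan_in is the input dim; conv kernel: prod(shape[1:]).
    return int(np.prod(shape[1:])) if len(shape) > 2 else shape[0]

rng = np.random.default_rng(0)
shape = (784, 256)  # hypothetical FC layer: 784 inputs, 256 units

w_reciprocal = rng.standard_normal(shape) * (1.0 / fan_in(shape))         # 1/fan_in scaling
w_xavier     = rng.standard_normal(shape) * np.sqrt(1.0 / fan_in(shape))  # sqrt(1/fan_in) scaling

x = rng.standard_normal((64, shape[0]))  # random input batch with unit variance
print("output std with 1/fan_in      :", (x @ w_reciprocal).std())  # shrinks activations
print("output std with sqrt(1/fan_in):", (x @ w_xavier).std())      # keeps them near 1
```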
🔩 Automated script to set up and configure your NVIDIA Jetson [Nano, Xavier, TX2i, TX2, TX1, TK1]. This script runs different modules to update, fix, and patch the kernel, install ROS, and other...
Multi-booting for the Jetson AGX Xavier with NVMe SSD
The purpose of this project is to apply mediapipe to more AI chips.
A codec (encode/decode) library wrapping the Jetson Multimedia API, modified from https://github.com/jocover/jetson-ffmpeg; it is not integrated into ffmpeg and can be used standalone.