From linear regression towards neural networks...
C++ · Updated Apr 30, 2024
SIMD implementations of neural-network kernels in plain CPU code, SSE, and AVX. I created this project to verify that the AVX and SSE code for my neural network works correctly and to compare its performance against regular CPU code. The focus is on operations such as the dot product, the Adam optimizer, and gradient updates.
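To illustrate the kind of kernel such a comparison covers, here is a minimal sketch of an AVX dot product with a scalar tail. It is not the repository's actual code: the `dot_avx` name is hypothetical, and it assumes a target with AVX2/FMA support.

```cpp
#include <immintrin.h>
#include <cstddef>

// Hypothetical sketch of an AVX dot product with FMA, plus a scalar tail.
// Compile with AVX2/FMA enabled, e.g. g++ -O2 -mavx2 -mfma.
float dot_avx(const float* a, const float* b, std::size_t n) {
    __m256 acc = _mm256_setzero_ps();
    std::size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);   // load 8 floats from a
        __m256 vb = _mm256_loadu_ps(b + i);   // load 8 floats from b
        acc = _mm256_fmadd_ps(va, vb, acc);   // acc += va * vb (fused)
    }
    // Horizontal sum of the 8 accumulator lanes.
    float lanes[8];
    _mm256_storeu_ps(lanes, acc);
    float sum = 0.0f;
    for (int k = 0; k < 8; ++k) sum += lanes[k];
    // Scalar tail for the remaining n % 8 elements.
    for (; i < n; ++i) sum += a[i] * b[i];
    return sum;
}
```

A plain scalar loop over the same arrays serves as the "regular CPU code" baseline in such a benchmark; the vectorized version processes eight floats per iteration.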
Designed a robotic system that uses neural-network inference. Created the project idea, collected a data set for classification, and justified the network design choices with a technical analysis of accuracy and speed on the target system.
A simple neural network.
ADAS is short for Adaptive Step Size. Unlike optimizers that merely normalize the derivative, it fine-tunes the step size itself, making step-size scheduling unnecessary and achieving state-of-the-art training performance.
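For contrast, here is a minimal sketch of a plain Adam update, the kind of optimizer that normalizes the derivative via running moment estimates while leaving the global step size fixed. This is an assumed illustration, not code from the ADAS repository, and the `adam_step` helper name is hypothetical.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Sketch of a standard Adam step (hypothetical adam_step helper).
// m and v hold the running first and second moment estimates;
// t is the 1-based step count used for bias correction.
void adam_step(std::vector<float>& w, const std::vector<float>& g,
               std::vector<float>& m, std::vector<float>& v, int t,
               float lr = 1e-3f, float b1 = 0.9f,
               float b2 = 0.999f, float eps = 1e-8f) {
    for (std::size_t i = 0; i < w.size(); ++i) {
        m[i] = b1 * m[i] + (1.0f - b1) * g[i];                // first moment
        v[i] = b2 * v[i] + (1.0f - b2) * g[i] * g[i];         // second moment
        float mhat = m[i] / (1.0f - std::pow(b1, (float)t));  // bias-corrected
        float vhat = v[i] / (1.0f - std::pow(b2, (float)t));
        w[i] -= lr * mhat / (std::sqrt(vhat) + eps);          // normalized update
    }
}
```

The per-parameter denominator `sqrt(vhat) + eps` is what "normalizes the derivative" here; the global learning rate `lr` stays fixed, and per the description above that fixed step size is the knob ADAS instead adapts.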