Built my own deep learning framework, in the spirit of TensorFlow/PyTorch, to learn the fundamentals better.
Implemented backpropagation and derived the gradients by hand. After training, results are comparable to PyTorch and TensorFlow on the same tasks.
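A minimal sketch of what hand-written backprop looks like (hypothetical code, not the framework's actual implementation): one dense layer with a Sigmoid output trained on mean squared error, with gradients derived via the chain rule and applied by plain gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))          # batch of 8 samples, 3 features
y = rng.random(size=(8, 1))          # targets in [0, 1]

W = rng.normal(size=(3, 1)) * 0.1    # weights
b = np.zeros((1, 1))                 # bias
lr, losses = 0.5, []

for step in range(200):
    # forward pass
    z = X @ W + b
    a = 1.0 / (1.0 + np.exp(-z))     # Sigmoid activation
    losses.append(np.mean((a - y) ** 2))

    # backward pass: chain rule through MSE then Sigmoid
    dz = (2.0 * (a - y) / len(X)) * (a * (1.0 - a))
    dW = X.T @ dz                    # gradient w.r.t. weights
    db = dz.sum(axis=0, keepdims=True)

    # gradient-descent update
    W -= lr * dW
    b -= lr * db
```

The loss recorded in `losses` should decrease steadily, which is the basic sanity check that the hand-derived gradients are correct.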
- Supports ReLU and Sigmoid activation functions
- Supports binary cross-entropy and mean squared error loss functions
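The supported activations and losses, together with the derivatives backprop needs, could be defined along these lines (a hypothetical sketch; the names and signatures are illustrative, not the framework's API):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def relu_grad(z):
    # derivative of ReLU: 1 where z > 0, else 0
    return (z > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)

def mse(pred, target):
    return np.mean((pred - target) ** 2)

def mse_grad(pred, target):
    return 2.0 * (pred - target) / pred.size

def binary_cross_entropy(pred, target, eps=1e-12):
    # clip to avoid log(0)
    p = np.clip(pred, eps, 1.0 - eps)
    return -np.mean(target * np.log(p) + (1 - target) * np.log(1 - p))

def binary_cross_entropy_grad(pred, target, eps=1e-12):
    p = np.clip(pred, eps, 1.0 - eps)
    return (p - target) / (p * (1.0 - p)) / p.size
```

Pairing each function with its gradient is what lets the backward pass walk the chain rule mechanically, the same way autograd does in PyTorch or TensorFlow.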
Bonus
- Implemented dynamic learning rate adjustment
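One common form of dynamic learning rate adjustment is a "reduce on plateau" rule: shrink the rate whenever the loss stops improving. A minimal sketch under that assumption (the class and its parameters are illustrative, not the framework's actual interface):

```python
class PlateauLR:
    """Halve the learning rate after `patience` steps without a new best loss."""

    def __init__(self, lr=0.1, factor=0.5, patience=3, min_lr=1e-5):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.min_lr = min_lr
        self.best = float("inf")
        self.bad_steps = 0

    def step(self, loss):
        if loss < self.best:
            # new best loss: reset the plateau counter
            self.best = loss
            self.bad_steps = 0
        else:
            self.bad_steps += 1
            if self.bad_steps >= self.patience:
                # plateau detected: decay the rate, but keep a floor
                self.lr = max(self.lr * self.factor, self.min_lr)
                self.bad_steps = 0
        return self.lr
```

Usage: call `step(loss)` once per epoch and use the returned rate for the next round of updates, so training takes large steps early and smaller ones once progress stalls.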