Deep Learning from Scratch book: http://www.hanbit.co.kr/store/books/look.php?p_code=B8475831198
These are my notes and code written while studying the book Deep Learning from Scratch (밑바닥부터 시작하는 딥러닝).
- hungry.py (print)
- man.py (class)
- numpy_intro (matrix operations)
- matplotlib_intro (visualization)
- logic_gate
- activation function
- three layer neuralnet
- MNIST example and batch
- loss function
- gradient descent
- neuralnet training
- two-layer net
- layer naive
- activation layers propagation
- backpropagation
- gradient check
- train neuralnet
- optimizer
- optimizer compare naive
- optimizer compare MNIST
- weight initialize, activation histogram
- weight initialize compare
- batch norm test
- overfit weight decay
- overfit dropout
- hyperparameter optimization
- warmingup
- simple convnet
- train convnet
- gradient check
- visualize filter
- apply filter
- deep convnet – VGGNet
- train deepnet – VGGNet
- misclassified MNIST
- half float network
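As a taste of the loss function and gradient descent entries above, here is a minimal sketch (my own illustration, not code from this repository) of numerical-gradient descent in the NumPy style the book uses:

```python
import numpy as np

def numerical_gradient(f, x, h=1e-4):
    # central-difference approximation of the gradient of f at x
    grad = np.zeros_like(x)
    for i in range(x.size):
        tmp = x[i]
        x[i] = tmp + h
        fxh1 = f(x)
        x[i] = tmp - h
        fxh2 = f(x)
        grad[i] = (fxh1 - fxh2) / (2 * h)
        x[i] = tmp  # restore the original value
    return grad

def gradient_descent(f, x0, lr=0.1, steps=100):
    # repeatedly step against the (numerical) gradient
    x = x0.astype(float).copy()
    for _ in range(steps):
        x -= lr * numerical_gradient(f, x)
    return x

# minimize f(x, y) = x^2 + y^2; the minimum is at (0, 0)
f = lambda x: np.sum(x ** 2)
print(gradient_descent(f, np.array([3.0, -4.0])))
```

With a learning rate of 0.1 each coordinate shrinks by a factor of roughly 0.8 per step, so after 100 steps the result is very close to the origin.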
Frequently used functions and utilities from the code above, refactored into shared modules.
Datasets used for training.
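The shared utilities mentioned above would plausibly include activations and a loss. A hedged sketch of such a utility module, assuming the common sigmoid/softmax/cross-entropy trio from the book (function names are my assumption, not necessarily the repository's actual API):

```python
import numpy as np

def sigmoid(x):
    # logistic sigmoid activation
    return 1 / (1 + np.exp(-x))

def softmax(x):
    # subtract the row-wise max for numerical stability before exponentiating
    x = x - np.max(x, axis=-1, keepdims=True)
    e = np.exp(x)
    return e / np.sum(e, axis=-1, keepdims=True)

def cross_entropy_error(y, t):
    # average cross-entropy over a batch; t is one-hot encoded
    eps = 1e-7  # avoid log(0)
    return -np.sum(t * np.log(y + eps)) / y.shape[0]

print(softmax(np.array([[1.0, 2.0, 3.0]])))
```

Each row of the softmax output sums to 1, which is what the cross-entropy loss expects.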