Mostly implemented with tf.layers; files refactored with tf.contrib.slim will be added later.
- An example of writing a deep learning model in OOP style (with mnist)
- Tutorial of implementing DL model by OOP.ipynb
- An example of applying dropout (with mnist)
- Tutorial of implementing Drop out.ipynb
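As a quick reference, the training-time behavior of dropout can be sketched in plain NumPy. This is inverted dropout (kept units are scaled by `1/keep_prob` during training so inference needs no rescaling); the `dropout` helper and its parameter names are illustrative, not code from the notebook:

```python
import numpy as np

def dropout(x, keep_prob, training, rng):
    # Inverted dropout: at train time, zero each unit with probability
    # (1 - keep_prob) and scale survivors by 1/keep_prob, so the
    # expected activation matches inference, where x passes through.
    if not training:
        return x
    mask = rng.random(x.shape) < keep_prob
    return x * mask / keep_prob

rng = np.random.default_rng(0)
x = np.ones((4, 8))
y = dropout(x, keep_prob=0.5, training=True, rng=rng)
# Surviving units are scaled to 2.0; dropped units are 0.
```

Note that `tf.layers.dropout` takes a drop `rate` (i.e. `1 - keep_prob`) and applies the same inverted scaling internally.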
- An example of applying batch normalization (with mnist)
- Tutorial of implementing Batch normalization.ipynb
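The core transformation of batch normalization can be summarized in a few lines of NumPy. This sketch covers only the training-time path (batch statistics); the moving averages used at inference, which the notebook handles via TensorFlow's update ops, are omitted:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch dimension, then apply the
    # learned scale (gamma) and shift (beta). eps avoids division by
    # zero when a feature has (near-)zero variance.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 4))
out = batch_norm(x, gamma=1.0, beta=0.0)
# Each feature of `out` has (approximately) zero mean and unit variance.
```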
- An example of applying batch normalization and dropout together (with fashion mnist)
- Tutorial of implementing Batch normalization and Drop out.ipynb
- An example of transfer learning (with mnist)
- Tutorial of implementing Transfer learning.ipynb
- Vggnet, Googlenet, Resnet
- A simple implementation of the class activation map proposed in "Learning Deep Features for Discriminative Localization" (with mnist clutter)
- Tutorial of implementing Class activation map.ipynb
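The CAM computation itself is just a weighted sum over the channels of the last convolutional feature maps, using the target class's weights from the final dense layer. A minimal NumPy sketch (function name and shapes are illustrative):

```python
import numpy as np

def class_activation_map(feature_maps, class_weights):
    # feature_maps: (H, W, C) activations of the last conv layer
    # class_weights: (C,) weights connecting global-average-pooled
    #                channels to the target class logit
    # Returns an (H, W) heatmap highlighting class-relevant regions.
    return np.tensordot(feature_maps, class_weights, axes=([2], [0]))

fm = np.arange(12, dtype=float).reshape(2, 2, 3)
w = np.array([0.0, 1.0, 0.0])  # one-hot weights pick out channel 1
cam = class_activation_map(fm, w)
```

In practice the heatmap is then upsampled to the input resolution and overlaid on the image.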
- A simple implementation of Word2Vec's skip-gram
- A CBOW implementation for Word2Vec will be added later
- Tutorial of implementing Word2Vec.ipynb
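The data side of skip-gram is easy to show concretely: each word is paired with every word within a window around it, and the model learns to predict context from center. A small sketch of pair generation (the helper name is illustrative):

```python
def skipgram_pairs(tokens, window=2):
    # Yield (center, context) training pairs for skip-gram: each word
    # predicts the words within `window` positions on either side.
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(["a", "b", "c"], window=1)
# → [("a", "b"), ("b", "a"), ("b", "c"), ("c", "b")]
```

CBOW inverts the direction: the averaged context predicts the center word.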
- Training a model that predicts the positive/negative sentiment of variable-length English words with a character-level GRU
- Tutorial of implementing Sequence classification with RNN series.ipynb
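A detail worth noting for variable-length sequences: after padding a batch to a common length, the classifier should read each sequence's hidden state at its *true* last time step, not at the padded end. A NumPy sketch of that indexing (names and shapes are illustrative, not the notebook's code):

```python
import numpy as np

def last_relevant_output(outputs, lengths):
    # outputs: (batch, max_time, hidden) RNN outputs over padded input
    # lengths: (batch,) true sequence lengths
    # Select each sequence's output at index lengths - 1, so padding
    # never leaks into the classification head.
    batch = outputs.shape[0]
    return outputs[np.arange(batch), lengths - 1]

outputs = np.arange(12, dtype=float).reshape(2, 3, 2)
lengths = np.array([2, 3])  # first sequence has 1 padded step
last = last_relevant_output(outputs, lengths)
```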
- A simple implementation of the Variational Auto-Encoder introduced in "Auto-Encoding Variational Bayes" (with mnist)
- Tutorial of implementing Variational Auto-Encoder.ipynb
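Two pieces of the paper are compact enough to sketch directly: the reparameterization trick (sampling as a differentiable function of the encoder's outputs) and the closed-form KL term against a standard normal prior. Function names here are illustrative:

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # z = mu + sigma * eps with eps ~ N(0, I), so the sample is a
    # deterministic function of (mu, log_var) plus external noise and
    # gradients can flow through mu and log_var.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL(q(z|x) || N(0, I)) per example, as derived in
    # the appendix of "Auto-Encoding Variational Bayes".
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var), axis=1)

rng = np.random.default_rng(0)
mu = np.zeros((2, 3))
log_var = np.zeros((2, 3))
z = reparameterize(mu, log_var, rng)
kl = kl_to_standard_normal(mu, log_var)
# When q already equals the prior (mu=0, log_var=0), the KL term is 0.
```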
- A simple implementation of the Generative Adversarial Net introduced in "Generative Adversarial Nets" (with mnist)
- Tutorial of implementing Generative Adversarial Net.ipynb
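The objective is the heart of the paper and fits in a few lines. This NumPy sketch computes the discriminator loss and the non-saturating generator loss (the `-log D(G(z))` variant the paper recommends in practice) from discriminator probabilities; it is a reference for the math, not the notebook's TensorFlow code:

```python
import numpy as np

def gan_losses(d_real, d_fake, eps=1e-8):
    # d_real: D(x) on real samples, d_fake: D(G(z)) on generated ones,
    # both probabilities in (0, 1). eps guards log(0).
    # Discriminator maximizes log D(x) + log(1 - D(G(z)));
    # generator maximizes log D(G(z)) (non-saturating form).
    d_loss = -np.mean(np.log(d_real + eps) + np.log(1.0 - d_fake + eps))
    g_loss = -np.mean(np.log(d_fake + eps))
    return d_loss, g_loss

# A perfect discriminator: near-zero d_loss, large g_loss.
d_loss, g_loss = gan_losses(np.array([1.0]), np.array([0.0]))
```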
- A simple implementation of the Conditional Generative Adversarial Net introduced in "Conditional Generative Adversarial Nets" (with mnist)
- Tutorial of implementing Conditional GAN.ipynb
- A simple implementation of the model introduced in "Convolutional Neural Networks for Sentence Classification" (with naver movie review)
- Sentence classification by MorphConv.ipynb