namkugkim/Study_deeplearning - DeepLearning Study

*** To-Do Lists
A site that collects SOTA papers together with their code: https://paperswithcode.com/sota
A machine learning cheat sheet page: https://ml-cheatsheet.readthedocs.io/en/latest/

Stanford CS 230 - Deep Learning cheatsheets
https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-convolutional-neural-networks
https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-recurrent-neural-networks
https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-deep-learning-tips-and-tricks
GANs (Stanford CS 236, Deep Generative Models): http://cs236.stanford.edu
Deep learning course materials from Université Paris-Saclay: https://m2dsupsdlclass.github.io/lectures-labs/

*** All about AutoEncoders
https://youtu.be/o_peo6U7IRM
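As a minimal companion to the lecture above, a sketch of a fully connected autoencoder in PyTorch; layer sizes are illustrative (flattened 28x28 inputs), not taken from the talk:

    import torch
    import torch.nn as nn

    # Minimal fully connected autoencoder for flattened 28x28 inputs; all sizes are illustrative.
    class AutoEncoder(nn.Module):
        def __init__(self, latent_dim=32):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, latent_dim))
            self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, 784), nn.Sigmoid())

        def forward(self, x):
            z = self.encoder(x)            # compress to the latent code
            return self.decoder(z), z      # reconstruct from the code

    model = AutoEncoder()
    x = torch.rand(16, 784)                    # stand-in batch of flattened images
    recon, z = model(x)
    loss = nn.functional.mse_loss(recon, x)    # reconstruction loss to train on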

Understanding the latent space
https://blog.insightdatascience.com/generating-custom-photo-realistic-faces-using-ai-d170b1b59255
https://github.com/SummitKwan/transparent_latent_gan
https://www.youtube.com/watch?v=O1by05eX424
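A tiny sketch of what "walking the latent space" means in code: linearly interpolate between two latent codes and decode each point. The decoder below is an untrained stand-in; with a trained generator the decoded images would morph smoothly from one sample to the other:

    import torch
    import torch.nn as nn

    # Untrained stand-in for a trained decoder/generator (hypothetical weights).
    decoder = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 784), nn.Sigmoid())

    z_a, z_b = torch.randn(32), torch.randn(32)         # two latent codes
    for alpha in torch.linspace(0, 1, steps=8):
        z = (1 - alpha) * z_a + alpha * z_b             # linear interpolation in latent space
        img = decoder(z).reshape(28, 28)                # decode each interpolated code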

A look at Auto-Encoding Variational Bayes
https://tv.naver.com/v/4628354?fbclid=IwAR1Eic8IU7EIXATM1-Xhf74T6G3THHrk6RDJpBzo5tPbN8ZsNYUp7GY54Uw
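A minimal sketch of the core of Auto-Encoding Variational Bayes (Kingma & Welling): the reparameterization trick plus the ELBO written as a reconstruction term and a KL divergence to the unit Gaussian prior. Layer sizes and names are illustrative, not from the talk:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class VAE(nn.Module):
        def __init__(self, latent_dim=20):
            super().__init__()
            self.enc = nn.Linear(784, 400)
            self.mu = nn.Linear(400, latent_dim)
            self.logvar = nn.Linear(400, latent_dim)
            self.dec = nn.Sequential(nn.Linear(latent_dim, 400), nn.ReLU(), nn.Linear(400, 784))

        def forward(self, x):
            h = F.relu(self.enc(x))
            mu, logvar = self.mu(h), self.logvar(h)
            std = torch.exp(0.5 * logvar)
            z = mu + std * torch.randn_like(std)        # reparameterization trick: z = mu + sigma * eps
            return self.dec(z), mu, logvar

    def elbo_loss(recon_logits, x, mu, logvar):
        recon = F.binary_cross_entropy_with_logits(recon_logits, x, reduction="sum")
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())   # KL(q(z|x) || N(0, I))
        return recon + kl

    model = VAE()
    x = torch.rand(8, 784)                              # stand-in for a batch of flattened images
    recon_logits, mu, logvar = model(x)
    loss = elbo_loss(recon_logits, x, mu, logvar)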

**** Tutorial material from NeurIPS 2018: Unsupervised Deep Learning, presented by two big names, Alex Graves (DeepMind) and Marc'Aurelio Ranzato (FAIR)
The unsupervised deep learning tutorial slides are here:
https://ranzato.github.io/publications/tutorial_deep_unsup_learning_part1_NeurIPS2018.pdf
https://ranzato.github.io/publications/tutorial_deep_unsup_learning_part2_NeurIPS2018.pdf

Among the papers in the NeurIPS 2018 proceedings:
http://papers.nips.cc/book/advances-in-neural-information-processing-systems-31-2018

*** Visualization resources for CNNs and RNNs
https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-convolutional-neural-networks
https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-recurrent-neural-networks
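One formula from the CNN cheatsheet worth keeping at hand: for input size W, kernel size K, padding P, and stride S, the convolution output size is (W - K + 2P)/S + 1. A quick check in Python:

    def conv_output_size(w, k, p, s):
        # Output spatial size of a convolution: (W - K + 2P) / S + 1 (floored)
        return (w - k + 2 * p) // s + 1

    print(conv_output_size(224, 7, 3, 2))   # -> 112, e.g. ResNet's first 7x7 stride-2 conv on a 224x224 input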

**** Papers with 'GAN' in the title (16 in total)
(I opened the proceedings above and searched for 'GAN' with the browser's find function.)

Each paper title is listed below with a link to its page in the proceedings.

Soft-Gated Warping-GAN for Pose-Guided Person Image Synthesis
http://papers.nips.cc/paper/7329-soft-gated-warping-gan-for-pose-guided-person-image-synthesis

Are GANs Created Equal? A Large-Scale Study
http://papers.nips.cc/paper/7350-are-gans-created-equal-a-large-scale-study

KDGAN: Knowledge Distillation with Generative Adversarial Networks
http://papers.nips.cc/paper/7358-kdgan-knowledge-distillation-with-generative-adversarial-networks

FD-GAN: Pose-Guided Feature Distilling GAN for Robust Person Re-identification
http://papers.nips.cc/paper/7398-fd-gan-pose-guided-feature-distilling-gan-for-robust-person-re-identification

PacGAN: The Power of Two Samples in Generative Adversarial Networks
http://papers.nips.cc/paper/7423-pacgan-the-power-of-two-samples-in-generative-adversarial-networks

BourGAN: Generative Networks with Metric Embeddings
http://papers.nips.cc/paper/7495-bourgan-generative-networks-with-metric-embeddings

MetaGAN: An Adversarial Approach to Few-Shot Learning
http://papers.nips.cc/paper/7504-metagan-an-adversarial-approach-to-few-shot-learning

BinGAN: Learning Compact Binary Descriptors with a Regularized GAN
http://papers.nips.cc/paper/7619-bingan-learning-compact-binary-descriptors-with-a-regularized-gan

A Convex Duality Framework for GANs
http://papers.nips.cc/paper/7771-a-convex-duality-framework-for-gans

On GANs and GMMs
http://papers.nips.cc/paper/7826-on-gans-and-gmms

Memory Replay GANs: Learning to Generate New Categories without Forgetting
http://papers.nips.cc/paper/7836-memory-replay-gans-learning-to-generate-new-categories-without-forgetting

On Gradient Regularizers for MMD GANs
http://papers.nips.cc/paper/7904-on-gradient-regularizers-for-mmd-gans

Banach Wasserstein GAN
http://papers.nips.cc/paper/7909-banach-wasserstein-gan

On the Convergence and Robustness of Training GANs with Regularized Optimal Transport
http://papers.nips.cc/paper/7940-on-the-convergence-and-robustness-of-training-gans-with-regularized-optimal-transport

Learning Plannable Representations with Causal InfoGAN
http://papers.nips.cc/paper/8090-learning-plannable-representations-with-causal-infogan

Robustness of Conditional GANs to Noisy Labels
http://papers.nips.cc/paper/8229-robustness-of-conditional-gans-to-noisy-labels

"Towards generative adversarial networks as a new paradigm for radiology education" https://arxiv.org/pdf/1812.01547.pdf
Deep Learning in Medical AI 2018/2019 using PyTorch & Google Colab https://colab.research.google.com/drive/1hGVoskPSerJtUydCoKeUskQG8X4lt3k9
*** NLP

*** [Special lecture] Deep learning natural language processing + chatbots https://event-us.kr/chatbothon/event/4540
Presentation slides for 'Building a chatbot with BERT'
Wav2letter++, the fastest open source speech system, and flashlight https://code.fb.com/ai-research/wav2letter/
'BERT 톺아보기' (a close look at BERT) by Sang-Kil Park of Kakao Brain (http://docs.likejazz.com/bert/)
An illustrated post on BERT (http://jalammar.github.io/illustrated-bert/)
SlideShare material on ELMo: https://www.slideshare.net/shuntaroy/a-review-of-deep-contextualized-word-representations-peters-2018 https://vimeo.com/277672840
The Transformer encoder (BERT) and decoder, dissected:
Part I(https://medium.com/dissecting-bert/dissecting-bert-part-1-d3c3d495cdb3)
Appendix(https://medium.com/dissecting-bert/dissecting-bert-appendix-the-decoder-3b86f66b0e5f)
pytorchic-bert by Dong-Hyun Lee of Kakao Brain (https://github.com/dhlee347/pytorchic-bert)
BioBERT - looks like it could be useful for extracting TNM stage from CT reports: https://github.com/dmis-lab/biobert
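For the BioBERT item above, a hedged sketch of what TNM-stage extraction could look like as token classification with Hugging Face transformers. dmis-lab/biobert-base-cased-v1.1 is the public base checkpoint; the classification head and label count below are assumptions, and the model would need fine-tuning on annotated reports before the output is meaningful:

    from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

    # Public BioBERT base checkpoint; the token-classification head added here is randomly
    # initialized and would need fine-tuning on TNM-annotated reports (assumption).
    model_name = "dmis-lab/biobert-base-cased-v1.1"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForTokenClassification.from_pretrained(model_name, num_labels=7)

    ner = pipeline("token-classification", model=model, tokenizer=tokenizer, aggregation_strategy="simple")
    print(ner("Spiculated mass in the right upper lobe, staged T2 N1 M0."))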

**** PACS / DICOM tools for DL
https://github.com/joshy/pypacscrawler
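Not from the linked repo, just a generic starting point for DICOM handling: reading a file with pydicom (the file name is hypothetical):

    import pydicom

    ds = pydicom.dcmread("ct_slice_001.dcm")                       # hypothetical file path
    print(ds.PatientID, ds.Modality, ds.get("StudyDescription", ""))
    pixels = ds.pixel_array                                        # numpy array of the pixel data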

LSH (Annoy) for approximate nearest neighbor search; EuclidesDB is a database dedicated to storing machine learning features, and it integrates with PyTorch.
https://euclidesdb.readthedocs.io/en/latest/getstarted.html
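A minimal Annoy sketch for approximate nearest neighbor search over feature vectors (dimension, item count, and tree count are arbitrary):

    import random
    from annoy import AnnoyIndex

    dim = 64
    index = AnnoyIndex(dim, "angular")                              # cosine-like metric
    for i in range(1000):
        index.add_item(i, [random.gauss(0, 1) for _ in range(dim)])
    index.build(10)                                                 # 10 trees: more trees = better recall, slower build

    query = [random.gauss(0, 1) for _ in range(dim)]
    print(index.get_nns_by_vector(query, 5))                        # 5 approximate nearest neighbors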

Keras implementations: AlexNet, VGG, Inception, MobileNet, ShuffleNet, ResNet, DenseNet, Xception, U-Net, SqueezeNet, YOLO, RefineNet
https://github.com/Machine-Learning-Tokyo/DL-workshop-series/blob/master/ConvNets.ipynb
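The notebook builds these architectures from scratch; for a quick comparison, pretrained versions of several of them also ship with tf.keras.applications, e.g.:

    import numpy as np
    import tensorflow as tf

    # Pretrained ResNet50 with ImageNet weights, run on a random image-shaped tensor.
    model = tf.keras.applications.ResNet50(weights="imagenet")
    x = np.random.randint(0, 255, size=(1, 224, 224, 3)).astype("float32")
    x = tf.keras.applications.resnet50.preprocess_input(x)
    preds = model.predict(x)
    print(tf.keras.applications.resnet50.decode_predictions(preds, top=3))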

image super-resolution using residual dense blocks https://github.com/idealo/image-super-resolution/blob/master/README.md
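A usage sketch assuming the ISR package from the linked repo; the RDN class and pretrained weight name follow my reading of its README and may differ, so check the repo before relying on this:

    import numpy as np
    from PIL import Image
    from ISR.models import RDN                     # from idealo/image-super-resolution (assumption: installed as 'ISR')

    lr_img = np.array(Image.open("low_res.png"))   # hypothetical input image
    rdn = RDN(weights="psnr-small")                # pretrained weight name as listed in the repo's README (assumption)
    sr_img = rdn.predict(lr_img)
    Image.fromarray(sr_img).save("super_res.png")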

https://tv.naver.com/v/4696573
