https://ai.googleblog.com/2020/12/end-to-end-transferable-deep-rl-for.html?m=1
https://jackietseng.github.io/conference_call_for_paper/conferences-with-ccf.html
https://towardsdatascience.com/reformer-the-efficient-transformer-dd9830164703
https://www.kaggle.com/andradaolteanu/pytorch-rnns-and-lstms-explained-acc-0-99/notebook
- Batch Normalization: https://shuuki4.wordpress.com/2016/01/13/batch-normalization-%EC%84%A4%EB%AA%85-%EB%B0%8F-%EA%B5%AC%ED%98%84/
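A minimal numpy sketch of what the linked post covers: training-time batch normalization over a batch of feature vectors. Here `gamma`/`beta` are plain arguments rather than learned parameters.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize each feature over the batch dimension (axis 0),
    # then apply the scale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

np.random.seed(0)
x = np.random.randn(32, 4) * 3.0 + 7.0   # batch of 32, 4 features
y = batch_norm(x)                        # zero mean, unit variance per feature
```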
- Optimizer: https://github.com/wonchul-kim/Machine_Learning/blob/master/deep%20learning/optimizers.ipynb
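The linked notebook compares optimizers; the baseline they all build on, plain gradient descent, fits in a few lines. The toy objective f(x) = x² is my choice for illustration.

```python
def gradient_descent(lr=0.1, steps=100):
    # Minimize f(x) = x^2 with the update x <- x - lr * f'(x), f'(x) = 2x.
    x = 5.0
    for _ in range(steps):
        x -= lr * 2 * x
    return x

x_min = gradient_descent()   # converges toward the minimum at x = 0
```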
- Cross-entropy: https://theeluwin.postype.com/post/6080524
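As a companion to the linked post, a numerically stable cross-entropy loss for a single example can be sketched as follows (the function name is mine):

```python
import numpy as np

def cross_entropy(logits, target):
    # Log-softmax computed stably (shift by the max logit),
    # then the negative log-likelihood of the target class.
    z = logits - logits.max()
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[target]

# Uniform logits over 4 classes give -log(1/4) = log 4.
loss = cross_entropy(np.zeros(4), 0)
```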
- CNN
  - number of parameters: https://seongkyun.github.io/study/2019/01/25/num_of_parameters/
  - downsampling (pooling or subsampling):
    - max pooling
    - global average pooling
    - convolutional layer with stride=2, kernel=3x3 (often reported to work better than the pooling options)
  - upsampling (unpooling):
    - recovering from pooling: nearest-neighbor unpooling, bed-of-nails unpooling, max unpooling
    - using a convolutional layer's stride: transposed convolution (also called deconvolution, fractionally-strided convolution, upconvolution, or backward strided convolution)
      - https://analysisbugs.tistory.com/104
      - https://zzsza.github.io/data/2018/06/25/upsampling-with-transposed-convolution/
  - dilated convolution
  - separable convolution: https://zzsza.github.io/data/2018/02/23/introduction-convolution/
  - deformable convolution
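The parameter-counting link under CNN boils down to one formula: a conv layer with a k×k kernel, C_in input channels, and C_out output channels has C_out · (k·k·C_in + 1) parameters with bias. A quick sketch (the function name is mine):

```python
def conv2d_params(c_in, c_out, k, bias=True):
    # Each of the c_out filters has a k x k x c_in weight tensor,
    # plus one bias term per output channel.
    return c_out * (k * k * c_in + (1 if bias else 0))

# A 3x3 conv from 64 to 128 channels: 128 * (3*3*64 + 1) = 73,856 parameters.
n = conv2d_params(64, 128, 3)
```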
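The downsampling/upsampling entries above can be illustrated in plain numpy: non-overlapping 2x2 max pooling and its nearest-neighbor unpooling counterpart (function names are mine).

```python
import numpy as np

def max_pool_2x2(x):
    # Non-overlapping 2x2 max pooling: keep the largest value in each window.
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def nearest_upsample_2x(x):
    # Nearest-neighbor unpooling: repeat each value into a 2x2 block.
    return x.repeat(2, axis=0).repeat(2, axis=1)

x = np.arange(16, dtype=float).reshape(4, 4)
p = max_pool_2x2(x)          # shape (2, 2): maxima of the four windows
u = nearest_upsample_2x(p)   # back to shape (4, 4), values repeated
```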
- Affine transformation: https://kr.mathworks.com/discovery/affine-transformation.html?requestedDomain=
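An affine transformation in homogeneous coordinates is a single matrix multiply; a small sketch (the rotation and translation values are arbitrary examples):

```python
import numpy as np

# 2-D affine transform in homogeneous coordinates: rotate 90 degrees
# counter-clockwise, then translate by (2, 0). The last row stays [0, 0, 1].
A = np.array([[0.0, -1.0, 2.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

p = np.array([1.0, 0.0, 1.0])   # point (1, 0)
q = A @ p                       # rotated to (0, 1), then shifted to (2, 1)
```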
- Kernels for image processing: https://en.wikipedia.org/wiki/Kernel_(image_processing)
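The Wikipedia page lists 3x3 kernels that are applied by 2-D convolution; a minimal valid-mode convolution with a box-blur kernel as the example (the function name is mine):

```python
import numpy as np

def convolve2d(img, kernel):
    # Valid-mode 2-D convolution (kernel flipped, per the textbook definition).
    kh, kw = kernel.shape
    k = np.flip(kernel)
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * k).sum()
    return out

# Box blur: each output pixel is the mean of its 3x3 neighborhood.
blur = np.full((3, 3), 1.0 / 9.0)
img = np.arange(25, dtype=float).reshape(5, 5)
out = convolve2d(img, blur)   # shape (3, 3)
```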
- Hough transform: https://wkdtjsgur100.github.io/Hough-Transform/
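The Hough transform votes each point into (theta, rho) accumulator bins; a compact sketch (the bin counts and rho range are arbitrary choices of mine):

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=100, rho_max=10.0):
    # Accumulator over (theta, rho): each point (x, y) votes for every
    # line rho = x*cos(theta) + y*sin(theta) that passes through it.
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        bins = np.round((rhos + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (bins >= 0) & (bins < n_rho)
        acc[np.nonzero(ok)[0], bins[ok]] += 1
    return acc, thetas

# Five collinear points on the vertical line x = 3: all five votes
# coincide at theta = 0 (rho = 3), so that bin reaches the maximum count.
acc, thetas = hough_lines([(3.0, float(y)) for y in range(5)])
```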
- BiT (Big Transfer): https://developers-kr.googleblog.com/2020/06/open-sourcing-bit-exploring-large-scale.html