wonchul-kim/Machine_Learning


https://www.marktechpost.com/free-resources/

https://ai.googleblog.com/2020/12/end-to-end-transferable-deep-rl-for.html?m=1

https://medium.com/@sunwoopark/slow-paper-glow-generative-flow-with-invertible-1x1-convolutions-837710116939

https://jackietseng.github.io/conference_call_for_paper/conferences-with-ccf.html

https://www.topbots.com/neurips-2020-covid-19-research-papers/

https://venturebeat.com/2020/12/16/at-neurips-2020-researchers-proposed-faster-more-efficient-alternatives-to-backpropagation/

http://www.aitimes.kr/news/articleView.html?idxno=18700

https://www.microsoft.com/en-us/research/blog/adversarial-machine-learning-and-instrumental-variables-for-flexible-causal-modeling/

https://arxiv.org/abs/2011.15091

https://towardsdatascience.com/reformer-the-efficient-transformer-dd9830164703

Lagrangian Neural Networks

https://github.com/kmario23/deep-learning-drizzle

https://www.kaggle.com/andradaolteanu/pytorch-rnns-and-lstms-explained-acc-0-99/notebook

https://spectrum.ieee.org/tech-talk/artificial-intelligence/machine-learning/understanding-causality-is-the-next-challenge-for-machine-learning

Study Machine Learning

  • CNN parameter counting: https://seongkyun.github.io/study/2019/01/25/num_of_parameters/
  • Downsampling (pooling or subsampling):
    1. Max pooling
    2. Global average pooling
    3. Convolution with stride=2 and a 3x3 kernel, which often works better than the pooling variants
  • Upsampling (unpooling):
    1. Recovering from pooling: nearest-neighbor unpooling, bed-of-nails unpooling, max unpooling
    2. Using a convolutional layer's stride: transposed convolution (also called deconvolution, fractionally-strided convolution, upconvolution, or backward strided convolution)
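The downsampling and upsampling operations listed above can be sketched in a few lines of dependency-free Python (in PyTorch these would correspond to nn.MaxPool2d, nn.AdaptiveAvgPool2d, nn.Conv2d(stride=2), and nn.ConvTranspose2d; the function names here are ours, chosen for illustration):

```python
def max_pool2x2(x):
    """2x2 max pooling with stride 2 on a 2D list (one feature map)."""
    h, w = len(x), len(x[0])
    return [[max(x[i][j], x[i][j + 1], x[i + 1][j], x[i + 1][j + 1])
             for j in range(0, w - 1, 2)]
            for i in range(0, h - 1, 2)]

def global_avg_pool(x):
    """Global average pooling: the whole feature map collapses to one scalar."""
    vals = [v for row in x for v in row]
    return sum(vals) / len(vals)

def nearest_neighbor_upsample(x, factor=2):
    """Nearest-neighbor unpooling: repeat each value factor x factor times."""
    out = []
    for row in x:
        expanded = [v for v in row for _ in range(factor)]
        out.extend([list(expanded) for _ in range(factor)])
    return out

def transpose_conv1d(x, k, stride=2):
    """1D transposed convolution: each input scatters a scaled copy of the
    kernel into the (larger) output, which is why it upsamples."""
    out = [0.0] * ((len(x) - 1) * stride + len(k))
    for i, v in enumerate(x):
        for j, kv in enumerate(k):
            out[i * stride + j] += v * kv
    return out

x = [[1, 2, 3, 4],
     [5, 6, 7, 8],
     [9, 10, 11, 12],
     [13, 14, 15, 16]]
print(max_pool2x2(x))                        # [[6, 8], [14, 16]]
print(global_avg_pool(x))                    # 8.5
print(transpose_conv1d([1, 2], [1, 1, 1]))   # [1.0, 1.0, 3.0, 2.0, 2.0]
```

Note how the transposed convolution turns 2 inputs into 5 outputs: with stride 2 the scattered kernels overlap at position 2, which is the source of the checkerboard artifacts often mentioned alongside deconvolution.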

https://analysisbugs.tistory.com/104
https://zzsza.github.io/data/2018/06/25/upsampling-with-transposed-convolution/
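The parameter-counting link above reduces to one formula for a convolutional layer: (kernel_height x kernel_width x input_channels + 1 for the bias) x output_channels. A minimal sketch (function name ours):

```python
def conv2d_param_count(in_ch, out_ch, kernel, bias=True):
    """Number of learnable parameters in a 2D convolutional layer.
    Each of the out_ch filters spans kh x kw x in_ch weights plus one bias."""
    kh, kw = kernel
    return (kh * kw * in_ch + (1 if bias else 0)) * out_ch

# First layer of VGG-style nets: 3x3 conv, 3 -> 64 channels
print(conv2d_param_count(3, 64, (3, 3)))     # 1792
# A deeper 3x3 conv, 64 -> 128 channels
print(conv2d_param_count(64, 128, (3, 3)))   # 73856
```

Note that the count is independent of the spatial size of the input, which is the key difference from a fully connected layer.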

Graph_Neural_Networks

Lectures

References
