# awesome-daily-blog

Date: 8/8/2019 : Serverless & ML Basics

  1. https://www.simform.com/serverless-architecture-guide/
  2. https://github.com/openfaas/faas
  3. https://github.com/nuclio/nuclio
  4. https://towardsdatascience.com/boosting-algorithms-explained-d38f56ef3f30
  5. https://medium.com/machine-learning-world/linear-algebra-svd-and-pca-5979f739e95a
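
The SVD/PCA article in item 5 pairs well with a few lines of code. A minimal sketch of PCA computed via SVD, using only NumPy; the data shape and the number of kept components `k` are illustrative assumptions:

```python
# Minimal PCA-via-SVD sketch; matrix sizes are toy assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # 200 samples, 5 features

Xc = X - X.mean(axis=0)                # center each feature
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                                  # keep the top-2 principal components
components = Vt[:k]                    # principal axes (rows)
scores = Xc @ components.T             # projected data, shape (200, 2)
explained_var = (S**2) / (len(X) - 1)  # variance captured by each axis
```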

Date: 9/8/2019 : BERT

  1. https://medium.com/dissecting-bert/dissecting-bert-part-1-d3c3d495cdb3
  2. https://towardsdatascience.com/deconstructing-bert-distilling-6-patterns-from-100-million-parameters-b49113672f77
  3. https://github.com/jessevig/bertviz
  4. https://medium.com/swlh/a-simple-guide-on-using-bert-for-text-classification-bbf041ac8d04
  5. https://towardsdatascience.com/bert-in-keras-with-tensorflow-hub-76bcbc9417b
  6. https://towardsdatascience.com/nlp-extract-contextualized-word-embeddings-from-bert-keras-tf-67ef29f60a7b
  7. https://medium.com/dissecting-bert/dissecting-bert-part2-335ff2ed9c73
  8. https://medium.com/@_init_/why-bert-has-3-embedding-layers-and-their-implementation-details-9c261108e28a (see the sketch after this list)
  9. https://towardsdatascience.com/bert-to-the-rescue-17671379687f
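
Item 8 explains why BERT has three embedding layers: the input to the first Transformer block is the element-wise sum of token, segment, and position embeddings. A hedged sketch of that sum with toy NumPy lookup tables (all sizes are illustrative, not BERT's real dimensions):

```python
import numpy as np

vocab_size, max_len, hidden = 100, 16, 8
rng = np.random.default_rng(0)
tok_emb = rng.normal(size=(vocab_size, hidden))  # one row per WordPiece id
seg_emb = rng.normal(size=(2, hidden))           # sentence A vs. sentence B
pos_emb = rng.normal(size=(max_len, hidden))     # learned absolute positions

token_ids   = np.array([5, 17, 42, 8])           # a toy input sequence
segment_ids = np.array([0, 0, 1, 1])             # A, A, B, B
positions   = np.arange(len(token_ids))

# The representation fed to the first Transformer layer:
x = tok_emb[token_ids] + seg_emb[segment_ids] + pos_emb[positions]
```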

Date: 11/8/2019 : Word Embeddings

  1. http://jalammar.github.io/illustrated-word2vec/
  2. http://ruder.io/word-embeddings-1/index.html
  3. http://ruder.io/word-embeddings-softmax/
  4. https://github.com/RaRe-Technologies/gensim/blob/develop/gensim/models/word2vec.py (usage sketch after this list)
  5. https://rutumulkar.com/blog/2015/word2vec/
  6. http://blog.aylien.com/a-review-of-the-recent-history-of-natural-language-processing/
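
To go with the gensim source in item 4, a minimal Word2Vec usage sketch. Parameter names follow gensim >= 4.0 (`vector_size` was called `size` in older releases); the corpus is a toy assumption:

```python
from gensim.models import Word2Vec

sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "log"],
]
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1)   # sg=1 selects skip-gram

vec = model.wv["cat"]                 # 50-dim embedding for "cat"
print(model.wv.most_similar("cat", topn=3))
```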

Date: 12/8/2019 : Self-Attention & BERT Internals

  1. https://medium.com/@_init_/how-self-attention-with-relative-position-representations-works-28173b8c245a (see the sketch after this list)
  2. https://medium.com/@_init_/why-bert-has-3-embedding-layers-and-their-implementation-details-9c261108e28a
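
A sketch of the relative-position idea from item 1 (Shaw et al.): the key at position j, as seen from query position i, gets a learned embedding for the clipped offset j - i. For brevity this assumes queries and keys share one toy projection `x` and omits the value-side relative term; shapes and the clip distance `k` are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 5, 8, 2                      # seq len, head dim, clip distance
x = rng.normal(size=(n, d))            # already-projected queries/keys
rel = rng.normal(size=(2 * k + 1, d))  # embeddings for offsets -k..+k

def softmax(v):
    e = np.exp(v - v.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# offset of each key j relative to each query i, clipped to [-k, k]
offsets = np.clip(np.arange(n)[None, :] - np.arange(n)[:, None], -k, k)
a_key = rel[offsets + k]               # shape (n, n, d)

# e_ij = q_i . (k_j + a_ij) / sqrt(d)
logits = (x @ x.T + np.einsum("id,ijd->ij", x, a_key)) / np.sqrt(d)
attn = softmax(logits)                 # (n, n) attention weights
z = attn @ x                           # weighted sum over positions
```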

Date: 13/8/2019 : BERT & Transfer Learning in NLP

  1. http://ruder.io/nlp-imagenet/
  2. https://medium.com/@ranko.mosic/googles-bert-nlp-5b2bb1236d78
  3. https://github.com/kwonmha/bert-vocab-builder

Date: 14/8/2019 : Deploying Flask with GitLab CI/CD

  1. https://medium.com/@thimblot/deploying-a-flask-application-on-aws-with-gitlab-ci-cd-part-1-87392be2129e
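
The walkthrough above deploys a Flask app to AWS; a minimal app of the kind it assumes, so the deployment steps have something to point at. Module layout, route, and port are illustrative:

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def health():
    # Flask (>= 1.1) serializes a returned dict to JSON
    return {"status": "ok"}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```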

Date: 16/8/2019 : GANs

  1. https://medium.com/@jonathan_hui/gan-some-cool-applications-of-gans-4c9ecca35900
  2. https://machinelearningmastery.com/impressive-applications-of-generative-adversarial-networks/
  3. https://arxiv.org/pdf/1801.07736.pdf
  4. https://arxiv.org/pdf/1905.01976.pdf
  5. https://becominghuman.ai/generative-adversarial-networks-for-text-generation-part-1-2b886c8cab10

Date: 18/8/2019 : TensorFlow Control Flow & Attention

  1. https://towardsdatascience.com/tensorflow-control-flow-tf-cond-903e020e722a
  2. https://towardsdatascience.com/learning-attention-mechanism-from-scratch-f08706aaf6b6
  3. https://github.com/Garima13a/Attention-Mechanism-Basics/blob/master/Attention_Basics_Solution.ipynb
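
The notebook in item 3 builds attention from scratch; a compact sketch of the same three steps (score, softmax, weighted sum) with toy shapes, assuming simple dot-product scoring:

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

decoder_state = np.random.randn(4)          # current decoder hidden state
encoder_states = np.random.randn(6, 4)      # one row per source position

scores = encoder_states @ decoder_state     # dot-product alignment scores
weights = softmax(scores)                   # attention distribution
context = weights @ encoder_states          # weighted sum: context vector
```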

Date: 19/8/2019 : LAMB Optimizer

  1. https://arxiv.org/pdf/1904.00962.pdf (LAMB; update-step sketch after this list)
  2. https://github.com/titu1994/keras-LAMB-Optimizer
  3. https://arxiv.org/pdf/1902.09314.pdf
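
A hedged sketch of one LAMB step from the paper in item 1: Adam-style moment estimates, then a layer-wise trust ratio ||w|| / ||update|| that rescales the step. Treating a single tensor as one "layer" and all hyperparameter values are simplifying assumptions:

```python
import numpy as np

def lamb_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999,
              eps=1e-6, wd=0.01):
    m = b1 * m + (1 - b1) * g              # first moment (as in Adam)
    v = b2 * v + (1 - b2) * g * g          # second moment
    m_hat = m / (1 - b1 ** t)              # bias correction
    v_hat = v / (1 - b2 ** t)
    update = m_hat / (np.sqrt(v_hat) + eps) + wd * w
    trust = np.linalg.norm(w) / (np.linalg.norm(update) + eps)
    w = w - lr * trust * update            # layer-wise rescaled step
    return w, m, v
```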

Date: 20/8/2019 : Hierarchical Attention, Autoencoders & Adam

  1. http://www.cs.cmu.edu/~./hovy/papers/16HLT-hierarchical-attention-networks.pdf
  2. https://github.com/minqi/hnatt
  3. https://medium.com/intuitive-deep-learning/autoencoders-neural-networks-for-unsupervised-learning-83af5f092f0b (sketch after this list)
  4. https://towardsdatascience.com/understanding-adam-how-loss-functions-are-minimized-3a75d36ebdfc
  5. https://towardsdatascience.com/understanding-partial-auto-correlation-fa39271146ac
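
A minimal Keras autoencoder sketch to accompany item 3: an encoder that compresses the input to a small bottleneck and a decoder that reconstructs it. Dimensions and the stand-in training data are illustrative assumptions:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(784,))
encoded = layers.Dense(32, activation="relu")(inputs)       # bottleneck
decoded = layers.Dense(784, activation="sigmoid")(encoded)  # reconstruction

autoencoder = keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

x = np.random.rand(256, 784).astype("float32")  # stand-in data
autoencoder.fit(x, x, epochs=1, batch_size=32, verbose=0)
```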

Date: 5/9/2019 : Custom Keras Layer

  1. https://keras.io/layers/writing-your-own-keras-layers/
  2. https://www.saama.com/blog/deep-learning-diaries-building-custom-layers-in-keras/
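
A minimal custom layer along the lines of the two guides above: create the weights in `build()`, apply them in `call()`. The toy linear layer below is a sketch, not code from either article:

```python
import tensorflow as tf
from tensorflow import keras

class SimpleDense(keras.layers.Layer):
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        # weights are created lazily, once the input dim is known
        self.w = self.add_weight(shape=(input_shape[-1], self.units),
                                 initializer="glorot_uniform",
                                 trainable=True)
        self.b = self.add_weight(shape=(self.units,),
                                 initializer="zeros", trainable=True)

    def call(self, inputs):
        return tf.matmul(inputs, self.w) + self.b

layer = SimpleDense(4)
print(layer(tf.ones((2, 3))).shape)    # (2, 4)
```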

Date: 6/9/2019 : Text Similarity

  1. https://medium.com/@adriensieg/text-similarities-da019229c894
  2. https://github.com/adsieg/text_similarity
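
A minimal cosine-similarity sketch in the spirit of the article and repo above: TF-IDF vectors compared pairwise with scikit-learn. The sentences are toy data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = ["the cat sat on the mat",
        "a cat was sitting on a mat",
        "stock prices fell sharply today"]

tfidf = TfidfVectorizer().fit_transform(docs)   # (3, vocab) sparse matrix
sims = cosine_similarity(tfidf)                 # (3, 3) similarity matrix
print(sims.round(2))                            # first two docs score highest
```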