vitalyvels/deep-learning-for-biology-hse-2018-course
Deep Learning for Biology course materials / HSE 2018

This repository contains the course materials for the Deep Learning for Biology course.

The course was taught in Fall 2018 at the Higher School of Economics (Moscow), Faculty of Computer Science, Master’s Programme 'Data Analysis in Biology and Medicine'.

Syllabus

1. Artificial Intelligence: Current State and Overview

  • Short history
  • Current results in Deep Learning
  • Images and Video
  • Speech and Sound
  • Text and Language
  • Robotic control
  • ML for systems
  • Problems with DL
  • Other approaches to AI
  • Knowledge and Representation
  • Symbolic approaches
  • Evolutionary computations and Swarm intelligence
  • Hardware

2. Introduction to Neural Networks

  • Intro to NNs: neuron, neural network, backpropagation
  • Feed-forward NNs (FNN)
  • Autoencoders (AE)
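The neuron/backpropagation bullets above can be illustrated with a single sigmoid neuron trained by gradient descent in plain Python. This is a hypothetical minimal sketch (all numbers are made up), not the course's Keras code:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One neuron: y = sigmoid(w*x + b), squared-error loss L = (y - t)^2.
w, b = 0.5, 0.0   # initial weight and bias (arbitrary)
x, t = 1.0, 1.0   # one training example: input and target
lr = 1.0          # learning rate

for _ in range(100):
    y = sigmoid(w * x + b)   # forward pass
    dL_dy = 2.0 * (y - t)    # backward pass: chain rule, outermost first
    dy_dz = y * (1.0 - y)    # derivative of the sigmoid
    w -= lr * dL_dy * dy_dz * x   # gradient-descent updates
    b -= lr * dL_dy * dy_dz

print(sigmoid(w * x + b))  # after training, the output is close to the target 1.0
```

The same chain-rule bookkeeping, applied layer by layer, is backpropagation in a full feed-forward network.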

3. Keras practice

4. Convolutional NNs (CNN) and Image processing

5-6. Real-life modern CNNs

  • Activations, Regularization, Augmentation, etc.
  • Models: LeNet, AlexNet, VGG, GoogLeNet, Inception, ResNet, DenseNet, Xception
  • How to use pretrained models in Keras. Notebook: using pretrained CNN models

7. Transfer Learning

8. Advanced CNNs

9. Recurrent NNs (RNNs)

  • RNN basics, Backpropagation through time
  • Long short-term memory (LSTM)
  • Advanced RNNs: Bidirectional RNNs, Multidimensional RNNs
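The RNN basics above reduce to one recurrence, h_t = tanh(W_x·x_t + W_h·h_{t-1} + b). A scalar sketch in plain Python (the weights are invented for illustration) also shows why plain RNNs forget long-range information, which motivates LSTMs:

```python
import math

def rnn_forward(xs, w_x=0.5, w_h=0.8, b=0.0):
    """Scalar vanilla RNN: h_t = tanh(w_x * x_t + w_h * h_{t-1} + b)."""
    h = 0.0        # initial hidden state
    states = []
    for x in xs:
        h = math.tanh(w_x * x + w_h * h + b)
        states.append(h)
    return states

# A single nonzero input at t=0, then silence: the hidden state
# decays step by step, gradually "forgetting" the first input.
hs = rnn_forward([1.0, 0.0, 0.0, 0.0])
print(hs)
```

Backpropagation through time applies the chain rule backwards through this same loop; the repeated multiplication by w_h (and the tanh derivative) is what makes gradients vanish over long sequences.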

10. Practice: Generating text using RNNs

11. Practice: Text classification using RNNs

  • Working with texts: vectorizing, one-hot encoding, word embeddings, word2vec, etc.
  • Keras example: sentence-based classification using RNN/LSTM/BLSTM
  • Keras example: sentence-based classification using 1D CNN
  • Keras example: sentence-based classification using RNN+CNN
  • Notebook with examples
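The vectorizing/one-hot steps above (tokenize, build a vocabulary, encode each token as a vector) can be sketched without any library. The corpus and function names here are illustrative, not from the course notebooks:

```python
def build_vocab(sentences):
    """Map each distinct lowercase token to an integer index."""
    vocab = {}
    for s in sentences:
        for tok in s.lower().split():
            vocab.setdefault(tok, len(vocab))
    return vocab

def one_hot(sentence, vocab):
    """One vector per token: all zeros except a 1 at the token's index."""
    vecs = []
    for tok in sentence.lower().split():
        v = [0] * len(vocab)
        v[vocab[tok]] = 1
        vecs.append(v)
    return vecs

corpus = ["the gene encodes a protein", "the protein binds DNA"]
vocab = build_vocab(corpus)
print(len(vocab))                    # 7 distinct tokens
print(one_hot("protein binds", vocab))
```

Word embeddings (word2vec and friends) replace these sparse one-hot vectors with dense, learned vectors of fixed size, which is what a Keras Embedding layer feeds into the RNN/CNN classifiers listed above.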

12. Sequence Learning (seq2seq)

  • Multimodal Learning
  • Seq2seq
  • Encoder-Decoder
  • Beam search
  • Attention mechanisms, Visualizing attention, Hard and Soft attention, Self-Attention
  • Augmented RNNs
  • Connectionist Temporal Classification (CTC)
  • Non-RNN Sequence Learning and problems with RNNs
  • Convolutional Sequence Learning
  • Self-Attention Neural Networks (SAN): Transformer Architecture
  • Transformer: The next steps (Image Transformer, BERT, Universal Transformer)
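Beam search from the list above keeps only the k highest-scoring partial output sequences at each decoding step, instead of all of them (greedy search is the k=1 case). A toy sketch over a fixed per-step probability table (the table itself is invented; a real decoder would condition each step on the sequence so far):

```python
import math

def beam_search(step_log_probs, beam_width):
    """step_log_probs[t] maps each token to its log-probability at step t.
    Keeps the beam_width best partial sequences after every step."""
    beams = [((), 0.0)]  # (partial sequence, total log-prob)
    for dist in step_log_probs:
        candidates = [
            (seq + (tok,), score + lp)   # extend every beam by every token
            for seq, score in beams
            for tok, lp in dist.items()
        ]
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]  # prune to the k best
    return beams

steps = [
    {"a": math.log(0.6), "b": math.log(0.4)},
    {"a": math.log(0.3), "b": math.log(0.7)},
]
best_seq, best_score = beam_search(steps, beam_width=2)[0]
print(best_seq)  # ('a', 'b'): 0.6 * 0.7 is the highest-probability sequence
```

Greedy decoding would also find ('a', 'b') in this tiny table, but in general beam search recovers high-probability sequences whose first token is not the greedy choice, at the cost of a k-fold larger frontier.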
