Survey of Dynamic Neural Networks

Over the last decade, several advances have been made in artificial neural networks, with particular emphasis on architectures and learning algorithms. Dynamic neural networks, in addition to better representing biological neural systems, are envisaged to offer better computational capabilities than their static counterparts.

Research

Dynamic Neural Networks: A Survey (Yizeng et al. 2021)

Dynamic Neural Networks: An Overview (Sinha & Gupta, 2000)

Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey (Joakim et al. 2020)

Temporal Networks (Petter & Jari, 2011)

Representation Learning for Dynamic Graphs: A Survey (Seyed et al. 2019)

Community Discovery in Dynamic Networks: A Survey (Giulio & Remy, 2017)

Toward an interoperable dynamic network analysis toolkit (Kathleen et al. 2007)

A deep learning approach to link prediction in dynamic networks (Xiaoyi et al. 2014)

Deciding how to decide: Dynamic routing in artificial neural networks (Mason & Pietro, 2017)

An introduction to domain adaptation and transfer learning (Wouter & Marco, 2019)

DyNet: The Dynamic Neural Network Toolkit

Adaptive inference methods

Convolutional Networks with Adaptive Inference Graphs (Veit & Belongie, 2018)

Glance and Focus: a Dynamic Approach to Reducing Spatial Redundancy in Image Classification (Yulin et al. 2020)

Adaptive Mixtures of Local Experts (Robert et al. 1991)

Adaptive computation time for recurrent neural networks (Alex, 2016)

Depth-Adaptive Transformer (Maha et al. 2020)

FastBERT: a Self-distilling BERT with Adaptive Inference Time (Weijie et al. 2020)

DeeBERT: Dynamic Early Exiting for Accelerating BERT Inference (Ji et al. 2020)

BERT Loses Patience: Fast and Robust Inference with Early Exit (Wangchunshu et al. 2020)

Resolution Adaptive Networks for Efficient Inference (Le et al. 2020)

Multi-scale dense networks for resource efficient image classification (Gao et al. 2017)

Channel gating neural networks (Weizhe et al. 2019)

BranchyNet: Fast inference via early exiting from deep neural networks (Surat et al. 2017); see the early-exit sketch after this list

Lifelong Learning with Dynamically Expandable Networks (Jaehong et al. 2018)
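
Several of the papers above (BranchyNet, DeeBERT, FastBERT) share a common early-exit mechanism: lightweight classifiers are attached to intermediate layers, and inference stops as soon as one of them is confident. Below is a minimal sketch of that idea; the layer sizes, the entropy-based confidence measure, and the threshold are illustrative assumptions, not details from any particular paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EarlyExitMLP(nn.Module):
    """A stack of layers with a lightweight classifier ("exit") after each one."""

    def __init__(self, in_dim=784, hidden=256, num_classes=10, depth=4):
        super().__init__()
        self.blocks = nn.ModuleList(
            [nn.Linear(in_dim if i == 0 else hidden, hidden) for i in range(depth)]
        )
        self.exits = nn.ModuleList(
            [nn.Linear(hidden, num_classes) for _ in range(depth)]
        )

    def forward(self, x, entropy_threshold=0.5):
        h = x
        for block, exit_head in zip(self.blocks, self.exits):
            h = F.relu(block(h))
            logits = exit_head(h)
            probs = F.softmax(logits, dim=-1)
            # Prediction entropy as the confidence signal (low entropy = confident).
            # For simplicity the decision is made for the whole batch at once.
            entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1).mean()
            if entropy < entropy_threshold:
                return logits  # confident enough: skip the remaining blocks
        return logits  # fell through to the final exit


if __name__ == "__main__":
    model = EarlyExitMLP()
    x = torch.randn(1, 784)
    print(model(x).shape)  # torch.Size([1, 10])
```

During training one would typically sum the losses of all exits so that every branch learns to classify; the sketch only shows the inference-time behaviour.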

Efficiency of Dynamic Neural Networks

Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference (Jacob et al. 2018)

CondConv: Conditionally Parameterized Convolutions for Efficient Inference (Brandon et al. 2019); see the sketch after this list

Dropout: A Simple Way to Prevent Neural Networks from Overfitting (Nitish et al. 2014)
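
The conditional-parameterization idea behind CondConv can be illustrated with a small bank of expert kernels mixed by per-example routing weights computed from globally pooled features. The sketch below is a simplified rendering of that idea; the per-example loop, the sigmoid routing, and the expert count are assumptions made for clarity rather than an efficient implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CondConv2d(nn.Module):
    """Convolution whose kernel is a per-example mixture of expert kernels."""

    def __init__(self, in_ch, out_ch, kernel_size=3, num_experts=4, padding=1):
        super().__init__()
        self.experts = nn.Parameter(
            torch.randn(num_experts, out_ch, in_ch, kernel_size, kernel_size) * 0.01
        )
        self.routing = nn.Linear(in_ch, num_experts)
        self.padding = padding

    def forward(self, x):
        # Routing weights from globally pooled features: one per expert, per example.
        pooled = F.adaptive_avg_pool2d(x, 1).flatten(1)   # (B, in_ch)
        weights = torch.sigmoid(self.routing(pooled))     # (B, num_experts)
        outputs = []
        for i in range(x.size(0)):
            # Build this example's kernel as a weighted sum of the expert kernels.
            kernel = (weights[i].view(-1, 1, 1, 1, 1) * self.experts).sum(dim=0)
            outputs.append(F.conv2d(x[i : i + 1], kernel, padding=self.padding))
        return torch.cat(outputs, dim=0)


if __name__ == "__main__":
    layer = CondConv2d(in_ch=8, out_ch=16)
    x = torch.randn(2, 8, 32, 32)
    print(layer(x).shape)  # torch.Size([2, 16, 32, 32])
```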

Compatibility

Learning Dynamic Routing for Semantic Segmentation (Yanwei et al. 2020)

SkipNet: Learning Dynamic Routing in Convolutional Networks (Wang et al. 2017); see the gating sketch after this list

Learning to Control Fast-Weight Memories: An Alternative to Dynamic Recurrent Networks (Jürgen, 1992)
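
The dynamic-routing papers above (SkipNet in particular) decide per input whether a block should be executed at all. The sketch below shows a gated residual block in that spirit: a tiny gate computed from pooled features scales the block's output during training and skips the body outright at inference. The gate design, the 0.5 threshold, and the batch-level skip are simplifying assumptions, not the papers' exact mechanisms (SkipNet, for instance, learns hard gates with reinforcement learning).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedResidualBlock(nn.Module):
    """Residual block whose body can be skipped per input, SkipNet-style."""

    def __init__(self, channels=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        # Tiny gate: pooled features -> scalar "execute this block?" score.
        self.gate = nn.Linear(channels, 1)

    def forward(self, x):
        score = torch.sigmoid(self.gate(F.adaptive_avg_pool2d(x, 1).flatten(1)))  # (B, 1)
        if not self.training:
            # Hard decision at inference (made for the whole batch for simplicity):
            # skip the body entirely when the gate says so.
            if (score < 0.5).all():
                return x
        # Soft gating keeps the decision differentiable during training.
        return x + score.view(-1, 1, 1, 1) * self.body(x)


if __name__ == "__main__":
    block = GatedResidualBlock().eval()
    x = torch.randn(2, 16, 8, 8)
    print(block(x).shape)  # torch.Size([2, 16, 8, 8])
```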

Dynamic neural networks with GNNs

Streaming Graph Neural Networks (Yao et al. 2018)

Predicting dynamic embedding trajectory in temporal interaction networks (Srijan et al. 2019)

Graph Neural Networks: A Review of Methods and Applications (Jie et al. 2019)

Semi-supervised classification with graph convolutional networks (Thomas & Max, 2017); see the GCN layer sketch after this list
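
The dynamic and streaming graph models listed above build on the static graph convolution of Kipf & Welling, whose propagation rule is H' = σ(D̂^{-1/2}(A + I)D̂^{-1/2} H W). A dense, minimal version is sketched below; the dense adjacency matrix and the single ReLU layer are simplifications for illustration.

```python
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """One graph convolution step: H' = relu(norm_adj @ H @ W)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, h, adj):
        # Symmetric normalisation of the adjacency with self-loops:
        # A_hat = A + I, then D_hat^{-1/2} A_hat D_hat^{-1/2}.
        a_hat = adj + torch.eye(adj.size(0))
        d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        norm_adj = d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)
        return torch.relu(norm_adj @ self.linear(h))


if __name__ == "__main__":
    adj = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])  # 3-node path graph
    h = torch.randn(3, 4)                                           # node features
    print(GCNLayer(4, 8)(h, adj).shape)  # torch.Size([3, 8])
```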

Dynamic neural networks with CNNs

ELASTIC: Improving CNNs with Dynamic Scaling Policies (Huiyu et al. 2018)

Language Modeling with Gated Convolutional Networks (Yann et al. 2017); see the gated-convolution sketch after this list

Long Short-Term Memory (Sepp & Jürgen, 1997)

Deep Residual Learning for Image Recognition (Kaiming et al. 2016)

A Deep Learning Framework for Dynamic Network Link Prediction (Jinyin et al. 2019)

Exploit All the Layers: Fast and Accurate CNN Object Detector with Scale Dependent Pooling and Cascaded Rejection Classifiers (Fan et al. 2016)

Dynamic convolution: Attention over convolution kernels (Yinpeng et al. 2020)

DyNet: Dynamic Convolution for Accelerating Convolutional Neural Networks (Yikang et al. 2020)
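
Gating, as in the gated convolutional language model listed above, is one of the simplest forms of input-dependent computation in a CNN: one convolution produces candidate features and a second one, squashed by a sigmoid, decides how much of each feature passes through. A minimal causal 1-D sketch follows; the channel count and kernel size are arbitrary illustrative choices.

```python
import torch
import torch.nn as nn


class GatedConv1d(nn.Module):
    """Gated linear unit over a causal 1-D convolution:
    output = conv_a(x) * sigmoid(conv_b(x))."""

    def __init__(self, channels=32, kernel_size=3):
        super().__init__()
        # Pad on both sides, then keep only the first T outputs so that
        # position t never sees inputs beyond t (causal convolution).
        pad = kernel_size - 1
        self.conv_a = nn.Conv1d(channels, channels, kernel_size, padding=pad)
        self.conv_b = nn.Conv1d(channels, channels, kernel_size, padding=pad)

    def forward(self, x):  # x: (batch, channels, time)
        t = x.size(-1)
        a = self.conv_a(x)[..., :t]
        b = self.conv_b(x)[..., :t]
        return a * torch.sigmoid(b)  # the gate modulates each feature per position


if __name__ == "__main__":
    layer = GatedConv1d()
    x = torch.randn(2, 32, 50)
    print(layer(x).shape)  # torch.Size([2, 32, 50])
```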

Dynamic neural networks in computer vision

Dynamic computational time for visual attention (Zhichao et al. 2017)
