PyTorch implementation of SIREN - Implicit Neural Representations with Periodic Activation Functions
Rethinking Image Inpainting via a Mutual Encoder Decoder with Feature Equalizations. ECCV 2020 Oral
PyTorch implementation of Sinusoidal Representation Networks (SIREN)
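The SIREN repositories above all center on the same idea: replace standard activations with `sin(omega0 * (Wx + b))` and use a frequency-aware weight initialization. A minimal NumPy sketch of one such layer (the `omega0 = 30` default and the uniform initialization bounds follow the SIREN paper; the function names here are illustrative, not any repo's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def siren_layer(in_features, out_features, omega0=30.0, first=False):
    """Initialize one SIREN layer's weights.

    First layer: uniform(-1/n, 1/n); later layers: uniform(-sqrt(6/n)/omega0,
    sqrt(6/n)/omega0), so pre-activations stay in a range where sin() is useful.
    """
    if first:
        bound = 1.0 / in_features
    else:
        bound = np.sqrt(6.0 / in_features) / omega0
    W = rng.uniform(-bound, bound, size=(in_features, out_features))
    b = rng.uniform(-bound, bound, size=out_features)
    return W, b

def siren_forward(x, W, b, omega0=30.0):
    # Periodic activation: sin(omega0 * (xW + b)), output always in [-1, 1]
    return np.sin(omega0 * (x @ W + b))
```

A full SIREN stacks several such layers and is typically trained to map coordinates (e.g. pixel locations) to signal values (e.g. colors).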
Notes on deep learning systems, covering the mathematical foundations of deep learning, detailed explanations of basic neural network components, model training and tuning strategies, model compression algorithms, and a hands-on implementation of a deep learning inference framework.
Korean (Hangul) OCR model design
💩 Sigmoid Colon: The biologically inspired activation function.
PyTorch reimplementation of the Smooth ReLU activation function proposed in the paper "Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations" [arXiv 2022].
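The Smooth ReLU (SmeLU) referenced here replaces ReLU's kink at zero with a quadratic blend over a half-width `beta`, which the paper ties to training reproducibility in large recommendation models. A NumPy sketch of the piecewise definition (my reading of the paper's formula; `beta` defaulting to 1.0 is an assumption for illustration):

```python
import numpy as np

def smelu(x, beta=1.0):
    """Smooth ReLU: 0 below -beta, identity above beta,
    and the smooth quadratic (x + beta)^2 / (4 * beta) in between."""
    return np.where(
        x <= -beta, 0.0,
        np.where(x >= beta, x, (x + beta) ** 2 / (4.0 * beta)),
    )
```

The quadratic piece matches both the value and the slope of the flat and linear regions at the joins, so the activation is continuously differentiable everywhere.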
An easy-to-use library for GLU (Gated Linear Units) and GLU variants in TensorFlow.
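The Gated Linear Unit that this library implements multiplies a linear projection by a sigmoid-gated second projection, `GLU(x) = (xW + b) * sigmoid(xV + c)`; the variants (GEGLU, SwiGLU, etc.) swap the sigmoid for other gates. A minimal NumPy sketch, independent of the library's actual API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def glu(x, W, V, b, c):
    """GLU(x) = (xW + b) * sigmoid(xV + c): the gate controls, per element,
    how much of the linear path passes through."""
    return (x @ W + b) * sigmoid(x @ V + c)
```

In practice both projections are usually computed as one matrix multiply of double width and then split in half along the feature dimension.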
ActTensor: Activation Functions for TensorFlow. https://pypi.org/project/ActTensor-tf/ Authors: Pouya Ardehkhani, Pegah Ardehkhani
[TCAD 2018] Code for “Design Space Exploration of Neural Network Activation Function Circuits”
Implementation for the article "Trainable Activations for Image Classification"
Unofficial PyTorch implementation of the Piecewise Linear Unit (PWLU) dynamic activation function
A PyTorch implementation of funnel activation https://arxiv.org/pdf/2007.11824.pdf
QReLU and m-QReLU: Two novel quantum activation functions for Deep Learning in TensorFlow, Keras, and PyTorch
Source for the paper "Universal Activation Function for machine learning"
Multilayer neural network framework implementation, used for classification and regression task. Can use multiple activation functions with backpropagation based on autograd library. Contains polynomial activation function for regression task.
PyTorch reimplementation of the paper "Padé Activation Units: End-to-end Learning of Flexible Activation Functions in Deep Networks" [ICLR 2020].
3D visualization of common activation functions
Triton reimplementation of the Smooth ReLU activation function proposed in the paper "Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations" [arXiv 2022].