Code Repository for Liquid Time-Constant Networks (LTCs)
Repository for the tutorial on Sequence-Aware Recommender Systems held at TheWebConf 2019 and ACM RecSys 2018
Liquid Structural State-Space Models
PyxLSTM is a Python library that provides an efficient and extensible implementation of the Extended Long Short-Term Memory (xLSTM) architecture. xLSTM enhances the traditional LSTM by introducing exponential gating, memory mixing, and a matrix memory structure, enabling improved performance and scalability for sequence modeling tasks.
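To make the exponential-gating idea concrete, here is a minimal scalar sketch of one sLSTM-style step in the spirit of the xLSTM description above. It is not taken from the PyxLSTM codebase; the function name, the parameter dictionary `W`, and the scalar (single-unit) formulation are all illustrative. Exponential input/forget gates are stabilized by a running max term `m`, and a normalizer state `n` keeps the hidden output bounded.

```python
import numpy as np

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W):
    """One scalar sLSTM-style step with exponential gating (illustrative sketch).

    W is a hypothetical dict of scalar weights/biases, e.g. W["wi"], W["ri"], W["bi"]
    for the input gate, and likewise "f" (forget), "z" (cell input), "o" (output).
    """
    i_tilde = W["wi"] * x + W["ri"] * h_prev + W["bi"]
    f_tilde = W["wf"] * x + W["rf"] * h_prev + W["bf"]
    z = np.tanh(W["wz"] * x + W["rz"] * h_prev + W["bz"])
    o = 1.0 / (1.0 + np.exp(-(W["wo"] * x + W["ro"] * h_prev + W["bo"])))

    # Stabilizer: subtracting m before exponentiating prevents overflow.
    m = max(f_tilde + m_prev, i_tilde)
    i = np.exp(i_tilde - m)             # exponential input gate
    f = np.exp(f_tilde + m_prev - m)    # exponential forget gate

    c = f * c_prev + i * z              # cell state
    n = f * n_prev + i                  # normalizer state
    h = o * (c / n)                     # normalized hidden output, |c / n| <= 1
    return h, c, n, m
```

Because `c / n` is a weighted average of past `tanh` cell inputs, the hidden state stays bounded even though the gates themselves are unbounded exponentials; that is the role of the normalizer in this formulation.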
Implementation of GateLoop Transformer in PyTorch and JAX

Python package for Arabic natural language processing
Sequential model for polyphonic music
VOGUE: Variable Order HMM with Duration
Source code for "A Lightweight Recurrent Network for Sequence Modeling"
PyTorch implementation of Simplified Structured State-Spaces for Sequence Modeling (S5)
Caption Images with Machine Learning
Computer vision tools for analyzing behavioral data, including complex event detection in videos.
An unofficial implementation of "TransAct: Transformer-based Realtime User Action Model for Recommendation at Pinterest" in TensorFlow
A TensorFlow implementation of "Sequence Modeling with Hierarchical Deep Generative Models with Dual Memory" (published at CIKM 2017).
Implementation of the Smith-Waterman local alignment algorithm: finds the best local alignments between two given amino acid sequences, using BLOSUM as the scoring matrix.
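For reference, the core of Smith-Waterman is a small dynamic program: each cell holds the best local-alignment score ending at that pair of positions, floored at zero so alignments can restart anywhere. The sketch below is not from the repository above; for brevity it uses a simple match/mismatch score in place of a BLOSUM substitution matrix, and returns only the best score rather than a traceback.

```python
def smith_waterman(a, b, score=lambda x, y: 2 if x == y else -1, gap=-2):
    """Best local-alignment score between sequences a and b.

    Illustrative sketch: `score` is a toy match/mismatch function standing in
    for a BLOSUM lookup, and `gap` is a linear gap penalty.
    """
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]  # DP matrix, first row/col stay 0
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            H[i][j] = max(
                0,                                      # restart the alignment
                H[i - 1][j - 1] + score(a[i - 1], b[j - 1]),  # align a[i-1], b[j-1]
                H[i - 1][j] + gap,                      # gap in b
                H[i][j - 1] + gap,                      # gap in a
            )
            best = max(best, H[i][j])
    return best
```

Swapping the toy `score` for a BLOSUM lookup (a dict keyed by amino-acid pairs) recovers the protein-alignment setting the repository describes.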
Audio and Music Synthesis with Machine Learning
A Generative Model for Audio in the Frequency Domain
Generate music with LSTM model
Multiple EM for Motif Elicitation (MEME) for discovering motifs in a group of related DNA or protein sequences.