KdaiP/conformer-RoPE

conformer-RoPE

A refined Conformer block with Rotary Position Embedding (RoPE), modified from lucidrains' implementation

Modifications:

  1. Use Rotary Position Embedding (RoPE) instead of relative positional embedding

  2. Use PyTorch's official implementations of the GLU and Swish activations, which are slightly faster

  3. Use PyTorch's official scaled_dot_product_attention, which automatically dispatches to FlashAttention or memory-efficient (xFormers-style) kernels when available

  4. Remove the dependency on einops; only PyTorch is required now
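As a rough illustration of modifications 1 and 3, the sketch below applies a minimal rotary embedding to the queries and keys and feeds them to torch.nn.functional.scaled_dot_product_attention. The rotary_embed helper here is illustrative only and is not this repository's actual function; the real code integrates the rotation into the attention module and may differ in detail.

```python
import torch
import torch.nn.functional as F

def rotary_embed(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply rotary position embedding to x of shape [batch, heads, time, dim].

    Each consecutive channel pair is rotated by a position-dependent angle,
    so attention scores end up depending only on relative positions.
    Illustrative helper, not the repository's implementation.
    """
    b, h, t, d = x.shape
    half = d // 2
    # per-pair rotation frequencies, as in the RoPE paper
    freqs = base ** (-torch.arange(0, half, dtype=x.dtype, device=x.device) / half)
    angles = torch.arange(t, dtype=x.dtype, device=x.device)[:, None] * freqs[None, :]
    cos, sin = angles.cos(), angles.sin()  # each [time, half], broadcast over batch/heads
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = torch.empty_like(x)
    # standard 2D rotation of each (x1, x2) channel pair
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

# attention over rotated queries/keys via the official fused kernel
q = torch.randn(2, 4, 35, 64)
k = torch.randn(2, 4, 35, 64)
v = torch.randn(2, 4, 35, 64)
attn_out = F.scaled_dot_product_attention(rotary_embed(q), rotary_embed(k), v)
print(attn_out.shape)  # torch.Size([2, 4, 35, 64])
```

Because the rotation is norm-preserving, it changes only the phase of each channel pair, leaving the magnitude of queries and keys untouched.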

Usage

import torch
from conformer import Conformer

model = Conformer(n_layers=3, 
                  hidden_channels=192, 
                  filter_channels=768, 
                  n_heads=2, 
                  kernel_size=3)

x = torch.randn(32, 192, 35) # input shape: [batch_size, hidden_channels, time]
y = model(x) # output shape: [32, 192, 35]
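Modification 2 above refers to PyTorch's built-in GLU and Swish activations (Swish is exposed as SiLU in PyTorch). A quick sanity check of what those built-ins compute, independent of this repository's code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(8, 10)

# nn.GLU splits the input in half along `dim` and gates the first
# half with the sigmoid of the second half.
glu = nn.GLU(dim=-1)
a, b = x.chunk(2, dim=-1)
manual_glu = a * torch.sigmoid(b)
print(torch.allclose(glu(x), manual_glu))  # True

# Swish (SiLU): x * sigmoid(x)
manual_swish = x * torch.sigmoid(x)
print(torch.allclose(F.silu(x), manual_swish))  # True
```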
