VisualizerX

A modification of https://github.com/luo3300612/Visualizer that adds support for capturing multiple local variables from a single function.

Install

pip install bytecode
python setup.py install
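
To check that the package is importable after installation (an illustrative sanity check, not part of the project's own documentation):

python -c "from visualizer import get_local; print(get_local)"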

Usage

Decorate the target function with the 'get_local' decorator, passing the names of the local variables you want to capture:

Example 1

Model

from visualizer import get_local
@get_local('attention_map1', 'attention_map2')  # names of the local variables to capture
def your_attention_function(*args, **kwargs):
    ...
    attention_map1 = ... 
    attention_map2 = ... 
    ...
    return ...

Visualize

from visualizer import get_local
get_local.activate() # activate the decorator before importing the model!
from ... import model 

# load model and data
...
out = model(data)

cache = get_local.cache # ->  {'your_attention_function.attention_map1': [attention_map], 'your_attention_function.attention_map2': [attention_map]}
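
Each cache entry is a list containing one stored value per call to the decorated function, so after the forward pass the maps can be plotted with ordinary tools. A minimal plotting sketch; the array layout, head indexing, and any conversion from torch tensors are assumptions about your model, not guarantees from the library:

import numpy as np
import matplotlib.pyplot as plt

attn = get_local.cache['your_attention_function.attention_map1'][0]  # first recorded call
attn = np.asarray(attn)  # assumption: the cached value is a numpy array or convertible;
                         # if it is still a torch tensor, use attn.detach().cpu().numpy()

# assumption: layout is (heads, tokens, tokens); adjust the indexing to your model
plt.imshow(attn[0], cmap='viridis')
plt.colorbar()
plt.title('attention_map1, head 0')
plt.show()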

Example 2

from visualizer import get_local
import torch.nn as nn

class Attention(nn.Module):
    def __init__(self):
        ...
    
    @get_local('attn_map')
    def forward(self, x):
        ...
        attn_map = ...
        ...
        return ...
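
For the class-based case the workflow is the same: activate the decorator before the module containing Attention is defined or imported, run a forward pass, then read get_local.cache. Below is a self-contained sketch with a toy attention layer; the attention computation and tensor shapes are illustrative assumptions, and the exact cache key is printed rather than assumed:

from visualizer import get_local
get_local.activate()  # must run before the decorated class is defined or imported

import torch
import torch.nn as nn

class Attention(nn.Module):
    def __init__(self, dim=16):
        super().__init__()
        self.qkv = nn.Linear(dim, dim * 3)

    @get_local('attn_map')
    def forward(self, x):
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn_map = (q @ k.transpose(-2, -1)).softmax(dim=-1)  # (batch, tokens, tokens), unscaled toy example
        return attn_map @ v

model = Attention()
out = model(torch.randn(2, 8, 16))
print(get_local.cache.keys())  # inspect the generated key rather than guessing its exact form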
