
Extracting Weights from Model? #1

Open
Lysosome opened this issue Feb 28, 2019 · 1 comment

Comments

@Lysosome

Hi! I'm interested in extracting the attention weights on each variable (I have 75 different variables) to determine the relative importance of each variable in predicting the output. How would I go about extracting these weights?
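The usual approach in Keras-style code is to build a second model whose output is the attention layer's activations, but that depends on this repository's specific layer names. Here is a framework-free numpy sketch of the idea instead: `score_weights` is a hypothetical learned scoring vector (a stand-in for whatever the model's attention layer learned), and the per-variable importance is the softmax of the scores, averaged over a batch.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_importance(inputs, score_weights):
    """Per-variable importance as batch-averaged softmax attention scores.

    inputs: (batch, n_vars) array of model inputs.
    score_weights: (n_vars,) hypothetical learned scoring parameters.
    Returns an (n_vars,) vector that sums to 1.
    """
    scores = inputs * score_weights       # (batch, n_vars) raw scores
    weights = softmax(scores, axis=-1)    # normalize across the variables
    return weights.mean(axis=0)           # average over the batch

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 75))             # 32 samples, 75 variables
w = rng.normal(size=75)
importance = attention_importance(X, w)
print(importance.shape, importance.sum()) # (75,) and a sum of ~1.0
```

Because each row of `weights` sums to 1, the batch average is itself a distribution over the 75 variables, which is what makes it interpretable as relative importance.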

@rjpg

rjpg commented Mar 10, 2020

I have read the paper ... it is very confusing ... simple things are made complicated.

There is no "relative importance of each variable" here, since they use a sigmoid function instead of softmax ...

The word "attention" implies a softmax function, since softmax assigns relative importance to parts of the data ... sigmoid does not do that ...

With a sigmoid it is just a residual block (skip connection) ...
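To illustrate the point numerically: softmax turns a score vector into a distribution that sums to 1, so the variables compete and the outputs can be read as relative importance; an elementwise sigmoid gates each variable independently, so the outputs need not sum to anything meaningful. A minimal numpy check:

```python
import numpy as np

scores = np.array([2.0, 1.0, 0.1])

# Softmax: competition across variables; outputs form a distribution.
softmax = np.exp(scores) / np.exp(scores).sum()

# Sigmoid: an independent 0..1 gate per variable; no competition.
sigmoid = 1.0 / (1.0 + np.exp(-scores))

print(softmax, softmax.sum())  # sums to exactly 1.0
print(sigmoid, sigmoid.sum())  # sum is arbitrary (here > 1)
```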

In summary, another example of scientific theater.

There are case studies that are well defined for comparing results, like the raw temporal data of HAR UCI.

It is very easy to make a standard model give bad results and your own model give nice results when you get to define the temporal windows yourself. Choose a known, well-defined case study to claim that your model gives state-of-the-art results.

Make your code public, and hold out the data from this known, well-defined case study so we can test ...
