
How can I obtain the attentional weights of feature interactions in AFM? #10

Closed
chenfu12138 opened this issue Dec 13, 2018 · 5 comments

@chenfu12138

No description provided.

@shenweichen
Owner

shenweichen commented Dec 13, 2018

Hi @chenfu12138, there are two steps to get the attentional weights of feature interactions in AFM.
First, make sure you have installed the latest release version of deepctr (currently v0.1.5) from pip.
You can install that version with pip install deepctr==0.1.5.

Step 1: Modify two lines of the source code

Please modify layers.py on your local machine; the path is probably something like
xxx\Anaconda3\Lib\site-packages\deepctr\layers.py

In lines 134 and 135, change the following two lines

attention_weight =tf.nn.softmax(tf.tensordot(attention_temp,self.projection_h,axes=(-1,0)),dim=1) 
attention_output = tf.reduce_sum(attention_weight*bi_interaction,axis=1) 

to

self.normalized_att_score=tf.nn.softmax(tf.tensordot(attention_temp,self.projection_h,axes=(-1,0)),dim=1) 
attention_output = tf.reduce_sum(self.normalized_att_score*bi_interaction,axis=1)

I will modify this in the next release.
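
For context, here is a minimal toy sketch (not the actual deepctr AFMLayer, just an illustration of the pattern): storing the softmax output as a layer attribute inside call() is what makes it reachable from outside once the model has been built.

import tensorflow as tf
from tensorflow.python.keras.layers import Layer

class ToyAttentionLayer(Layer):
    # toy layer: one scalar attention score per interaction along axis 1
    def build(self, input_shape):
        self.projection_h = self.add_weight(name='projection_h',
                                            shape=(int(input_shape[-1]), 1),
                                            initializer='glorot_uniform')
        super(ToyAttentionLayer, self).build(input_shape)

    def call(self, inputs):
        scores = tf.tensordot(inputs, self.projection_h, axes=(-1, 0))
        # kept as an attribute instead of a local variable, as in the patch above,
        # so a helper model can read it after training
        self.normalized_att_score = tf.nn.softmax(scores, axis=1)
        return tf.reduce_sum(self.normalized_att_score * inputs, axis=1)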

Step 2: Get the attentional weights!

After you have finished training the AFM model, run:

from tensorflow.python.keras.models import Model
from tensorflow.python.keras.layers import Lambda

# the AFMLayer is the third layer from the end of the AFM model
afmlayer = model.layers[-3]
# build a helper model whose output is the attention scores stored on that layer
afm_weight_model = Model(model.input, outputs=Lambda(lambda x: afmlayer.normalized_att_score)(model.input))
attentional_weights = afm_weight_model.predict(model_input, batch_size=4096)
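
As a quick sanity check (a hedged sketch using the variables from the snippet above, where n_fields is the number of input feature fields), the number of weights per sample should equal the number of pairwise interactions, i.e. n_fields*(n_fields-1)/2:

# e.g. with 10 feature fields you should see 45 = 10*9/2 interaction weights
print(attentional_weights.shape)  # expected: (n_samples, n_interactions, 1)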

You can try it~

@chenfu12138
Author

@shenweichen It worked, thanks! I have obtained the attentional weights, like this:
[[0.05291238]
[0.05570798]
[0.29247418]
...
[0.2924742 ]
[0.05001876]
[0.04996691]]
But I don't know how to match each attentional weight in the array with its specific feature interaction, and it seems that the mapping between them varies with every training run?

@shenweichen
Owner

shenweichen commented Dec 14, 2018

Hi @chenfu12138
You can use the following code:

import itertools
import deepctr
from tensorflow.python.keras.models import Model
from tensorflow.python.keras.layers import Lambda

feature_dim_dict = {"sparse": sparse_feature_dict, "dense": dense_feature_list}
model = deepctr.models.AFM(feature_dim_dict)
model.fit(model_input, target)

# the AFMLayer is the third layer from the end of the AFM model
afmlayer = model.layers[-3]
# helper model whose output is the attention scores stored on that layer
afm_weight_model = Model(model.input, outputs=Lambda(lambda x: afmlayer.normalized_att_score)(model.input))
attentional_weights = afm_weight_model.predict(model_input, batch_size=4096)
# the feature pairs, in the same order as the pairwise interactions inside AFM
feature_interactions = list(itertools.combinations(list(feature_dim_dict['sparse'].keys()) + feature_dim_dict['dense'], 2))

attentional_weights[:, i, 0] gives feature_interactions[i]'s attentional weight for all samples.
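
For example, a small follow-up sketch (using the names from the snippet above, assuming it has been run) that averages each interaction's weight over all samples and prints the feature pairs sorted by attention:

import numpy as np

# mean attention weight per feature interaction, averaged over all samples
mean_weights = attentional_weights[:, :, 0].mean(axis=0)
ranking = sorted(zip(feature_interactions, mean_weights),
                 key=lambda pair: pair[1], reverse=True)
for (feat_a, feat_b), weight in ranking:
    print("%s x %s: %.4f" % (feat_a, feat_b, weight))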

Try it and star it if it helps 😉

@chenfu12138
Author

@shenweichen Perfect solution! It's so nice of you! Starred it, of course!!!

@shenweichen
Owner

shenweichen commented Dec 19, 2018

Hi, the latest version v0.2.0 has been released; please upgrade through pip install -U deepctr ~

@shenweichen shenweichen added this to the v0.1.6 milestone Dec 20, 2018
@shenweichen shenweichen pinned this issue Dec 27, 2018
@shenweichen shenweichen unpinned this issue Sep 21, 2019