
How to use Explaining_Decision_of_Time_Series_Data

(1). Define the model structure from the trained model to compute relevance scores

# Collect the trained weights and recorded activations from the TF-1.x graph
weights = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='scopename')
activations = tf.get_collection('activation_collection_name')
X = activations[0]  # input tensor of the network

conv_ksize = [1, 4, 4, 1]    # convolution layer filter size
pool_ksize = [1, 4, 4, 1]    # pooling layer filter size
conv_strides = [1, 1, 1, 1]  # convolution layer stride size
pool_strides = [1, 4, 4, 1]  # pooling layer stride size


taylor = Taylor(activations, weights, conv_ksize, pool_ksize, conv_strides, pool_strides, 'Taylor', part)

Rs = []
for i in range(num_classes):  # number of classes
    Rs.append(taylor(i))

relevances = sess.run(Rs, feed_dict={X: batch_in, model.keep_prob: p})
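The relevance computed by `taylor(i)` follows deep Taylor decomposition. For intuition, here is a minimal NumPy sketch of the z⁺-rule the method uses to redistribute relevance through one dense layer; the function name and shapes are illustrative, not part of this repository's API:

```python
import numpy as np

def zplus_relevance(R_out, a_in, W, eps=1e-9):
    """Redistribute relevance R_out from a layer's output to its input
    using the z+-rule of deep Taylor decomposition.
    a_in : (d_in,)  non-negative input activations (e.g. after ReLU)
    W    : (d_in, d_out) weight matrix
    R_out: (d_out,) relevance assigned to the output neurons"""
    Wp = np.maximum(W, 0.0)   # only positive weights contribute
    z = a_in @ Wp + eps       # positive pre-activations (stabilized)
    s = R_out / z             # normalize relevance per output neuron
    return a_in * (Wp @ s)    # redistribute in proportion to a_i * w_ij+
```

Summed input relevance equals summed output relevance (up to the stabilizer `eps`), which is the conservation property the decomposition is built on.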

ref: "Explaining NonLinear Classification Decisions with Deep Taylor Decomposition" (Montavon et al.)


  • tensorflow (1.9.0)
  • numpy (1.15.0)
  • matplotlib (2.2.2)
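matplotlib is presumably used to visualize the relevance scores over the input series. A hedged sketch of such a plot (the helper name and layout are my own, not taken from this repository):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so no display is needed
import matplotlib.pyplot as plt

def plot_relevance(x, R, path="relevance.png"):
    """Overlay per-timestep relevance (color) on a time series (line)."""
    fig, ax = plt.subplots(figsize=(8, 2.5))
    t = np.arange(len(x))
    ax.plot(t, x, color="black", linewidth=1)
    sc = ax.scatter(t, x, c=R, cmap="coolwarm", s=12)
    fig.colorbar(sc, ax=ax, label="relevance")
    ax.set_xlabel("time step")
    fig.savefig(path, bbox_inches="tight")
    plt.close(fig)
```

Here `x` would be one input series from `batch_in` and `R` the corresponding slice of the relevance returned by `sess.run`.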


Apache License 2.0


If you have any questions, please contact Ginkyeong Lee, Sohee Cho, or Seonman Heo.

XAI Project

This work was supported by Institute for Information & Communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (No.2017-0-01779, A machine learning and statistical inference framework for explainable artificial intelligence)

  • Project Name : A machine learning and statistical inference framework for explainable artificial intelligence (development of a human-level learning and reasoning framework capable of explaining the reasons for its decisions)

  • Managed by Ministry of Science and ICT/XAIC

  • Participating Affiliations : KAIST, Korea Univ., Yonsei Univ., UNIST, AITRICS

  • Web Site :
