Commit

cross ref
Xmaster6y committed Dec 28, 2023
1 parent 91fc944 commit e19db29
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion pages/_drafts/layer-wise-relevance-propagation.md
@@ -35,10 +35,11 @@ With $R_j^{[l]}$ being the $j$-th neuron's relevance of the layer $l$, and the p

$$
\begin{equation}
- %\label{eq:aggregate}
+ \label{eq:aggregate}
R_{j}^{[l]}=\sum_{k}\dfrac{w_{jk}}{\sum_j w_{jk}}R_k^{[l+1]}
\end{equation}
$$
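The aggregation rule above can be sketched in a few lines of numpy. This is a minimal illustration, not the post's implementation: the function name `lrp_backward` and the `eps` stabilizer (added to avoid division by zero when a column of weights sums to zero) are assumptions, and the sketch propagates relevance through plain weights exactly as the equation writes it, without the pre-activation terms that some LRP variants include.

```python
import numpy as np

def lrp_backward(W, R_next, eps=1e-9):
    """Propagate relevance one layer back, following the equation above.

    W      : (n_l, n_l1) weight matrix, W[j, k] = w_{jk}
    R_next : (n_l1,) relevances R_k^{[l+1]} of the upper layer
    eps    : small stabilizer, an assumption not in the equation
    Returns (n_l,) relevances R_j^{[l]}.
    """
    # Denominator: sum over j of w_{jk}, one normalizer per upper neuron k.
    z = W.sum(axis=0) + eps
    # R_j = sum_k w_{jk} / z_k * R_k, as a single matrix-vector product.
    return W @ (R_next / z)
```

Note that the rule is conservative: because each column of $W$ is divided by its own sum, the total relevance $\sum_j R_j^{[l]}$ equals $\sum_k R_k^{[l+1]}$ (up to the `eps` stabilizer).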

### Different Rules

### Technical Details
