From 83a525bf59388cb8d1bfddea4f8b515f457c1231 Mon Sep 17 00:00:00 2001
From: Jirka B
Date: Tue, 1 Apr 2025 08:13:46 -0400
Subject: [PATCH] docs: fix broken link to SelfAttention

---
 course_UvA-DL/05-transformers-and-MH-attention/MHAttention.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/course_UvA-DL/05-transformers-and-MH-attention/MHAttention.py b/course_UvA-DL/05-transformers-and-MH-attention/MHAttention.py
index 85b07fa8d..6e7e0155f 100644
--- a/course_UvA-DL/05-transformers-and-MH-attention/MHAttention.py
+++ b/course_UvA-DL/05-transformers-and-MH-attention/MHAttention.py
@@ -118,7 +118,7 @@
 # * [Attention?
 # Attention!
 # (Lilian Weng, 2018)](https://lilianweng.github.io/lil-log/2018/06/24/attention-attention.html) - A nice blog post summarizing attention mechanisms in many domains including vision.
-# * [Illustrated: Self-Attention (Raimi Karim, 2019)](https://towardsdatascience.com/illustrated-self-attention-2d627e33b20a) - A nice visualization of the steps of self-attention.
+# * [Illustrated: Self-Attention (Raimi Karim, 2019)](https://medium.com/data-science/illustrated-self-attention-2d627e33b20a) - A nice visualization of the steps of self-attention.
 # Recommended going through if the explanation below is too abstract for you.
 # * [The Transformer family (Lilian Weng, 2020)](https://lilianweng.github.io/lil-log/2020/04/07/the-transformer-family.html) - A very detailed blog post reviewing more variants of Transformers besides the original one.