From 7bd768a6dc9630db695dbddb6ada6dbe7f6886e1 Mon Sep 17 00:00:00 2001
From: linogaliana
Date: Mon, 28 Aug 2023 09:14:55 +0000
Subject: [PATCH] Erreur image

---
 content/NLP/index.qmd | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/content/NLP/index.qmd b/content/NLP/index.qmd
index 3acbadee1..12c6f89ff 100644
--- a/content/NLP/index.qmd
+++ b/content/NLP/index.qmd
@@ -131,7 +131,7 @@ sophistiqué d'analyse du langage:

 ::: {#fig-encoder}

-![Surus](surus.png){#fig-surus}
+![Illustration transformer architecture](https://substackcdn.com/image/fetch/w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81c2aa73-dd8c-46bf-85b0-90e01145b0ed_1422x1460.png){#fig-encoder-decoder}

 Illustration of the original transformer architecture proposed in [Attention Is All You Need, 2017](https://arxiv.org/abs/1706.03762) (source: [Sebastien Raschka](https://magazine.sebastianraschka.com/p/understanding-encoder-and-decoder))