
Commit

Merge overleaf-2023-06-05-1722 into main
veekaybee committed Jun 5, 2023
2 parents 55f67a7 + 19bd664 commit 3cd02c8
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion embeddings.tex
@@ -267,7 +267,7 @@ \section{Introduction}
 \begin{figure}[H]
 \centering
 \includegraphics[width=.7\linewidth]{figures/embeddings_1.png}
-\caption{Embeddings papers in Arxiv by month. It's interesting to note the decline in frequency of embeddings-specific papers, possibly in tandem with the rise of deep learning architectures like GPT \href{https://github.com/veekaybee/embeddings_code/blob/main/fig_2_embeddings_papers.ipynb}{source}}
+\caption{Embeddings papers in Arxiv by month. It's interesting to note the decline in frequency of embeddings-specific papers, possibly in tandem with the rise of deep learning architectures like GPT \href{https://github.com/veekaybee/what_are_embeddings/blob/main/notebooks/fig_2_embeddings_papers.ipynb}{source}}
 \end{figure}
 
 Building and expanding on the concepts in Word2Vec, the Transformer \citep{vaswani2017attention} architecture, with its self-attention mechanism, a much more specialized case of calculating context around a given word, has become the de-facto way to learn representations of growing multimodal vocabularies, and its rise in popularity both in academia and in industry has caused embeddings to become a staple of deep learning workflows.
