happyBeagle/Transformer_Practice
Transformer Practice

๐Ÿฎ Reference

๐Ÿญ Intro

Deep Learning ํ•™์Šต์„ ํ•˜๊ฑฐ๋‚˜ ๋Œ€ํšŒ๋ฅผ ๋‚˜๊ฐ€๋ณด๋ฉด Transformer๊ธฐ๋ฐ˜ ๋ชจ๋ธ ํ˜น์€ Transformer๊ทธ ์ž์ฒด๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ์„ฑ๋Šฅ์„ ๋‚ด๊ฑฐ๋‚˜ ์‚ฌ์šฉ์„ ํ•˜์˜€๋‹ค๋Š” ์†Œ๋ฆฌ๋ฅผ ๋งŽ์ด ๋“ฃ๋Š”๋‹ค. ๊ทธ๋Ÿฐ๋ฐ ๋‚˜๋Š” ๋Œ€ํšŒ์—์„œ ํ˜น์€ ์Šคํ„ฐ๋”” ๋ชจ์ž„์—์„œ Transformer๋ฅผ ์‚ฌ์šฉํ•ด ๋ณด์•˜์ง€๋งŒ, ํ•ด๋‹น ๋ชจ๋ธ์— ๋Œ€ํ•œ ์ดํ•ด๋„๊ฐ€ ๋ถ€์กฑํ•˜์˜€๋‹ค๋Š” ๊ฒƒ์„ ๊นจ๋‹ฌ์•˜๋‹ค. ๊ทธ๋ž˜์„œ ์ด์ œ Transformer์ž์ฒด์— ๋Œ€ํ•ด ํ•™์Šต์„ ํ•ด๋ณด๊ณ ์žํ•œ๋‹ค.

Although the Transformer is the most famous model for it's high perfomance in Deep Learning fields, I have been implementing without deep understanding. This repository is for studying the Transformer.

```
$> tree -d
.
├── /config
│     └── configuration files
├── /modules
│     ├── multihead_attention_layer.py
│     ├── encoder_layer.py
│     ├── decoder_layer.py
│     ├── transformer.py
│     └── utils.py
├── train.py
└── evaluation.py
```

๐Ÿง What Is Transformer?

  • TODO
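Until this section is written up, the core idea can be sketched: the Transformer is built around scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V. Below is a minimal NumPy sketch of that formula; the function names are illustrative and are not taken from this repository's modules.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    q, k, v: arrays of shape (..., seq_len, d_k); mask (optional):
    boolean array broadcastable to the score matrix, True = attend.
    """
    d_k = q.shape[-1]
    # Similarity scores between every query and every key, scaled by sqrt(d_k).
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)
    if mask is not None:
        # Push masked positions toward -inf so softmax gives them ~0 weight.
        scores = np.where(mask, scores, -1e9)
    weights = softmax(scores, axis=-1)  # rows sum to 1
    return weights @ v, weights

# Usage: a batch of 2 sequences, length 4, dimension 8.
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4, 8))
k = rng.normal(size=(2, 4, 8))
v = rng.normal(size=(2, 4, 8))
out, w = scaled_dot_product_attention(q, k, v)
```

The output keeps the query shape, and each row of the attention weights is a probability distribution over the keys.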

๐Ÿฐ How to Develop Transformer?

  • TODO
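One step the module list implies (for `multihead_attention_layer.py`) is splitting the model dimension into heads before attention and merging the heads back afterwards. A hedged NumPy sketch of that reshape, with hypothetical helper names of my own choosing:

```python
import numpy as np

def split_heads(x, n_heads):
    # (batch, seq, d_model) -> (batch, n_heads, seq, d_model // n_heads)
    b, s, d = x.shape
    assert d % n_heads == 0, "d_model must be divisible by the head count"
    return x.reshape(b, s, n_heads, d // n_heads).transpose(0, 2, 1, 3)

def merge_heads(x):
    # (batch, n_heads, seq, d_head) -> (batch, seq, n_heads * d_head)
    b, h, s, dh = x.transpose(0, 2, 1, 3).shape[0], x.shape[1], x.shape[2], x.shape[3]
    return x.transpose(0, 2, 1, 3).reshape(x.shape[0], x.shape[2], x.shape[1] * x.shape[3])

# Usage: split a (2, 4, 8) tensor into 2 heads and merge back.
x = np.arange(2 * 4 * 8, dtype=float).reshape(2, 4, 8)
heads = split_heads(x, 2)   # shape (2, 2, 4, 4)
merged = merge_heads(heads) # round-trips back to (2, 4, 8)
```

Keeping attention itself head-agnostic and doing the head bookkeeping in these two reshapes is a common design choice; each head then attends over its own d_model / n_heads slice.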

TODO List

  • Organize the markdown
  • Organize the repository structure
  • Implement the Transformer
    • Implement attention
    • Implement the Transformer model
    • Implement the training code
    • Implement the evaluation code
