torch == 1.7.1
torchtext == 0.8.1
📚 Study log!
🤗 Reproduce each paper in package form, without using the `transformers` library
📆 Target period: 2021.01.06 ~ 2021.06.30
🎮 The paper-implementation workflow is:
read the paper
-> read the reference code
-> write the code
-> run experiments under the same conditions and check that the reported performance is reproduced
No. | Paper | Date | Status | arXiv | Notion | Reference code |
---|---|---|---|---|---|---|
01 | Convolutional Neural Networks for Sentence Classification (2014) | 0106 | Done | paper | notion | reference code, official code |
02 | Sequence to Sequence Learning with Neural Networks (2014) | 0203 | WIP | paper | notion | reference code1, reference code2 |
03 | Neural Machine Translation by Jointly Learning to Align and Translate (2016) | 0303 | Done | paper | notion | reference code |
04 | Attention is All You Need (2017) | 0406 | WIP | paper | notion | reference code |
05 | Deep contextualized word representations (2018) | 0504 | Done | paper | notion | - |
06 | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (2019) | 0608 | Done | paper | notion | - |
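Paper 04 (Attention is All You Need) centers on scaled dot-product attention. As a rough illustration in plain PyTorch (the function name, tensor shapes, and mask convention below are my own, not taken from the reference code):

```python
import math
import torch

def scaled_dot_product_attention(query, key, value, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    d_k = query.size(-1)
    # similarity scores, scaled to keep softmax gradients stable
    scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        # illustrative convention: positions where mask == 0 are ignored
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return torch.matmul(weights, value), weights

# toy input: batch=2, heads=4, seq_len=5, d_k=8
q = k = v = torch.randn(2, 4, 5, 8)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 4, 5, 8])
```

Each row of `attn` is a softmax distribution over the keys, so it sums to 1 along the last dimension.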
📚 Download the notebooks from https://tutorials.pytorch.kr/, run them, write up anything unfamiliar, summarize, and make (and solve) practice problems
📆 Target period: 2020.09.30 ~ 2020.12.31
No. | Tutorial | Date | Status | Key topics | Link |
---|---|---|---|---|---|
01 | what is torch.nn? | 1010 | Done | `nn.functional`, `nn.Module`, `nn.Linear`, `optim` | tutorial |
02 | TensorBoard | 1012 | Done | `tensorboard` | tutorial |
03 | Classifying names | 1013 | Done | `nn.Linear`, `Dataset`, `DataLoader` | tutorial |
04 | Generating names | 1019 | Done | `nn.GRU`, `.to(device)` | tutorial |
05 | seq2seq translation | 1030 | Done | `nn.Embedding`, `torch.save`, `torch.load` | tutorial |
06 | torchtext classification | 1023 | Done | `torchtext`, `Field`, `nn.EmbeddingBag` | tutorial |
07 | torchtext translation | 1026 | Done | `TabularDataset`, `BucketIterator` | tutorial |
08 | seq2seq modeling | 1022 | Done | `nn.TransformerEncoder` | tutorial |
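The `Dataset`/`DataLoader` pattern from tutorial 03 boils down to the sketch below (the class name and toy tensors are invented for illustration):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ToyPairs(Dataset):
    """Toy dataset pairing a feature vector with an integer class label."""
    def __init__(self, n=16, dim=4):
        self.x = torch.randn(n, dim)
        self.y = torch.randint(0, 3, (n,))

    def __len__(self):
        # number of samples; DataLoader uses this for batching/shuffling
        return len(self.x)

    def __getitem__(self, idx):
        # return one (features, label) pair
        return self.x[idx], self.y[idx]

# DataLoader handles shuffling and collating samples into batches
loader = DataLoader(ToyPairs(), batch_size=4, shuffle=True)
xb, yb = next(iter(loader))
print(xb.shape, yb.shape)  # torch.Size([4, 4]) torch.Size([4])
```

The same three-method skeleton (`__init__`, `__len__`, `__getitem__`) works for any map-style dataset, e.g. the name-classification data in the tutorial.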