4ab46088eec4,005
http://nhiro.org.s3.amazonaws.com/9/9/990e295a4f4d320f78fa1a2753e43c36.jpg https://gyazo.com/990e295a4f4d320f78fa1a2753e43c36
(OCR text)
6
Transformer (2017)
Which brings us to the Transformer.
2017 "Attention Is All You Need"E
が出る。再起結合も畳み込みも一切使わず
注意機構だけで良い成績を出す。
>We propose a new simple network
architecture, the Transformer, based solely
on attention mechanisms, dispensing with
recurrence and convolutions entirely.