Natural Language Processing with Deep Learning
https://gyazo.com/bb45d0ef28f156cfa28d7b75ae6a44e5
Amazon
Publisher: https://www.kspub.co.jp/book/detail/1529243.html
Chapter 1: Approaches to Natural Language Processing
1.1 Traditional Natural Language Processing
1.2 Expectations for deep learning
1.3 Characteristics of text data
1.4 Expansion into other fields
Chapter 2: Fundamentals of Neural Networks
2.1 Supervised learning
2.2 Forward Propagating Neural Networks
2.3 Activation Functions
2.4 Gradient method
2.5 Error Backpropagation
2.6 Recurrent Neural Networks #RNN
2.7 Gated Recurrent Neural Networks
LSTM
GRU
2.8 Tree-structured recurrent neural nets
recursive neural networks (Tree-RNN)
2.9 Convolutional Neural Networks #CNN
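As a side note on 2.7: a minimal NumPy sketch of a single LSTM step using the standard gate formulation. This is my own illustration, not code from the book; the variable names and shapes are assumptions made for the example.
```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step (standard formulation).

    x      : input vector, shape (d_in,)
    h_prev : previous hidden state, shape (d_hid,)
    c_prev : previous cell state, shape (d_hid,)
    W, U   : input / recurrent weights, shapes (4*d_hid, d_in) and (4*d_hid, d_hid)
    b      : bias, shape (4*d_hid,)
    """
    z = W @ x + U @ h_prev + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input, forget, output gates
    g = np.tanh(g)                                # candidate cell update
    c = f * c_prev + i * g                        # new cell state
    h = o * np.tanh(c)                            # new hidden state
    return h, c

# Tiny usage example with random weights.
d_in, d_hid = 3, 4
rng = np.random.default_rng(0)
W = rng.normal(size=(4 * d_hid, d_in))
U = rng.normal(size=(4 * d_hid, d_hid))
b = np.zeros(4 * d_hid)
h, c = lstm_step(rng.normal(size=d_in), np.zeros(d_hid), np.zeros(d_hid), W, U, b)
```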
Chapter 3: Fundamentals of Deep Learning for Language Processing
3.1 Preparation: Bridging the world of symbols and the world of vectors
3.2 Language Model
3.3 Distributed representations
LBL model (log-bilinear model)
word2vec: skip-gram, CBoW
3.4 Sequence transformation models
seq2seq
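For 3.3: a minimal sketch of how skip-gram scores a context word given a center word (a softmax over the vocabulary). Illustrative only; the embedding matrices and sizes below are placeholders I chose, not the book's code.
```python
import numpy as np

def skipgram_probs(center_id, V_in, V_out):
    """p(context word | center word) as a softmax over the vocabulary.

    V_in  : input (center-word) embeddings, shape (vocab, dim)
    V_out : output (context-word) embeddings, shape (vocab, dim)
    """
    scores = V_out @ V_in[center_id]   # one score per vocabulary word
    scores -= scores.max()             # numerical stability
    e = np.exp(scores)
    return e / e.sum()

# Toy usage with random embeddings.
vocab, dim = 10, 5
rng = np.random.default_rng(1)
V_in = rng.normal(scale=0.1, size=(vocab, dim))
V_out = rng.normal(scale=0.1, size=(vocab, dim))
print(skipgram_probs(3, V_in, V_out))  # distribution over context words for word 3
```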
Chapter 4: Developments in Deep Learning Specific to Language Processing
4.1 Attention mechanism
4.2 Memory Network
4.3 Output Layer Acceleration
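For 4.1: a minimal sketch of dot-product attention over encoder states, one common variant of the attention mechanism (the book may use a different scoring function). Names and shapes are my own assumptions.
```python
import numpy as np

def attention(query, keys, values):
    """query: (d,), keys/values: (T, d). Returns the context vector and the weights."""
    scores = keys @ query                            # similarity of the query to each encoder state
    scores -= scores.max()                           # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over the T time steps
    context = weights @ values                       # weighted sum of encoder states
    return context, weights

# Toy usage: attend over 6 random encoder states of dimension 4.
T, d = 6, 4
rng = np.random.default_rng(2)
enc = rng.normal(size=(T, d))
context, weights = attention(rng.normal(size=d), enc, enc)
```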
Chapter 5: Applications
5.1 Machine Translation
5.2 Document Summarization
5.3 Dialogue
5.4 Question Answering
Chapter 6: Techniques to Improve Generalization Performance
6.1 Decomposition of generalization error
6.2 Methods Effective in Reducing Estimation Error
6.3 Methods Effective in Reducing Optimization Error
6.4 Hyperparameter selection
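For 6.4: a minimal sketch of random hyperparameter search. The search space and the evaluation function below are placeholders I made up for illustration; they are not from the book.
```python
import random

def random_search(evaluate, n_trials=20, seed=0):
    """Sample hyperparameter settings at random and keep the best-scoring one."""
    rng = random.Random(seed)
    best_score, best_cfg = float("-inf"), None
    for _ in range(n_trials):
        cfg = {
            "learning_rate": 10 ** rng.uniform(-4, -1),  # log-uniform over 1e-4 .. 1e-1
            "hidden_size": rng.choice([128, 256, 512]),
            "dropout": rng.uniform(0.0, 0.5),
        }
        score = evaluate(cfg)  # e.g. validation accuracy for this setting
        if score > best_score:
            best_score, best_cfg = score, cfg
    return best_cfg, best_score

# Dummy evaluation function just to make the sketch runnable.
best_cfg, best_score = random_search(lambda cfg: -abs(cfg["learning_rate"] - 0.01))
```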
Chapter 7: Implementation
7.1 GPUs and GPGPUs
7.2 Minibatching in RNNs
7.3 Random sampling
7.4 Reducing Memory Usage
7.5 Implementation of Error Backpropagation
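For 7.2: minibatching RNNs over variable-length sentences usually means padding each batch to a common length and keeping a mask of the real tokens. A minimal sketch of that layout (a common convention, not necessarily the book's):
```python
import numpy as np

def pad_batch(seqs, pad_id=0):
    """Pad variable-length token-ID sequences into a (batch, max_len) array plus a mask."""
    max_len = max(len(s) for s in seqs)
    batch = np.full((len(seqs), max_len), pad_id, dtype=np.int64)
    mask = np.zeros((len(seqs), max_len), dtype=bool)
    for i, s in enumerate(seqs):
        batch[i, :len(s)] = s
        mask[i, :len(s)] = True  # True where there is a real token, False on padding
    return batch, mask

# Toy usage with three sentences of different lengths.
batch, mask = pad_batch([[4, 2, 9], [7, 1], [3, 3, 3, 3]])
```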
---
This page is auto-translated from /nishio/深層学習による自然言語処理 using DeepL. If you see something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I'm very happy to spread my thoughts to non-Japanese readers.