ReLU
Rectified Linear Unit
https://gyazo.com/848e2236bab02732e1979df7acdf684c
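As an illustration (not part of the original note), ReLU is simply f(x) = max(0, x); a minimal NumPy sketch:
code:relu.py
 import numpy as np
 def relu(x):
     # Rectified Linear Unit: identity for positive inputs, zero otherwise
     return np.maximum(0.0, x)
 # Negative values are clipped to 0; positive values pass through unchanged.
 print(relu(np.array([-2.0, -0.5, 0.0, 1.0, 3.0])))  # -> [0. 0. 0. 1. 3.]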
The following paper showed that ReLU performs better than tanh, the conventional S-shaped (sigmoidal) activation:
Glorot, X., Bordes, A. and Bengio, Y., 2011, June. Deep sparse rectifier neural networks. In Proceedings of the fourteenth international conference on artificial intelligence and statistics (pp. 315-323).
http://proceedings.mlr.press/v15/glorot11a/glorot11a.pdf
https://gyazo.com/049c141201bc18c684db3943db8a2c43
https://gyazo.com/febe2e044d1c5b503e7f66a8ffa68bb5
---
This page is auto-translated from /nishio/ReLU using DeepL. If you find something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I'm very happy to spread my thoughts to non-Japanese readers.