We will quickly discuss a practical example of word embeddings and then introduce capsule networks.
Capsule networks (CapsNets) introduce a new tool in deep learning that better models hierarchical relationships by nesting sets of neural layers. Sabour, Frosst, and Hinton (2017) show that "a discriminatively trained, multi-layer capsule system achieves state-of-the-art performance on MNIST and is considerably better than a convolutional net at recognizing highly overlapping digits". According to Anonymous (2017), benchmarks also show that capsule networks are more resistant to white-box adversarial attacks than convolutional neural networks (CNNs).
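A core idea in the Sabour, Frosst, and Hinton paper is that each capsule outputs a vector whose length encodes the probability that the entity it detects is present, and whose orientation encodes the entity's pose. The paper's "squash" nonlinearity shrinks short vectors toward zero and long vectors toward unit length. A minimal NumPy sketch (the `eps` term is an assumption added here for numerical stability, not part of the paper's formula):

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squash nonlinearity from Sabour et al. (2017):
    #   v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||)
    # Preserves the vector's orientation; maps its length into [0, 1).
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / np.sqrt(sq_norm + eps)

# Example: an input capsule vector of length 5 is squashed to
# length 25/26, i.e. roughly 0.96 -- "entity almost certainly present".
v = squash(np.array([3.0, 4.0]))
print(np.linalg.norm(v))
```

The same function applies elementwise along the capsule dimension of a batch, which is why `axis` and `keepdims=True` are used.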
Co-creator: Geoffrey Hinton
Background:
Hinton, G. E., Krizhevsky, A., & Wang, S. D. (2011, June). Transforming auto-encoders. In International Conference on Artificial Neural Networks (pp. 44-51). Springer Berlin Heidelberg. Available at: http://www.cs.toronto.edu/~fritz/absps/transauto6.pdf
Sabour, S., Frosst, N., & Hinton, G. E. (2017). Dynamic Routing Between Capsules. arXiv preprint arXiv:[masked]. Available at https://arxiv.org/pdf/1710.09829.pdf
Anonymous (2017). Matrix capsules with EM routing. In International Conference on Learning Representations 2018 (ICLR 2018). Available at: https://openreview.net/pdf?id=HJWLfGWRb