sticker is a sequence labeler that uses either recurrent neural networks or dilated convolutional networks. In principle, it can perform any sequence labeling task, but so far the focus has been on:
- Part-of-speech tagging
- Topological field tagging
- Dependency parsing
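Dependency parsing can be cast as sequence labeling by assigning each token a label that encodes its head, as in the Strzyz et al. (2019) paper cited below. The sketch here uses a simple relative-offset encoding for illustration; the function names and label format are hypothetical and do not reflect sticker's actual internal representation.

```python
# Hypothetical sketch: encoding a dependency tree as per-token labels
# (relative head offsets), in the spirit of Strzyz et al. (2019).
# The label format here is illustrative, not sticker's API.

def encode(heads):
    """Turn 1-based head indices (0 = root) into relative-offset labels."""
    labels = []
    for i, head in enumerate(heads, start=1):
        labels.append("ROOT" if head == 0 else f"{head - i:+d}")
    return labels

def decode(labels):
    """Invert encode(): recover 1-based head indices from labels."""
    heads = []
    for i, label in enumerate(labels, start=1):
        heads.append(0 if label == "ROOT" else i + int(label))
    return heads

# "The cat sleeps": 'sleeps' is the root, the other tokens attach to it.
heads = [2, 3, 0]        # head of token 1 is token 2, and so on
labels = encode(heads)   # ['+1', '+1', 'ROOT']
assert decode(labels) == heads
```

Once heads are encoded this way, an off-the-shelf sequence labeling model can predict one label per token, and the tree is recovered by decoding the predicted labels.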
sticker uses techniques from or was inspired by the following papers:
- Finding Function in Form: Compositional Character Models for Open Vocabulary Word Representation. Wang Ling, Chris Dyer, Alan W Black, Isabel Trancoso, Ramón Fermandez, Silvio Amir, Luís Marujo, Tiago Luís, 2015, Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing
- Transition-based dependency parsing with topological fields. Daniël de Kok, Erhard Hinrichs, 2016, Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics
- Viable Dependency Parsing as Sequence Labeling. Michalina Strzyz, David Vilares, Carlos Gómez-Rodríguez, 2019, Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
You can report bugs and feature requests in the sticker issue tracker.
sticker is licensed under the Blue Oak Model License version 1.0.0. The bundled Tensorflow protocol buffer definitions are licensed under the Apache License version 2.0.