A deep-learning-based natural language processing library
Updated Nov 3, 2018 - Python
A tiny yet algorithmically comprehensive Chinese word-segmentation engine | A micro tokenizer for Chinese
A benchmark of Chinese word-segmentation software | Chinese tokenizer benchmark
An NLP package for Chinese text: preprocessing, tokenization, Chinese fonts, word embeddings, text similarity, and sentiment analysis | A lightweight Chinese natural language processing package
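The tokenizers listed above all address Chinese word segmentation, where words are not separated by spaces. As a minimal illustration of the simplest classic approach (forward maximum matching), here is a toy sketch; the vocabulary and function names are hypothetical and do not come from any of the libraries above:

```python
# Toy forward-maximum-matching Chinese tokenizer (illustrative only).
# VOCAB is a hypothetical miniature dictionary, not a real lexicon.
VOCAB = {"中文", "分词", "自然", "语言", "处理", "自然语言"}
MAX_LEN = max(len(w) for w in VOCAB)

def tokenize(text: str) -> list:
    """Greedily match the longest dictionary word at each position;
    fall back to emitting a single character when nothing matches."""
    tokens, i = [], 0
    while i < len(text):
        for size in range(min(MAX_LEN, len(text) - i), 0, -1):
            piece = text[i:i + size]
            if size == 1 or piece in VOCAB:
                tokens.append(piece)
                i += size
                break
    return tokens

print(tokenize("自然语言处理"))  # ['自然语言', '处理']
```

Real engines such as those above typically combine dictionary matching with statistical or neural models to resolve ambiguous segmentations; this sketch only shows the greedy dictionary baseline.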