External tokenizer plugin for DokuWiki
whoopdedo/dokuwiki-plugin-extokenizer
====== External Tokenizer Plugin ======

In complex language scripts, such as Chinese, words cannot be distinguished by whitespace alone. This plugin uses an external program to separate the words in a text for the fulltext search index.

All documentation is available online at:

  * http://dokuwiki.org/plugin:extokenizer

(c) 2011 by Tom N Harris <tnharris@whoopdedo.org>

See COPYING for license info.
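The general approach the plugin describes, piping page text through an external segmentation program and splitting the result on whitespace, can be sketched as follows. This is an illustrative Python sketch, not the plugin's actual PHP code; the `external_tokenize` helper and the stand-in `cat` command are assumptions for demonstration, and a real setup would point at a word segmenter such as a Chinese tokenizer binary.

```python
import subprocess

def external_tokenize(text, cmd):
    """Pipe text through an external word-segmentation program.

    Assumed contract (mirroring what such a plugin relies on): the
    program reads UTF-8 text on stdin and writes the same text with
    word boundaries marked by whitespace on stdout.
    """
    result = subprocess.run(
        cmd,
        input=text.encode("utf-8"),
        capture_output=True,
        check=True,
    )
    # Split the segmenter's output on whitespace to get index tokens.
    return result.stdout.decode("utf-8").split()

# Example with `cat` as a stand-in segmenter: already-spaced text
# passes through unchanged and is split into tokens.
print(external_tokenize("hello world", ["cat"]))  # ['hello', 'world']
```

A real segmenter would insert spaces between words in unsegmented text (for example, between Chinese characters that form separate words) before the split step produces the tokens handed to the fulltext index.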