====== External Tokenizer Plugin ======

In some writing systems, such as Chinese, words are not separated by
whitespace, so they cannot be distinguished by whitespace alone. This plugin
uses an external program to separate the words in a text before it is added
to the fulltext search index.
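Conceptually, the indexer pipes the page text to an external segmenter and
gets back the same text with spaces inserted at word boundaries, which a
whitespace-based tokenizer can then index normally. The sketch below
illustrates that pipe-through pattern only; it is not the plugin's actual
code, and the MeCab command (whose ''-Owakati'' mode emits space-separated
words) and the stdin/stdout protocol are assumptions for illustration.

<code php>
<?php
/*
 * Minimal sketch (not the plugin's code): run an external segmenter
 * that reads raw text on stdin and writes the same text with spaces
 * between words on stdout. The default command is an assumption.
 */
function tokenize_external($text, $cmd = 'mecab -Owakati') {
    $spec = array(
        0 => array('pipe', 'r'),  // child reads the text from stdin
        1 => array('pipe', 'w'),  // child writes segmented text to stdout
    );
    $proc = proc_open($cmd, $spec, $pipes);
    if (!is_resource($proc)) return $text; // fall back to the raw text
    fwrite($pipes[0], $text);
    fclose($pipes[0]);
    $out = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    proc_close($proc);
    return ($out === false || $out === '') ? $text : $out;
}
</code>

Falling back to the unmodified text when the external program is missing or
produces no output keeps the index usable, if less precise, rather than
dropping the page from search entirely.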

All documentation is available online at:

  * http://dokuwiki.org/plugin:extokenizer

(c) 2011 by Tom N Harris <tnharris@whoopdedo.org>
See COPYING for license info.
