
hlm-texts

Hlm at your fingertips

This repo currently contains:

  • 340-脂砚斋重批红楼梦.txt: hlm_zh
  • david-hawks-the-story-of-the-stone.txt: hlm_en
  • yang-hlm.txt: hlm_en1
  • joly-hlm.txt: hlm_en2

It may be expanded to include other versions. If you wish to have a particular version included, make a pull request (fork, upload the files and click the pull request button) or send me a text file.

Special Dependencies

hlm_texts depends on polyglot, which in turn depends on libicu.

To install libicu:

For Linux/OSX, e.g.:

  • Ubuntu: sudo apt install libicu-dev
  • CentOS: yum install libicu
  • OSX: brew install icu4c

Then use poetry or pip to install PyICU, pycld2 and Morfessor, e.g.

pip install PyICU pycld2 Morfessor

or

poetry add PyICU pycld2 Morfessor

For Windows

Download and install the pyicu and pycld2 wheel (.whl) packages for your OS and Python version from https://www.lfd.uci.edu/~gohlke/pythonlibs/#pyicu and https://www.lfd.uci.edu/~gohlke/pythonlibs/#pycld2 (and possibly also Morfessor from https://www.lfd.uci.edu/~gohlke/pythonlibs/).

Installation

pip install hlm-texts

# pip install hlm-texts -U  # to upgrade to the newest version

or install the newest version directly from GitHub

pip install git+https://github.com/ffreemt/hlm-texts

or git clone the repo and install from the source

git clone https://github.com/ffreemt/hlm-texts
cd hlm-texts
pip install -r requirements.txt

Usage

Texts from various editions

from hlm_texts import hlm_en, hlm_zh, hlm_en1, hlm_en2
  • hlm_zh: 340-脂砚斋重批红楼梦.txt
  • hlm_en: david-hawks-the-story-of-the-stone.txt
  • hlm_en1: yang-hlm.txt
  • hlm_en2: joly-hlm.txt

with blank lines removed and paragraphs retained.
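The cleanup described above (blank lines removed, paragraphs retained) can be sketched roughly like this; `normalize_text` is a hypothetical helper for illustration, not the package's actual code:

```python
def normalize_text(raw: str) -> list[str]:
    """Split raw text into paragraphs, dropping blank lines.

    Illustrative sketch only -- not hlm_texts' real implementation.
    """
    # Keep only non-empty lines; each surviving line is one paragraph.
    return [line.strip() for line in raw.splitlines() if line.strip()]

sample = "Chapter 1\n\n  It was the best of times.\n\n"
normalize_text(sample)
# → ['Chapter 1', 'It was the best of times.']
```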

Sentence tokenizing: sent_tokenizer

for tokenizing a text or a list of texts into sentences

from hlm_texts import sent_tokenizer, hlm_en

hlm_en_sents = sent_tokenizer(hlm_en, lang="en")

Tokenizing long English texts for the first time can take a while (3-5 minutes each for hlm_en, hlm_en1 and hlm_en2). Subsequent runs, however, are almost instant since sent_tokenizer is cached in `~/joblib_cache` (`\Users\xyz\joblib_cache` for Windows 10).
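Conceptually, sentence tokenizing splits a text at sentence-final punctuation. A naive stdlib-only sketch of the idea (the real sent_tokenizer uses proper NLP tooling and handles abbreviations, quotes, and non-English scripts, which this toy version does not):

```python
import re

def naive_sent_tokenize(text: str) -> list[str]:
    """Very rough sentence splitter: break after ., ! or ? followed by whitespace.

    Illustration only -- ignores abbreviations, quotes, etc.
    """
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]

naive_sent_tokenize("He came in. She looked up! Who was it?")
# → ['He came in.', 'She looked up!', 'Who was it?']
```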

Final Note

The repo is for study purposes only. If you believe that your rights have been infringed in any way, please let me know and I'll promptly take appropriate action.
