
Commit

Fix tokenizer issue
severinsimmler committed Feb 4, 2017
1 parent b465857 commit b65a2ed
Showing 1 changed file with 0 additions and 2 deletions.
dariah_topics/preprocessing.py — 2 changes: 0 additions & 2 deletions
@@ -169,8 +169,6 @@ def tokenize(doc_txt, expression=regular_expression, lower=True, simple=False):
     Example:
         >>> list(tokenize("This is one example text."))
         ['this', 'is', 'one', 'example', 'text']
-    Todo:
-        * More elegant way to exclude the dashes
     """
     if lower:
         doc_txt = doc_txt.lower()
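For orientation, the hunk above only removes a Todo note from the docstring of tokenize. Judging from the signature and the doctest, the function is a generator that lowercases the input (when lower=True) and yields regular-expression matches. The sketch below is an assumption-laden illustration, not the module's actual implementation: the pattern bound to regular_expression and the role of the simple flag are guesses.

import re

# Assumed pattern for illustration only; dariah_topics defines its own regular_expression.
regular_expression = r"\w+"

def tokenize(doc_txt, expression=regular_expression, lower=True, simple=False):
    """Yield tokens from doc_txt that match expression (cf. the doctest above)."""
    if lower:
        doc_txt = doc_txt.lower()
    # simple is accepted for signature compatibility but not interpreted in this sketch.
    for match in re.finditer(expression, doc_txt):
        yield match.group()

# Usage, matching the docstring example:
# >>> list(tokenize("This is one example text."))
# ['this', 'is', 'one', 'example', 'text']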

