Stanford Tokenizer works, filtered tokens < 3 chars #2

Merged

Merged 1 commit into thedatachef:master on Jan 30, 2013

@rjurney (Contributor) commented Jan 29, 2013

It works, but it was emitting some junk punctuation tokens, so tokens shorter than 3 characters are now filtered out.

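For context, a minimal sketch of the kind of filtering the PR title describes, assuming Stanford CoreNLP's `PTBTokenizer` API; the actual UDF changed in this commit is not shown here, and the `tokenize` helper below is illustrative only:

```java
import edu.stanford.nlp.ling.Word;
import edu.stanford.nlp.process.PTBTokenizer;

import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class TokenizeExample {

    // Tokenize text with the Stanford PTBTokenizer and drop tokens
    // shorter than 3 characters (stray punctuation, symbols, etc.).
    // Hypothetical helper for illustration; not the PR's actual code.
    public static List<String> tokenize(String text) {
        PTBTokenizer<Word> tokenizer =
            PTBTokenizer.newPTBTokenizer(new StringReader(text));
        List<String> tokens = new ArrayList<>();
        while (tokenizer.hasNext()) {
            String token = tokenizer.next().word();
            if (token.length() >= 3) {   // filter tokens < 3 chars
                tokens.add(token);
            }
        }
        return tokens;
    }

    public static void main(String[] args) {
        // Punctuation tokens like "," and "..." are dropped by the length filter.
        System.out.println(tokenize("It works, but it was adding punctuation tokens ..."));
    }
}
```

A simple length threshold like this also drops short real words ("is", "of"), which is usually acceptable when the tokens feed a bag-of-words model.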
@thedatachef merged commit ab83f22 into thedatachef:master on Jan 30, 2013