
[Optimize](chinese) Optimize chinese tokenizer index process #115

Merged

qidaye merged 1 commit into apache:clucene from zzzxl1993:chinese_tokenizer on Sep 1, 2023

Conversation

@zzzxl1993 (Collaborator)

  1. Chinese tokenizer uses sDocument

@airborne12 (Member) left a comment


LGTM

@qidaye (Contributor) left a comment


LGTM

@qidaye merged commit 5754b41 into apache:clucene on Sep 1, 2023


3 participants