
Two problems with IKAnalyzer #47

Closed
GoogleCodeExporter opened this issue Oct 21, 2015 · 1 comment

@GoogleCodeExporter

1. Dictionary.addWords fails when the dictionary being loaded is too large (close to 300,000 entries in my case); with around 200,000 entries the problem does not occur:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at org.wltea.analyzer.dic.DictSegment.lookforSegment(DictSegment.java:228)
    at org.wltea.analyzer.dic.DictSegment.fillSegment(DictSegment.java:199)
    at org.wltea.analyzer.dic.DictSegment.fillSegment(DictSegment.java:204)
    at org.wltea.analyzer.dic.DictSegment.fillSegment(DictSegment.java:204)
    at org.wltea.analyzer.dic.DictSegment.fillSegment(DictSegment.java:170)
    at org.wltea.analyzer.dic.Dictionary.addWords(Dictionary.java:119)

2. When the dictionary grows from roughly 200,000 to 300,000 entries, segmentation speed drops sharply. What causes this?

Environment: Win7 + JDK 1.6
Version: IKAnalyzer2012_u3.zip
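Before bulk-loading a dictionary of this size, it is worth confirming how much heap the JVM actually has available; a 32-bit JDK 1.6 often defaults to only 64-256 MB, which a 300,000-entry character trie can easily exhaust. A minimal stdlib-only sketch (the class name `HeapCheck` is chosen for illustration; it does not use IKAnalyzer itself):

```java
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // Maximum heap the JVM may grow to; controlled by the -Xmx flag.
        long maxMB = rt.maxMemory() / (1024 * 1024);
        // Heap currently allocated from the OS, and the portion already in use.
        long totalMB = rt.totalMemory() / (1024 * 1024);
        long usedMB = totalMB - rt.freeMemory() / (1024 * 1024);
        System.out.println("max heap:  " + maxMB + " MB");
        System.out.println("allocated: " + totalMB + " MB, in use: " + usedMB + " MB");
    }
}
```

If the reported max heap is well under the dictionary's expected footprint, an OutOfMemoryError during addWords is the predictable outcome rather than a bug in the trie itself.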


Original issue reported on code.google.com by jaysoona...@gmail.com on 27 Mar 2012 at 11:05

@GoogleCodeExporter (Author)

Please set your JVM memory parameters and then test again.
The performance drop you describe is caused by the JVM running short of memory and performing frequent GC.
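Concretely, setting the JVM memory parameters means passing `-Xms`/`-Xmx` when launching the application. The main class name and heap sizes below are hypothetical placeholders for illustration; only the jar name comes from the issue:

```shell
# Raise the initial and maximum heap before loading a ~300k-entry dictionary.
# MyIndexer is a placeholder for the application's main class.
java -Xms256m -Xmx1024m -cp IKAnalyzer2012_u3.jar:. MyIndexer

# Add -verbose:gc to confirm whether frequent garbage collection is the cause
# of the slowdown: many back-to-back Full GC lines indicate heap pressure.
java -verbose:gc -Xms256m -Xmx1024m -cp IKAnalyzer2012_u3.jar:. MyIndexer
```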

Original comment by linliang...@gmail.com on 28 Mar 2012 at 1:25

  • Changed state: Invalid
