I've created 18 topics for my model and used TopTokensCoherenceScore. When I call `artm_model.score_tracker['TopTokensCoherenceScore'].average_coherence`, it gives the correct result for each collection pass, but when I call `artm_model.score_tracker['TopTokensCoherenceScore'].coherence` (or `coherence[-1]` for the last collection pass), it gives me output in the following format:

I suppose the correct format should look like this:

P.S. bigartm version: 0.10.1
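A minimal reproduction sketch of the setup described above. The data path, the dictionary used for coherence, and the number of passes are assumptions for illustration; only the score name, the 18 topics, and the tracker attributes come from the report itself.

```python
import artm

# Hypothetical input: a directory of pre-built BigARTM batches.
batch_vectorizer = artm.BatchVectorizer(data_path='my_batches',
                                        data_format='batches')

dictionary = artm.Dictionary()
dictionary.gather(data_path='my_batches')

model = artm.ARTM(num_topics=18, dictionary=dictionary)

# TopTokensScore computes per-topic coherence when it is given a dictionary;
# passing the plain token dictionary here is a placeholder for a proper
# co-occurrence dictionary.
model.scores.add(artm.TopTokensScore(name='TopTokensCoherenceScore',
                                     num_tokens=10,
                                     dictionary=dictionary))

model.fit_offline(batch_vectorizer=batch_vectorizer, num_collection_passes=5)

tracker = model.score_tracker['TopTokensCoherenceScore']
print(tracker.average_coherence)  # one value per pass -- reported as correct
print(tracker.coherence[-1])      # per-topic values for the last pass -- the malformed output
```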
Sorry for the late response.
It seems there was an incorrect use of the topic_name field when labeling coherence scores.
A bugfix for this is now merged into master (see #1019).
Could you please check whether a library built from the current master branch fixes the format?
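A quick sanity check after rebuilding from master could look like the sketch below. It assumes the tracker returns one mapping per collection pass keyed by topic name, and that the model uses BigARTM's default topic names ('topic_0' … 'topic_17'); adjust if you set `topic_names` explicitly.

```python
# `model` is an artm.ARTM instance fitted as in the reproduction sketch above.
last_pass = model.score_tracker['TopTokensCoherenceScore'].coherence[-1]
for topic_name, value in last_pass.items():
    print(topic_name, value)  # expect exactly 18 entries, one per topic
```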