
fix: use _max_concurrent_semantic in Semantic queue worker #905

Merged

zhoujh01 merged 1 commit into volcengine:main from vincent067:fix-issue-873-semantic-concurrency on Mar 24, 2026
Conversation

@vincent067
Contributor

Summary

Fixes #873

The _max_concurrent_semantic value was stored in QueueManager.__init__ and passed to SemanticProcessor, but it was never used when starting the Semantic queue worker thread.

Problem

In _start_queue_worker:

max_concurrent = self._max_concurrent_embedding if queue.name == self.EMBEDDING else 1  # Hardcoded to 1!

This caused the Semantic queue to always have max_concurrent = 1, ignoring the configured vlm.max_concurrent value for queue-level concurrency.

Solution

max_concurrent = self._max_concurrent_embedding if queue.name == self.EMBEDDING else self._max_concurrent_semantic
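For context, the selection logic can be sketched as below. This is a minimal illustration, not the actual OpenViking implementation: only the attribute names `_max_concurrent_embedding` / `_max_concurrent_semantic`, the class name QueueManager, and the EMBEDDING comparison come from the PR; the constructor signature, the SEMANTIC constant, and the helper method are hypothetical stand-ins.

```python
class QueueManager:
    """Sketch of the queue-level concurrency selection (assumed shape)."""

    EMBEDDING = "embedding"
    SEMANTIC = "semantic"  # hypothetical queue name, for illustration only

    def __init__(self, max_concurrent_embedding: int, max_concurrent_semantic: int):
        self._max_concurrent_embedding = max_concurrent_embedding
        self._max_concurrent_semantic = max_concurrent_semantic

    def _worker_concurrency(self, queue_name: str) -> int:
        # Before the fix, the else-branch was a hardcoded 1, so the
        # Semantic queue always ran with max_concurrent = 1 regardless
        # of the configured limit. After the fix it uses the stored value.
        return (self._max_concurrent_embedding
                if queue_name == self.EMBEDDING
                else self._max_concurrent_semantic)
```

With this shape, a manager built with a Semantic limit of 8 starts the Semantic worker with concurrency 8 instead of 1.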

Impact

Users can now configure Semantic queue concurrency through ov.conf, enabling faster processing when there are many pending Semantic tasks.
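As an illustration of the configuration path, the queue-level limit might look like the fragment below. Only the key name vlm.max_concurrent and the file name ov.conf come from this PR; the section layout and the value 4 are assumptions.

```
# ov.conf — illustrative sketch; exact section layout is an assumption
[vlm]
max_concurrent = 4   # now also applied to the Semantic queue worker
```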

Thanks for the great bug report! 👍

Commit message:

Fixes volcengine#873

The _max_concurrent_semantic variable was stored in QueueManager.__init__
but not used when starting the Semantic queue worker thread. This caused
the Semantic queue to always have max_concurrent=1, ignoring the configured
vlm.max_concurrent value.

Changes:
- Use self._max_concurrent_semantic instead of hardcoded 1 for Semantic queue
@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.


Li Wenjun does not appear to be a GitHub user. You need a GitHub account to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
Have you already signed the CLA but the status is still pending? Let us recheck it.

@github-actions

Failed to generate code suggestions for PR

@zhoujh01 zhoujh01 merged commit a34744a into volcengine:main Mar 24, 2026
1 of 2 checks passed
@github-project-automation github-project-automation bot moved this from Backlog to Done in OpenViking project Mar 24, 2026


Successfully merging this pull request may close these issues.

[Bug]: _max_concurrent_semantic variable stored but not used in Semantic queue worker
