
Commit 16ec256
batch size optional limit check
nekufa committed Dec 6, 2023
1 parent 2af4e9d · commit 16ec256
Showing 1 changed file with 2 additions and 1 deletion.
sharded_queue/__init__.py (2 additions, 1 deletion)
@@ -234,7 +234,8 @@ async def process(self, tube: Tube, limit: Optional[int] = None) -> int:
         batch_size = instance.batch_size()
         if batch_size is None:
             batch_size = self.settings.batch_size
-        batch_size = min(limit, batch_size)
+        if limit is not None:
+            batch_size = min(limit, batch_size)
         processed = False
         for pipe in pipes:
             msgs = await storage.range(pipe, batch_size)
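For context: `limit` is `Optional[int]`, so the old unconditional `min(limit, batch_size)` would raise `TypeError` whenever `limit` was `None`, since `None` cannot be compared with an `int` in Python 3. The patch clamps the batch size only when a limit is actually given. Below is a minimal standalone sketch of the corrected logic; `resolve_batch_size` is a hypothetical helper for illustration, not part of the sharded_queue API.

from typing import Optional


def resolve_batch_size(
    instance_batch_size: Optional[int],
    default_batch_size: int,
    limit: Optional[int] = None,
) -> int:
    # Fall back to the configured default when the handler instance
    # does not define its own batch size.
    batch_size = instance_batch_size if instance_batch_size is not None else default_batch_size
    # Clamp only when a limit is given: min(None, batch_size) raises
    # TypeError in Python 3, which is the bug this commit guards against.
    if limit is not None:
        batch_size = min(limit, batch_size)
    return batch_size


assert resolve_batch_size(None, 128) == 128            # no limit: default applies
assert resolve_batch_size(None, 128, limit=10) == 10   # limit clamps the batch
assert resolve_batch_size(50, 128) == 50               # instance value wins, no clamp

The same shape (check for None, then clamp) is what the two added lines in the diff implement inside process().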
