
time_entry.filter() extremely slow with pagination – ~50s per 100 entries (Need efficient way to fetch ~53k records) #352


Description

@sumeet1495

Hi,
I'm using the Python Redmine library to fetch time entries like this:

entries = client.time_entry.filter(offset=0, limit=100)

But each request (100 entries) takes ~50–60 seconds, so fetching all ~53,000 records takes hours.
There’s no network issue on my side, so the delay seems to come from either the Redmine API itself or from how RedmineLib handles bulk pagination.
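
For context, this is roughly the loop I'm running to page through everything (just a sketch: the URL and API key below are placeholders, and `client` is a plain `Redmine` instance from `redminelib`):

```python
from redminelib import Redmine

# Placeholder URL and API key, not my real ones.
client = Redmine('https://redmine.example.com', key='my-api-key')

all_entries = []
offset = 0
while True:
    # Each of these calls takes ~50-60 seconds for 100 entries.
    batch = list(client.time_entry.filter(offset=offset, limit=100))
    if not batch:
        break
    all_entries.extend(batch)
    offset += len(batch)

print(f"fetched {len(all_entries)} time entries")
```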

Is there any efficient way to:

Speed up time_entry fetching?

Fetch all 53k entries with less overhead?

Or cache responses locally and reprocess later? (rough sketch of what I mean below)
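
For the caching idea, this is the kind of thing I had in mind (hypothetical sketch: it dumps each page of results to a local JSON file so the API only has to be hit once and I can reprocess the files offline; the file paths and selected fields are just examples):

```python
import json
from pathlib import Path

from redminelib import Redmine

client = Redmine('https://redmine.example.com', key='my-api-key')  # placeholders
cache_dir = Path('time_entry_cache')
cache_dir.mkdir(exist_ok=True)

offset = 0
while True:
    page_file = cache_dir / f'entries_{offset}.json'
    if page_file.exists():
        # Page already cached on a previous run, so skip the slow API call.
        offset += 100
        continue
    batch = list(client.time_entry.filter(offset=offset, limit=100))
    if not batch:
        break
    # Keep only the fields I need so the pages can be reprocessed later.
    page_file.write_text(json.dumps([
        {'id': e.id, 'hours': e.hours, 'spent_on': str(e.spent_on)}
        for e in batch
    ]))
    offset += 100
```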

Any advice or alternate approach would be appreciated.

Thanks!
