Problem: MCP Server must process all transfer packages sent to it at once #911
Comments
Would this same throttling logic apply to transfers started via the Archivematica API but not using Automation Tools code? I'm guessing yes, since my understanding is that Automation Tools uses the API, but I wanted to check before I make an ass out of u and me. We use the Archivematica (pipeline) API to start and approve transfers, but have written our own code to do so because it gives us more visibility and control over the process. Having this logic baked into Archivematica would be really helpful in terms of removing some (not entirely effective) logical complexity from that code.
@helrond that's the idea here — anything started via the API, either by Automation Tools or otherwise.
This has been tested in a few different environments, particularly at NHA. MCPServer resource usage is not unbounded anymore. Users can adjust …
Please describe the problem you'd like to be solved.
Automation Tools runs a cron job that polls MCP Server for the status of the transfer packages it has submitted. On each poll, MCP Server submits a job to Gearman to check on the package's status, even when it is already backed up working on packages sent earlier. This is a significant performance bottleneck.
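To make the bottleneck concrete, here is a minimal sketch of the polling pattern described above. It is illustrative only: `submit_status_job` is a hypothetical stand-in for the Gearman client call, not the real Automation Tools code. The point is that every tick submits one status job per in-flight transfer, with no awareness of how loaded the server already is.

```python
import time


def poll_transfer_statuses(transfer_ids, submit_status_job, interval=60):
    """Poll until every transfer reports COMPLETE.

    Each tick submits one status-check job per in-flight transfer
    (via submit_status_job, standing in for the Gearman call),
    regardless of how backed up the server is.
    """
    transfer_ids = set(transfer_ids)
    while transfer_ids:
        done = set()
        for tid in transfer_ids:
            status = submit_status_job(tid)  # one Gearman job per poll
            if status == "COMPLETE":
                done.add(tid)
        transfer_ids -= done
        if transfer_ids:
            time.sleep(interval)
```

With N transfers in flight and a one-minute cron interval, this generates N extra Gearman jobs per minute on top of the actual processing work, which is exactly the load this issue proposes to remove.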
Describe the solution you'd like to see implemented.
Move responsibility for throttling (controlling the rate of transfers) to MCP Server. Its API would return a UUID for the transfer to Automation Tools, and MCPServer would then decide when to actually start processing the transfer based on available capacity. When MCPServer is too busy, it could return a 'busy'/retry response to Automation Tools.
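The admission-control idea above could be sketched roughly as follows. This is a hypothetical illustration, not the actual MCPServer implementation: the class and method names (`TransferThrottle`, `submit`, `start_next`) and the capacity limits are invented for the example.

```python
import queue
import uuid


class TransferThrottle:
    """Sketch of server-side admission control: accept a transfer and
    return its UUID immediately, but only start processing when a
    worker slot is free; report 'busy' when the backlog is full."""

    def __init__(self, max_active=2, max_queued=4):
        self.max_active = max_active          # concurrent transfers allowed
        self.pending = queue.Queue(maxsize=max_queued)
        self.active = set()

    def submit(self, package_path):
        """Return (uuid, status); status is 'queued' or 'busy'."""
        transfer_id = str(uuid.uuid4())
        try:
            self.pending.put_nowait((transfer_id, package_path))
        except queue.Full:
            # Backlog is full: tell the client to retry later
            # instead of accepting unbounded work.
            return None, "busy"
        return transfer_id, "queued"

    def start_next(self):
        """Promote one queued transfer when capacity frees up."""
        if len(self.active) >= self.max_active or self.pending.empty():
            return None
        transfer_id, _path = self.pending.get_nowait()
        self.active.add(transfer_id)
        return transfer_id

    def finish(self, transfer_id):
        self.active.discard(transfer_id)
```

Under this scheme the client (Automation Tools or any other API consumer) only needs to handle the 'busy' response with a retry/backoff, and MCPServer's resource usage stays bounded by the configured limits.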
Describe alternatives you've considered.
Additional context
related to PR artefactual/archivematica#1472
For Artefactual use:
Please make sure these steps are taken before moving this issue from Review to Done: