
Chowda sometimes does not finish processing media files, but reports no error #206

Open
owencking opened this issue Mar 4, 2024 · 1 comment
Labels
bug 🐛 Something isn't working · wontfix ✖️ This will not be worked on

Comments

@owencking

Description

When I run a CLAMS app on a media file, Chowda should report success only if the media file was completely processed. Right now, I can download the MMIF output even when the CLAMS app only got through part of the file.

Reproduction steps

From my Slack message about this:

A few days ago I ran ten identical single-guid batches in Chowda, because the Brandeis team and I were trying to measure how much indeterminacy there was in our image classification model. Those were batches 39-47.

Well, I was just going through the results. The output MMIF from those batches differs a ton, and not due to variability in the classifier. The MMIF varies just in terms of how much of the video was completed. The video is about 31 minutes long. But some of the runs didn't make it past 5 minutes, and only one batch out of the ten made it all the way through the video.

Expected behavior

Ideally, it should get all the way through every file. In cases where it doesn't, it should not fail silently; it should alert the Clammer that the file was not fully processed. A rough sketch of the kind of check I mean is below.
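For illustration, here is a minimal sketch of one way to detect an incomplete run: compare how far into the media the output annotations reach against the known duration of the asset. This is not Chowda code; the unit handling, the tolerance, and the expected-duration value are assumptions, and it relies only on the generic MMIF JSON structure (views → annotations → properties).

```python
import json


def last_annotated_ms(mmif_json: str) -> float:
    """Return the largest `end` value found across all annotations in the MMIF."""
    data = json.loads(mmif_json)
    latest = 0.0
    for view in data.get("views", []):
        for ann in view.get("annotations", []):
            end = ann.get("properties", {}).get("end")
            if isinstance(end, (int, float)):
                latest = max(latest, float(end))
    return latest


def looks_complete(mmif_json: str, expected_duration_ms: float, slack_ms: float = 30_000) -> bool:
    """Flag the result as incomplete if annotations stop well short of the media's end."""
    return last_annotated_ms(mmif_json) >= expected_duration_ms - slack_ms
```

Even a coarse check like this would have flagged the batches that stopped around 5 minutes into a 31-minute video.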

Screenshots

No response

Browsers

No response

OS

No response

Additional context

By my lights, the most serious thing about this issue is that it's hard to detect. I only noticed it because I happened to be comparing several runs that should have had very similar output; otherwise, it might have been a while before anyone noticed. It doesn't produce any overt failures or error messages.

@mrharpo added the bug 🐛 Something isn't working and wontfix ✖️ This will not be worked on labels on Mar 29, 2024
@mrharpo (Contributor) commented Mar 29, 2024

As far as I can tell, there's no indication that any CLAMS apps are erroring. Each app responds with a 200 status code, which means exceptions are being caught somewhere instead of raised. My guess is that this happens in the Python SDK or in the app code.
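To illustrate the pattern I mean (this is not the actual clams-python SDK or app code, just a sketch of the failure mode, with hypothetical `app.annotate` / `app.partial_result` stand-ins):

```python
def annotate_route(app, mmif_json: str):
    """Failure mode: the exception is swallowed and partial output still gets a 200."""
    try:
        return 200, app.annotate(mmif_json)  # may die partway through the video
    except Exception:
        # Returning 200 with whatever was built so far hides the failure from Chowda entirely.
        return 200, app.partial_result()


def annotate_route_strict(app, mmif_json: str):
    """What this issue asks for: surface the error so the batch item can be marked failed."""
    try:
        return 200, app.annotate(mmif_json)
    except Exception as exc:
        return 500, {"error": str(exc)}
```

Until we find where the exception is actually being caught, Chowda only sees the 200 and treats the partial MMIF as a successful result.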

Projects
Status: 🏗 In progress
Development

No branches or pull requests

2 participants