`zeroshot_templates` split error for FairFace / UTKFace (#69)
Hi, I got an error running `download_evalsets.py` and then `evaluate.py`. Digging into it, the erring part is `datacomp/eval_utils/fairness_eval.py`, lines 264 to 266 in `fa9d766`. I then printed out `zeroshot_templates`, and in contrast to `classnames`, it seems that none of the `zeroshot_templates` contains `":"`, so `t.split(":", 1)` always fails to split. I am not sure what the intention here is; is the code somehow reading the wrong templates? Commenting the erring part out and using the `zeroshot_templates` directly allows the code to run without errors.
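For context, here is a minimal sketch of the failure mode, assuming the eval code unpacks each template into an `attribute:template` pair (the `parse_templates` helper below is illustrative; it is not the actual code at lines 264 to 266 of `fairness_eval.py`):

```python
# Illustrative sketch, not the actual fairness_eval.py code: templates are
# expected in "attribute:template" form, e.g. 'age:a photo of a person {c} years old'.
def parse_templates(zeroshot_templates):
    parsed = {}
    for t in zeroshot_templates:
        attribute, template = t.split(":", 1)  # raises ValueError when ":" is absent
        parsed.setdefault(attribute, []).append(template)
    return parsed

print(parse_templates(['age:a photo of a person {c} years old']))
# {'age': ['a photo of a person {c} years old']}

try:
    parse_templates(['a photo of a person {c} years old'])  # no "attribute:" prefix
except ValueError as e:
    print(e)  # not enough values to unpack (expected 2, got 1)
```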
Hi @EIFY, thanks for digging deeper into the issue! Even if the code runs without errors when the line is commented out, the zeroshot templates that you see are actually incorrect. It seems like this was broken in a recent update to `clip_benchmark`. You may need to rerun all your evals after this fix, because it is likely that all of the test sets were evaluated with the incorrect zeroshot templates.
Looking over the issue again, I don't believe it would affect all of your evals, but it is probably still safer to rerun the evals from scratch (delete the …).
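A from-scratch rerun along those lines might look like the sketch below; the `eval_results/` path is a hypothetical placeholder, so substitute wherever your setup stores cached eval outputs:

```python
import pathlib
import shutil

# Hypothetical cache location; adjust to wherever your eval outputs are stored.
results_dir = pathlib.Path("eval_results")
if results_dir.exists():
    shutil.rmtree(results_dir)  # remove cached results so every eval reruns from scratch
```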
@djghosh13 Thanks for the replies! After rolling back, I now get `zeroshot_templates = ['age:a photo of a person {c} years old', 'gender:a photo of a {c}', 'race:a photo of a {c} person', 'toxic:a photo of a {c}']`.
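As a quick sanity check (a sketch, not code from the repo), each of those restored entries should split cleanly on its first colon and format with a classname:

```python
templates = [
    'age:a photo of a person {c} years old',
    'gender:a photo of a {c}',
    'race:a photo of a {c} person',
    'toxic:a photo of a {c}',
]
for t in templates:
    attribute, template = t.split(":", 1)  # the same split that was failing before
    print(f"{attribute}: {template.format(c='<classname>')}")
```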
Yes, those are correct. Glad the issue was resolved!
I think we should keep this open until CLIP_benchmark is fixed.
@rom1504 do you mean as a reminder to other potential users of datacomp? I've already opened LAION-AI/CLIP_benchmark#109 in CLIP_benchmark as an issue to remind us to fix this problem.
Users and maintainers. This is a bug that affects CLIP_benchmark, which in turn affects datacomp. Both have this bug, so it would make sense to keep this issue open until it's fixed.
Personally, I consider running the package with older or newer dependencies than specified a sort of off-label use, but I don't mind keeping this open till it's fixed.
Should be fixed in the current version of `clip_benchmark`.