It is working great, but classification of sexy images is poor. How can I improve it? #18

Closed
FurkanGozukara opened this issue Aug 13, 2023 · 1 comment

FurkanGozukara commented Aug 13, 2023

I would like it to classify sexy / explicit images better.

Are there any newer models?

The model version currently in use is: 1.7.1+4cf622eb13c07947a38e6e3336221657007e635d

Here are some incorrectly classified images. These are supposed to be classified as sexy, but they are not, and there are many more like them.

For example, this site is much better at classification with its best model: https://nsfwjs.com/

https://github.com/NsfwSpy/NsfwSpy.NET/assets/19240467/b410003d-1cbb-482f-ba00-744740aed93d

https://github.com/NsfwSpy/NsfwSpy.NET/assets/19240467/21653348-5ff8-4260-8f04-488ec19bc196

d00ML0rDz (Collaborator) commented

Hey,

I'm not quite sure which version of the model that is. If you're using version 3.5.0 from NuGet, you will have the latest model.

I've just taken a look at both of those images, and from what I can see they're being classified correctly.

We define "sexy" images as "images of people in their underwear and men who are topless". Those pictures are neither, so classifying them as "Neutral" is what I would expect.
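
If it helps with debugging, here is a minimal sketch for printing the raw per-category scores rather than just the predicted label, so you can see how close an image comes to "Sexy". It assumes the ClassifyImage method and the Neutral / Sexy / Pornography / Hentai score properties shown in the README; double-check the namespace and property names against the package version you have installed.

```csharp
// Sketch only: prints the per-category scores for one image so you can see
// how the model weighted "Sexy" vs. "Neutral" for it.
// Assumes the NsfwSpyNS namespace and the ClassifyImage API from the README;
// verify against the version of the NsfwSpy NuGet package you have installed.
using System;
using NsfwSpyNS;

class Program
{
    static void Main()
    {
        var nsfwSpy = new NsfwSpy();

        // Hypothetical path: point this at one of the images in question.
        var result = nsfwSpy.ClassifyImage(@"C:\path\to\image.jpg");

        Console.WriteLine($"Predicted label: {result.PredictedLabel}");
        Console.WriteLine($"Neutral:     {result.Neutral:F4}");
        Console.WriteLine($"Sexy:        {result.Sexy:F4}");
        Console.WriteLine($"Pornography: {result.Pornography:F4}");
        Console.WriteLine($"Hentai:      {result.Hentai:F4}");
    }
}
```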

Let me know if you have any more questions or if I can help in some other way 🙂
