
I tried a dozen or so images after seeing this project profiled on an AI tool aggregator site. It got 7 wrong, and five were false negatives. NSFW warning #60

Open
bebop210 opened this issue Apr 7, 2024 · 4 comments

Comments

@bebop210

bebop210 commented Apr 7, 2024

First of all, I want to reiterate that I think this is a noble project and a potentially good way to help law enforcement fight CSAM or revenge porn, or to help web moderators, or even ordinary social media users, keep material they find objectionable off their feeds. I also understand that, especially in these early days, you'd rather see false positives than false negatives. This issue, however, is not good news.
[Attached screenshots, taken 2024-04-06: 220923, 221043, 221219, 221503, 221656, 221913]

I'm using MS Edge Canary on Win11 to access the site, on an AMD Ryzen 7 7840HS, in case you leverage client-side hardware for anything. Most of the pics here were generated by a self-hosted Stable Diffusion instance, but not all; that might be a good place to start looking for the model's blind spot. I did not censor these pics, to convey how glaring the false negatives are.
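For reference, the error breakdown reported above works out as follows. This is a minimal sketch, assuming "a dozen images or so" means exactly 12; the split of the remaining errors into false positives is my inference from the stated counts:

```python
# Tally of the informal test reported in this issue.
# Assumption: "a dozen images or so" is taken as exactly 12.
total = 12
wrong = 7                                   # misclassified images
false_negatives = 5                         # NSFW images the model passed as safe
false_positives = wrong - false_negatives   # the remaining 2 errors (inferred)

error_rate = wrong / total                  # overall misclassification rate
fn_share = false_negatives / wrong          # share of errors that are false negatives

print(f"error rate: {error_rate:.0%}")                    # 58%
print(f"false negatives among errors: {fn_share:.0%}")    # 71%
```

So under this reading, well over half the test images were misclassified, and most of those errors were in the direction the project least wants.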

@yyarrow

yyarrow commented Apr 7, 2024 via email

@longems

longems commented Apr 7, 2024 via email

@Huterox

Huterox commented Apr 7, 2024 via email

@moluzhang

moluzhang commented Apr 7, 2024 via email

5 participants