Give more customizable options for safety checker #815
@@ -1,5 +1,5 @@
 from dataclasses import dataclass
-from typing import List, Union
+from typing import List, Optional, Union

 import numpy as np

@@ -20,11 +20,11 @@ class StableDiffusionPipelineOutput(BaseOutput):
             num_channels)`. PIL images or numpy array present the denoised images of the diffusion pipeline.
         nsfw_content_detected (`List[bool]`)
             List of flags denoting whether the corresponding generated image likely represents "not-safe-for-work"
-            (nsfw) content.
+            (nsfw) content. If safety checker is disabled `None` will be returned.
Review comment: @patrickvonplaten one question/request -- would it be possible to have a setting that still returns whether or not an image is NSFW, but does not black out that image? This would let devs building on top of this library do something like show a popup to the user (e.g. "you are about to view an NSFW image, do you want to proceed?"). One possible way to implement this is to pass a flag into the safety checker module that disables the "return a black image" part of the checker. Another way is to expose the safety checker class so that end users can run the checker themselves at the end (solely to get the bool[] indicating NSFW-ness).
     """

     images: Union[List[PIL.Image.Image], np.ndarray]
-    nsfw_content_detected: List[bool]
+    nsfw_content_detected: Optional[List[bool]]


 if is_transformers_available() and is_torch_available():
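Following up on the review comment above about exposing the checker separately: here is a rough, non-authoritative sketch of that second suggestion, i.e. loading the pipeline with the built-in check disabled and running the safety checker afterwards just to obtain the flags. The checkpoint id, the `subfolder` loading calls and the `output_type="np"` argument are illustrative assumptions, not part of this PR; the checker call itself mirrors the pattern the pipeline uses internally in the `__call__` diff further down.

```python
from diffusers import StableDiffusionPipeline
from diffusers.pipelines.stable_diffusion.safety_checker import StableDiffusionSafetyChecker
from transformers import CLIPFeatureExtractor

model_id = "CompVis/stable-diffusion-v1-4"  # assumed example checkpoint

# Pipeline without the built-in check, so no image is replaced by a black image.
pipe = StableDiffusionPipeline.from_pretrained(model_id, safety_checker=None)

# Load the checker and its feature extractor separately (assumes both live in
# subfolders of the same checkpoint and that `subfolder` loading is available).
safety_checker = StableDiffusionSafetyChecker.from_pretrained(model_id, subfolder="safety_checker")
feature_extractor = CLIPFeatureExtractor.from_pretrained(model_id, subfolder="feature_extractor")

# Generate as numpy so the checker sees the same array format the pipeline
# passes to it internally, and keep PIL copies for display.
np_images = pipe("a photo of an astronaut riding a horse", output_type="np").images
pil_images = pipe.numpy_to_pil(np_images)

# Same call pattern as inside the pipeline: the checker returns the images
# (flagged entries are blacked out in the numpy array) plus per-image flags.
safety_input = feature_extractor(pil_images, return_tensors="pt")
_, has_nsfw_concept = safety_checker(images=np_images, clip_input=safety_input.pixel_values)

for pil_image, is_nsfw in zip(pil_images, has_nsfw_concept):
    if is_nsfw:
        # e.g. ask the user to confirm before displaying this image
        print("NSFW flagged: show a confirmation popup before displaying")
```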
@@ -71,6 +71,16 @@ def __init__(
             new_config["steps_offset"] = 1
             scheduler._internal_dict = FrozenDict(new_config)

+        if safety_checker is None:
+            logger.warn(
+                f"You have disabed the safety checker for {self.__class__} by passing `safety_checker=None`.Please"
+                " make sure you have very good reasons for this and have considered the consequences of doing so.The"
+                " `diffusers` team does not recommend disabling the safety under ANY circumstances and strongly"
+                " suggests to not disable the `safety_checker` by NOT passing `safety_checker=None` to"
+                " `from_pretrained`.For more information, please have a look at"
+                " https://github.com/huggingface/diffusers/pull/254"
+            )
+
Review comment: I think referencing another PR dilutes the message. I'd propose something like:

I think it's worthwhile to make this as clear as we can; maybe @natolambert, @mmitchellai, @yjernite can provide better wording.

Review comment: I would even say:

I mean, I think proposing a use case where removing the safety filter is acceptable kind of justifies why we are proposing this PR to remove it.
         self.register_modules(
             vae=vae,
             text_encoder=text_encoder,
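For reference, a minimal sketch of what triggers this new warning, assuming `from_pretrained` accepts `None` for the component as this PR intends (the checkpoint id is a placeholder):

```python
from diffusers import StableDiffusionPipeline

# Explicitly opting out of the checker: the pipeline still loads, but the
# warning above is emitted once during __init__.
pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4", safety_checker=None
)
```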
@@ -335,10 +345,15 @@ def __call__(
         # we always cast to float32 as this does not cause significant overhead and is compatible with bfloa16
         image = image.cpu().permute(0, 2, 3, 1).float().numpy()

-        safety_checker_input = self.feature_extractor(self.numpy_to_pil(image), return_tensors="pt").to(self.device)
-        image, has_nsfw_concept = self.safety_checker(
-            images=image, clip_input=safety_checker_input.pixel_values.to(text_embeddings.dtype)
-        )
+        if self.safety_checker is not None:
+            safety_checker_input = self.feature_extractor(self.numpy_to_pil(image), return_tensors="pt").to(
+                self.device
+            )
+            image, has_nsfw_concept = self.safety_checker(
+                images=image, clip_input=safety_checker_input.pixel_values.to(text_embeddings.dtype)
+            )
+        else:
+            has_nsfw_concept = None

         if output_type == "pil":
             image = self.numpy_to_pil(image)
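Since `nsfw_content_detected` can now be `None`, downstream code should handle both cases. A small sketch of how a caller might do that (model id and prompt are placeholders; when the checker is disabled the example simply treats the images as unflagged):

```python
from diffusers import StableDiffusionPipeline

# Default behaviour: the safety checker is loaded and enabled.
pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4")
output = pipe("a photo of an astronaut riding a horse")

if output.nsfw_content_detected is None:
    # Checker disabled: no per-image information, treat everything as unflagged here.
    flags = [False] * len(output.images)
else:
    flags = output.nsfw_content_detected

for i, (image, flagged) in enumerate(zip(output.images, flags)):
    if not flagged:
        image.save(f"image_{i}.png")
```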