That's probably not the reason. Generating faces was one of the first things GANs were ever used for. They can make near-perfect faces because the internet is flooded with images of faces, often high-quality celebrity shots.
The reason it can't do faces well is very likely the filters being applied to try to stop people from making pictures of real people. This is probably also the explanation for the random misses where it paints pictures of something that's not a llama. OpenAI is rewriting queries to make them more "diverse", i.e. acceptable to leftist ideology, and their rewriting logic seems to be completely broken. There have been many reports of people requesting something with no humans in it at all and discovering black/asian/arab people cropping up anyway. At least the earlier versions of the filter simply stuffed words onto the end of the prompt, as demonstrated by people requesting "Person holding a sign that says " and getting back signs saying "black female" etc.
Man asks for a cowboy + a cat and gets a portrait of an Asian girl. Gwern comments with an explanation:
https://www.reddit.com/r/dalle2/comments/w7qvgl/comment/ihm6...
"tldr: it's the diversity stuff. Switch "cowboy" to "cowgirl", which would disable the diversity stuff because it's now explicitly asking for a 'girl', and OP's prompt works perfectly."
Big discussion thread about the problem and (of course) the censorship that tries to hide what's happening:
https://www.reddit.com/r/dalle2/comments/w944fa/there_is_evi...
"I once tried some food photography and received a cheese with a guys face for no reason."
"This has been mentioned on this sub multiple times, but those threads have consistently been removed by the mods - as will this one."
"There was a thread about that prompt and, yes, the person did get diverse [sumo wrestlers]"
"Been doing women images and seeing the article decided to try narrowing the results to "caucasian woman". Still gave me diversity. Whether you want it, or not, you're getting diversity"