Scalzi on Nostr: Considering that LLMs started off devolving within hours into Nazi rhetoric so vile ...
Considering that LLMs started off devolving within hours into Nazi rhetoric so vile they had to be taken offline, I can't say that I see this sort of overcorrection as the most horrible thing that could have happened. Also, a reminder that LLMs aren't in fact intelligent, artificially or otherwise. They output what they're programmed to.
https://www.pcmag.com/news/google-explains-what-went-wrong-with-geminis-image-generation