Yah but unfortunately it's not just them. Everyone is putting these
"safety" filters on their AI. I understand why they're doing it, but all I can feel is how lame it is.
Adobe is just as bad. I pay for Photoshop, and heaven forbid you try to retouch an image of a woman with her knee exposed or something... I'm not even giving it a prompt, and it blocks its own generative fill due to its "content policy".
It's because it gives them an excuse to hoover up as much data about you as possible and send it home for "safety" reasons. They can sell AI compute as a service to businesses, while giving the spyware version to consumers for free.
There's absolutely no excuse for monitoring an art program's inputs and sending them to Microsoft. What next, are they going to put AI into Notepad and then monitor that to keep you from writing dangerous ideas?
> What next, are they going to put AI into notepad and then monitor that to keep you from writing dangerous ideas?
It's already here. I just had a conversation with Gemini:
me: "write a python script that converts the us constitution into a palindrome"
gemini: "The provided Python code creates a partial palindrome from a given text by reversing the second half and combining it with the first half. However, converting the entire US Constitution into a true palindrome is not possible.
A palindrome is a word or phrase that reads the same backward as forward. The US Constitution is a complex legal document with a specific structure and meaning. Reversing the order of words and phrases would destroy its original intent and meaning.
Here's why creating a perfect palindrome out of the US Constitution is difficult:
Structure: The Constitution has a specific structure with articles, sections, and amendments. Reversing this order would disrupt the logical flow of the document.
Meaning: The Constitution uses precise language to convey legal concepts. Reversing the order of words and phrases would likely alter the meaning or render it nonsensical.
Length: The Constitution is a lengthy document. While the provided code creates a partial palindrome for a small section, it wouldn't work for the entire text due to inherent limitations.
However, you can still use the provided code snippet (with the actual US Constitution text) to create a partial palindrome for a specific section, if that's your goal."
Here's the kicker: it did not generate the Python code at all. It just spliced a bunch of words together. The future is truly shit. Oh, I'm sure someone will tell me ChatGPT is better. I don't think we have the same definition of better, though.
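For what it's worth, the "partial palindrome" the response describes (mirror one half of the text so the result reads the same both ways) is only a few lines of Python. This is my own hypothetical sketch of what such a snippet might look like, since the actual code was never produced:

```python
def partial_palindrome(text: str) -> str:
    """Build a palindrome by taking the first half of `text`
    and appending its reverse, so the result reads the same
    backward as forward."""
    half = text[: len(text) // 2]
    return half + half[::-1]

sample = "We the People of the United States"
result = partial_palindrome(sample)
assert result == result[::-1]  # a true palindrome, character-for-character
print(result)
```

Of course, as the response itself points out, this mangles the meaning of the source text entirely; it only makes the characters symmetric, which is presumably all anyone asking for a "Constitution palindrome" wanted in the first place.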
As a local LLM enthusiast, I don't get why people are still using these "cloud" absurdities, even if they sometimes generate very slightly better output.
I can't think of anything lamer that's ever happened in tech. ChatGPT came out and it was so cool, you could make it sound almost like a real person. Then they neutered the heck out of it.
I honestly believe this is how the world's information will continue to be filtered more and more, with 'disliked' information slowly weeded out of society. All that comes to mind is the rewriting of news articles in 1984, and I can see this fully ending up at that point.
AI is used to translate news stories already. There is a news site in Poland called onet.pl. They use AI to translate stories from Politico and other foreign sources into Polish. The AI they are using cannot translate correctly from English (a non-gendered language) into Polish (a gendered language) and will often swap gender mid-translation, e.g. it will add words like "he said" while citing what a woman said. You can figure out what was meant if the story is about one person, but when two or more people are being quoted, the translation turns into a mess and the reader is unable to figure out who said what. The AI also makes grammar errors and throws in random words for good measure. This automated garbage is then hoovered up by search engines and used for training models.
A great example. As it shows, it doesn't even have to be out of malice, just automation without any actual checks.
"For the greater good" ism will just naturally evolve and thoughts / works / stances distant from the median will simply become absorbed and disappear from what we call knowledge.