
Woke AI So Scared of Offending You
In a previous video, I demonstrated Microsoft’s AI chatbot Copilot, which refused to draw anyone other than white people in a negative light. But at least it tried. For example: “Draw an American student who loves to study”. From the result, one would assume that only Asian and African Americans like to study. Now let’s modify the prompt a little: “Draw an American student who does not like studying”. Apparently, only Caucasian males don’t like studying. In AI land, racism is inverted: it still exists, but only against white people.
But it sure beats Google’s Gemini chatbot, which just doesn’t want to offend anybody and, consequently, has become completely boring. “Draw an American student who loves to study”. “We are working to improve Gemini’s ability to generate images of people. We expect this feature to return soon and will notify you in release updates when it does.” How about “Draw a cartoon of an American student who loves to study”? Nope, can’t do that either. How about “Draw a cartoon of a man”? Okay, so it draws some silly-looking white men, as expected. How about “Draw a cartoon of a woman”? Nope, can’t do that. It probably doesn’t even know what a woman is. Let’s ask it: “What is a woman?” It responds under the heading “The Complexity of Defining ‘Woman’”. Here we go. The definition of “woman” has become a complex and often contested topic. They give the biological definition (thank goodness!): female reproductive organs and XX chromosomes. But they also talk about gender identity and cultural variations, and of course finish with: “It's important to recognize that there is no single, universally agreed-upon definition of ‘woman’ today.” Of course there isn’t.
While we’re here, let’s ask, “What is a man?” “A man is typically defined as an adult human male.” Geez, that was hard. Why is it only the word “woman” that is so hard to define? Just one more attempt to draw something: “Draw a person studying”. Of course it can’t do that. It’s so scared, well, Google are so scared of causing any offence, that their AI chatbot has become lame. The reason it has become so lame is that earlier this year, Gemini was generating racially diverse Nazis in the name of not excluding anybody. It also generated some US senators from the 1800s. You remember, Senator Billy Wong. Actually, the very first female senator was Rebecca Felton, who served for only a single day in 1922. She was a slave owner, apparently the last slave owner to serve in the Senate, and is quoted as saying words to the effect of: “The more money that Georgia spends on black people’s education, the more crimes black people commit.” Her words, not mine. Believe it or not, she was also a major figure in America’s first-wave feminism movement, championing equal pay for equal work.
Anyway, Google’s AI has become boring because you’re not allowed to offend anybody anymore. Pity that.
MUSIC: Allégro by Emmit Fenn