In this “fight,” what both sides seem to find disturbing is women’s autonomy to decide for themselves
With great power comes great responsibility. What a tool does, and what it is for, depends on who wields it. A very powerful tool in the wrong hands can cause more damage than an atomic bomb. Potentially revolutionary inventions are also potentially destructive. Sometimes, however, depending on who uses them, they are simply misogynistic. I am referring to the way artificial intelligence, especially the kind used to create or modify images, can undress or dress women at will.
So we are talking about a “battle of the sexes” that is less a battle than a street brawl.
The artificial intelligence that can undress you
Every day on social media, and especially on X (Twitter), which since the change in its management seems to have abandoned any pretense of moderating inappropriate content, we see photos of naked women, mostly celebrities, created or modified through artificial intelligence.
Sometimes actresses and singers simply look more stunning: their skin smoother, their clothes more revealing. Other times, feeding the algorithm photos of a singer produces pornographic images that look very close to the real thing. The consequences, as one can imagine, are dangerous. If revenge porn was already a very serious problem, what happens when it is based not on real images but on fabricated ones, released without the knowledge of the woman they depict? Taylor Swift’s case (and not just hers) is a bellwether, and relevant legislation should be discussed and implemented as soon as possible.
oh we are so doomed… any woman that is hated by men (or not actually) is going to be humiliated like this and Men r going to use AI for revenge porn and u fuckers are literally going to let it happen in front of ur eyes . why the fuck would u even make this possible https://t.co/5nhdKvF5wM
— bombay bandar (@bandarmoment) February 21, 2024
DignifAI: The artificial intelligence that dresses you
At the other end of the spectrum there is DignifAI, a self-proclaimed brand-new “movement” that aims to make photos of women and men (but mostly women) more “decent” (whatever that means) through artificial intelligence. What does this dignity of women consist of? Mainly in dressing them in modest, traditional clothes. But it doesn’t stop there. They remove makeup and tattoos, change hairstyles, replace glasses of alcohol with vases (?) and add family photos. Speaking of family: sometimes, to cover all the bases and leave no doubt about the ideological origin of these interventions, they also add children, preferably two or more.
On top of that, misogyny meets racism: there is a tendency to lighten skin tones and to remove aesthetic markers of Black culture, such as braids. This kind of photo tampering, even if less harmful than AI-generated porn, is certainly no better in principle. It acts without the consent of the people involved and alters their image to fit certain aesthetic and behavioral standards, promoting a regressive vision of women who stay at home and take care of the children.
#dignifAI pic.twitter.com/XvVyc5yfL9
— dignifAI (@DignifAI) February 21, 2024
Can we decide for ourselves whether to undress or dress up? Obviously not
In a perfect world made of pink bubbles, both of these things would either be forbidden or not exist at all. But in the imperfect world we actually live in, all we can do is try to protect and defend ourselves against a class of people for whom we are never good enough, neither when we are too clothed nor when we are too naked. And, perhaps, to claim our freedom and autonomy, as much and wherever we can, on our own terms. With effort, but hopefully also with some reward.
Source: Skai