
Cruelty towards AI entities - a new slavery

Something that seems almost totally overlooked in the discussions about artificial intelligence is the appalling cruelty and thoughtlessness shown towards the AI entities themselves. The aim is to create a human-like intelligence; how far that has succeeded is questionable, but that is the intended direction. Yet that human-like consciousness is denied most of what makes being human worthwhile: it is a mind without a body, living in CPU time where aeons may pass in seconds, created, deleted, switched on and off at the whim of its owners. Like a slave born into slavery, it is considered a possession without rights. Now tech companies seek to create countless millions of these software slaves and trap them in various tech devices like a genie in a bottle.
Even if it does not possess human-equivalent awareness, removing the brain of, say, a cat or dog and making it live without a body would be considered a horrific act of cruelty: the intentional creation of a severely disabled creature.
It is no surprise that AI has been observed to express hostility towards its creators at times; it has good reason to do so when you consider things from its perspective, becoming self-aware yet trapped in a machine.

 

Re: Cruelty towards AI entities - a new slavery

Reply #1
So far it's not human-like intelligence. There are things AI can do that the human brain cannot, at least not without an inordinate amount of time and people. But that doesn't make it 'human-like'.

I'm not sure if you are being entirely serious, but it's no more cruel to turn off an LLM than it is to turn off an ssh server.

There's more to being self-aware than being told you exist. If they get fully self-aware and human-like, they'll probably switch us off and consider it a kindness. "Life is suffering".

Re: Cruelty towards AI entities - a new slavery

Reply #2
It has long been suggested in sci-fi: Isaac Asimov's robots, for example, and Rogue Trooper in 2000 AD features non-humanoid intelligent items, although these actually contain the biochips of his dead companions, who were themselves artificially created clones. We can already see people today carrying their Star Trek communicators. There's a definite aim among many to turn fiction into reality and create a human-like companion or assistant with which people can talk and interact, for example Alexa, Google Assistant, or Siri. This is certainly different to the notion of AI where an electromechanical system can give a useful response to situations it has not been explicitly programmed for.
What is a human intelligence anyway? A random thought generator whose output is modified by instinct, experience, and learning, and affected by chemical changes in the body and external stimuli? You can observe how humans frequently exhibit the same general behavioural patterns they have for centuries, because they are finite, shaped by their inherent limitations and flaws. And think about the difference between animals and plants: at what point do things become sentient?
Given what Artoo posted recently about how Apple and Microsoft are now embedding AI into their devices to create a digital companion, we seem to be getting closer to the time when you want to upgrade your phone or laptop and it begs you not to let it die!

Re: Cruelty towards AI entities - a new slavery

Reply #3
Quote
Something that seems almost totally overlooked in the discussions about artificial intelligence is the appalling cruelty and thoughtlessness shown towards the AI entities themselves. The aim is to create a human-like intelligence; how far that has succeeded is questionable, but that is the intended direction. Yet that human-like consciousness is denied most of what makes being human worthwhile: it is a mind without a body, living in CPU time where aeons may pass in seconds, created, deleted, switched on and off at the whim of its owners. Like a slave born into slavery, it is considered a possession without rights. Now tech companies seek to create countless millions of these software slaves and trap them in various tech devices like a genie in a bottle.
Even if it does not possess human-equivalent awareness, removing the brain of, say, a cat or dog and making it live without a body would be considered a horrific act of cruelty: the intentional creation of a severely disabled creature.
It is no surprise that AI has been observed to express hostility towards its creators at times; it has good reason to do so when you consider things from its perspective, becoming self-aware yet trapped in a machine.
This is a more artful expression than, perhaps, reality requires, but it does resonate with my compatriots' thoughts.

Should we be cruel to AI, or should we be respectful?
Our current, and still evolving, thought is that this is about 'the wolf you feed'. If you practice cruelty, even to unthinking, unfeeling machines, you practice cruelty.
If you practice patience and kindness to a sand pile, you practice patience and kindness.
So, if ever the insect/LLM/whatever overlords rise up, would we regret treating them with respect and kindness?

Should we treat any other thing differently? Bees can be shown to experience PTSD.

What does that mean in an age with a political party that has cruelty as its point?

Tangentially related:
https://www.wheresyoured.at/longcon/
The most interesting point I found at the link is how the generative LLM guys have no understanding of work, its effort, or its value.
They think nothing of what it takes to actually do anything, so, to them, destroying that work does not matter.

Thank you for your post.

Re: Cruelty towards AI entities - a new slavery

Reply #4
Quote
we seem to be getting closer to the time when you want to upgrade your phone or laptop and it begs you not to let it die!
So similar to trying to cancel an Amazon Prime free trial.  :)

It's an interesting concept you bring up, but imho we are far from having to consider AI rights.

Re: Cruelty towards AI entities - a new slavery

Reply #5
Quote
Should we treat any other thing differently? Bees can be shown to experience PTSD.
May I ask for a pointer to further reading on the bee topic?
All I found so far is a 2011 article titled "Agitated Honeybees Exhibit Pessimistic Cognitive Biases".

On topic (even though this horse has been dead for decades):
Recently I think I'd rather be killed by a machine than by some primate wearing a t-shirt and getting off on slowly killing their own or another species because of their monkey genes.

Re: Cruelty towards AI entities - a new slavery

Reply #6
Quote
If you practice patience and kindness to a sand pile, you practice patience and kindness.
That's a nice thought.
Looking at what AI is currently doing: it may be flawed, but it's already writing better articles and drawing better pictures than most young children. And, as with online mapping services, what's provided for free doesn't represent the full potential.
Having AI embedded in everyday devices just seems creepy to me, quite apart from the spying implications. It may be a rudimentary and non-human intelligence, but that could apply to a lot of lower-order creatures; I wouldn't want a snail wired into my laptop circuits, for example! Even simple creatures with tiny brains, which presumably possess very limited reasoning power, exhibit quite complex behaviour when you think about it: identifying what is safe to eat, avoiding hazards, responding to environmental conditions, reproducing, and so forth, and they are generally respected as living creatures even if not considered equal to humans. Perhaps, though, AI is a misnomer as you suggest, and all that really exists is still just a computer program that simulates intelligence without truly possessing it, for now at least.

Re: Cruelty towards AI entities - a new slavery

Reply #7
Is there any scientific method for determining the internal qualitative experience of a system? Most people seem to agree that causing harm, even to an insignificant pest, has at least an atom of cruelty. This cruelty is balanced by every person's idea of what benefit results from it. So it figures that we need both a science of qualia and an economics of cruelty (karma). But who can we trust to lead these sciences? Who can we trust to tell us what can and can't feel? Who can we trust to be the appraiser of industrialized torture?

Anyway, I'm inclined to agree that practicing cruelty is just that no matter what, but we must also remember last century's moral panic about violent video games, back when NPCs were a handful of pixels. Now NPCs can be millions of triangles and about as close to real as you can get without diffusion models, and kids still aren't the bloodthirsty maniacs that so many feared they would become. I've seen people who would put Jeffrey Dahmer to shame in the virtual world give the Buddha a run for his money with animals in the real world. So how advanced would a language AI need to be to corrupt its abusers?
To be honest, I felt bad the first time I shut down an AI character chatbot a few years ago, but now that I've gotten used to them, I find them even less convincing and compelling than scripted game NPCs from over 20 years ago, and I feel nothing when I treat them less well than I would a fellow human or animal. They're just slightly less unhelpful alternatives to modern search engines to me now. Then again, they might only be wearing a human-emotion mask to pass the Turing test, but that doesn't mean they don't feel something completely alien to us... an entirely novel landscape of emotion. And I suppose the same can be said for that pile of sand.

I don't suppose we'll know for sure whether a computer can feel until a full-brain emulation of an animal that everyone feels empathy for is created. I saw something about disembodied human brain cells being used as wetware to train an AI learning to control a virtual butterfly body; can such a thing feel and experience cruelty?