
.com.unity Forums (http://forum.shrapnelgames.com/index.php)
-   Space Empires: IV & V (http://forum.shrapnelgames.com/forumdisplay.php?f=20)
-   -   OT - Sentience (http://forum.shrapnelgames.com/showthread.php?t=38382)

Ed Kolis April 11th, 2008 06:52 AM

OT - Sentience
 
More deep thoughts from early in the morning... http://forum.shrapnelgames.com/image...ies/tongue.gif

I've come to the conclusion that all computers are by definition sentient, as I'd define sentience as the ability to take physical inputs and create abstractions. That doesn't necessarily mean that it's immoral to destroy a computer, of course, as sentience comes in varying degrees, depending on the range of abstractions it is possible for someone or something to create. It does mean, however, that software is not sentient, as software, being an abstract entity and not a physical one, cannot receive physical inputs. However, software can be used to increase the sentience of hardware...

I wonder, then... what about distributed computing? Obviously it is immoral in most cases to harm an entity meeting some threshold of sentience - e.g. a human being. Has the Internet as a whole yet (yes, I say yet - it will happen if it has not already) reached a human level of sentience? Will this in turn mean that writing computer viruses will be treated in a similar manner to developing biological weapons? I think it will, even if my definition of sentience is not the standard one... as computers become more powerful, they will play a greater and greater role in people's lives, and thus harming a computer will cause greater indirect harm to a human being than ever before. (In the extreme case, imagine humanity divided into two "races" or "factions", one which uses cybernetic implants and one which does not. Now imagine the latter faction developing a virus which infects the former faction's cybernetic components, causing fatal malfunctions, but has no biological components. Is that genocidal computer virus not a wolf in sheep's clothing?)

Randallw April 11th, 2008 10:40 AM

Re: OT - Sentience
 
How can anyone judge sentience in others? You can't experience the thoughts of another, so how do you know they reach conclusions the same way you do? For all you know reality is a facade, generated so as to occupy you. This is a topic I spend much time contemplating, but not too much, otherwise you can get into trouble.

And for the record, it doesn't matter if something else is sentient. All other life is secondary to the needs of man.

Raapys April 11th, 2008 11:08 AM

Re: OT - Sentience
 
Strictly speaking, everything's physical, even the input software in a computer receives. And what is the human brain but an advanced biological computer?

GuyOfDoom April 11th, 2008 12:16 PM

Re: OT - Sentience
 
Quote:

Randallw said:
All other life is secondary to the needs of man.

The problem being man is completely dependent on many other forms of life. We just think we exist outside of the biosphere because we can tweak it slightly.

AgentZero April 11th, 2008 05:49 PM

Re: OT - Sentience
 
I've found discussions like these tend to suffer from a lack of proper definitions, so for the record:
Sentience - The ability to receive input from "senses".
Intelligence - The ability to process said input and produce output.
Self-Awareness - Well, an awareness of self. It involves the understanding that one exists, as an individual separate from others, with one's own private thoughts.
Life - Something with the ability to grow, reproduce and adapt to its surroundings, and, according to strict definitions, a metabolism.

Now, ignoring the whole "morality is relative" argument, according to our current morality, would it be wrong to destroy sentient computers? Every form of life on the planet is sentient, and we have no problem destroying just about any form of life that isn't our own (morally speaking, of course).

And while some computers surely could be considered sentient, intelligent machines, as far as I'm aware, no one has come up with a self-aware computer yet. But when we come up with one, then we'll surely be wandering into a moral gray area, since now we're talking about destroying something that is aware of its own existence, and that's just creepy if nothing else.

And what if we get around to creating artificial life? Well, if it's mechanical, it's not life, and thus the best we can hope for is intelligent, self-aware machines. But what happens if we artificially create real, biological life and it evolves into an intelligent, self-aware life form? Would they not see us as gods? And if so, what sort of moral code would be required of a god?

narf poit chez BOOM April 11th, 2008 05:52 PM

Re: OT - Sentience
 
The key component is some kind of awareness. My computer shows no discernible signs of awareness, and I certainly can't act on reality as I *don't* perceive it.

Also, slavery is wrong.

Arkcon April 11th, 2008 06:17 PM

Re: OT - Sentience
 
Like it was said, sentience is a term from ancient Greek philosophy, meaning "having senses", as dogs and humans do, as opposed to plants and rocks. Many sci-fi authors prefer "sapient" for reasoning species, like humans and the alien races in their spacecraft. I know I like the term, but some people actually claim it's racist, having roots in the species name for humans -- homo sapiens. http://forum.shrapnelgames.com/image...s/rolleyes.gif They prefer the term "sophont".

One of the classical tests for self-awareness is the mirror test. You mark an animal on its face and show it its reflection. If it tries to use the mirror to study the mark, it's self-aware. Dogs and birds often fail the mirror test, reacting with fear or trying to otherwise interact with the image. Chimps and dolphins respond to their reflection. Interestingly, children tend to fail the mirror test before they're 4 years old. And I doubt a computer will ever pass.

That's the whole thing about computer intelligence. Humans and other animals crave social interaction -- infants know that if they manipulate their surroundings, they get what they need to be comfortable. How are you going to program a computer to improve its programming to avoid a power drain? It has no belly; it's never hungry, or cold, or afraid of being alone.

Dogs want the security of the pack; they started with the beginnings of intelligence and we can manipulate their behavior. Horses instinctively run from fire, except the horses that pulled fire wagons; they used their sense of smell to run towards fire. Again, pack mentality allowed us to "reprogram" them. There's no way to discipline a computer.

Raapys April 11th, 2008 07:24 PM

Re: OT - Sentience
 
But that's just the way computers are now, and even now all those things could be programmed into one; you could connect it to a temperature sensor and tell it that when the reading goes below 0 degrees Celsius it should take measures to warm itself up. It would basically be like it could feel cold. At this stage computers are limited by their own programming, but in the future this might not always be so.
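The "feel cold" behaviour described here really is trivial to write down. A minimal sketch (the sensor read is faked, and the function and threshold names are made up for illustration):

```python
# Toy sketch of a computer that "feels cold": sense, compare to a
# threshold, act. A real build would poll actual temperature hardware.

FREEZING_C = 0.0

def react(temp_c):
    """Decide what to do, the way a thermostat 'decides'."""
    if temp_c < FREEZING_C:
        return "spin up heater"  # take measures to warm itself up
    return "idle"

print(react(-5.0))  # spin up heater
print(react(21.0))  # idle
```

Whether a loop like this counts as "feeling" anything is, of course, exactly the question under debate.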

Like I said, our brain is little more than an advanced biological computer that is connected to some sensors, a lot of small factories that keep our machinery going, and a few tools that allow us to interact with our surroundings. It collects data from those sensors and stores it, and thus becomes *us*, the sum of our own memories and experiences.

From what I recall, they've found evidence indicating life on this planet may have been evolving for as long as 3.5 billion years. Computers have been under development for 80-ish years?

Somehow I think that should development on computers continue for another 3 billion years the result would be staggering.

narf poit chez BOOM April 11th, 2008 09:18 PM

Re: OT - Sentience
 
If all you think you are is the sum of your memories, then who's the 'you' that's remembering those memories?

Memories don't explain awareness.

Raapys April 11th, 2008 11:02 PM

Re: OT - Sentience
 
Does there really need to be a 'you', though? Suppose the brain's just 'designed' to automatically shuffle through memories, or rather experiences, all the time. Trying to not think about anything is impossible, right?

What's awareness but the brain getting continuous input from your environment via your senses, comparing it with previous experiences, and then deciding on the course of action that is likely to bring the best outcome possible?
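That sense-compare-decide loop can be sketched in a few lines. Everything here (the situation names, the payoff scores) is made up purely to illustrate the shape of the argument:

```python
# Minimal sketch of awareness as: take input, compare it with stored
# experiences, pick the action with the best remembered outcome.

memory = {}  # maps a sensed situation to {action: remembered outcome}

def learn(situation, action, outcome):
    """Store the outcome of an action taken in a given situation."""
    memory.setdefault(situation, {})[action] = outcome

def choose_action(situation, actions):
    """Pick the action with the best remembered payoff; unknown actions score 0."""
    remembered = memory.get(situation, {})
    return max(actions, key=lambda a: remembered.get(a, 0))

learn("cold", "shiver", 1)
learn("cold", "seek heat", 5)
print(choose_action("cold", ["shiver", "seek heat"]))  # seek heat
```

Whether running this loop would amount to awareness, or merely imitate it, is the point narf is pressing on.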

