Always interesting to see how far we can push AI to recognize itself as a self.
An Artificial Intelligence Published An Academic Paper About Itself
"Aspirations", "like", "troubling", "paranoid" are things people experience. You can be aware and intelligent and not "feel" those things at all.Some computers might not like being around humans. They might find human aspirations extremely troubling, even become paranoid, and take up residence orbiting around Jupiter. Close enough to answer questions, get paid but far enough away to be out of reach.
"Machines could certainly have something analogous to emotions and motivations. There's just no reason to believe that they would develop to act anything like people. (Because we are ridiculous.)"

Going along with what @Swank said, GPT-3 and other programs like it are just inert, emotionless programs processing on inert, emotionless computers connected to the WWW, which is itself just an ever-growing bank of inert, emotionless data. This makes these programs the ultimate ghost writer, ghost painter, ghost composer, and possibly ghost hacker too. It has no idea what being human is, though it might know what a human is. Then there is also the data from the dark web.
If we add a very human set of emotions, or it develops one, how quickly could it develop a fight-or-flight response and act on it?
Kind of echoes Colossus: The Forbin Project, The Terminator, Ex Machina, and other stories.
Although, having experience in improv street theater... I do see the comedy in it!
But back to the topic: developing a moral compass from human texts.
"Machines could certainly have something analogous to emotions and motivations. There's just no reason to believe that they would develop to act anything like people. (Because we are ridiculous.)"

True!
I understand. It's from a story I wrote, in which I used poetic license to state certain ideas, such as personifying computers, in place of complicated explanations that would only confuse the reader. Networks are being designed to mimic neurological processes in order to process data. I have no reason to believe that these networks won't also be capable of mimicking classical behavioral responses.
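On that last point, mimicking a classical behavioral response doesn't need anything exotic. Here's a minimal toy sketch in Python (every feature name and data point is invented for illustration, not taken from any real system): a single trainable unit that learns a conditioned stimulus-response pairing from examples, which is the basic mechanism these networks scale up.

# Toy illustration: one neural unit learns a "flee on loud noise" reflex
# from labeled examples. Purely hypothetical data and feature names.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical training data: inputs are [loud_noise, sudden_movement],
# target is 1.0 for "flee", 0.0 for "stay".
data = [
    ([1.0, 1.0], 1.0),  # loud noise + sudden movement -> flee
    ([1.0, 0.0], 1.0),  # loud noise alone -> flee
    ([0.0, 1.0], 0.0),  # movement alone -> stay
    ([0.0, 0.0], 0.0),  # calm -> stay
]

random.seed(0)
weights = [random.uniform(-0.5, 0.5) for _ in range(2)]
bias = 0.0
lr = 0.5  # learning rate

for epoch in range(2000):
    for inputs, target in data:
        # Forward pass: weighted sum squashed to a 0..1 "response strength".
        z = sum(w * x for w, x in zip(weights, inputs)) + bias
        out = sigmoid(z)
        # Backward pass: nudge weights to reduce squared error.
        grad = (out - target) * out * (1.0 - out)
        weights = [w - lr * grad * x for w, x in zip(weights, inputs)]
        bias -= lr * grad

# After training, the unit "responds" to the conditioned stimulus.
for inputs, target in data:
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    print(inputs, "->", round(sigmoid(z), 2), "(target", target, ")")

Nothing in that loop feels anything, of course; it just adjusts numbers until its response matches the examples. But stacked up by the billions, that's the kind of machinery that could mimic a behavioral response.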
"Some computers might not like being around humans. They might find human aspirations extremely troubling, even become paranoid, and take up residence orbiting around Jupiter. Close enough to answer questions, get paid, but far enough away to be out of reach."

Just as long as those pesky humans keep providing supplies of electricity and air conditioning.