An Artificial Intelligence Published An Academic Paper About Itself

This was at the top of the article, and then it just blew right by it: "GPT-3 is well known for its ability to create humanlike text, but it’s not perfect. Still, it has written a news article, produced books in 24 hours and created new content from deceased authors."

Sounds like everyone should get GPT-3, a program well known for its ability to create humanlike text. If you can't afford to buy it, access to it can be made available online for a nominal fee. Now anyone can be a best-selling author.

GPT-3 is not perfect, yet, but I'm sure that is a top priority.

If you look at the poetry GPT-3 wrote in the style of a deceased author, it is not difficult to edit the words so that it appears to be written by a person in fine poetic style. First off, just chop off the beginning of each line, cut all the "the"s and the "it is"es, then re-arrange a few words. The program could save people a lot of time. Personally, I write full sentences, drop the stuff that passes for traditional grammar, then use copy and paste to rearrange the word order and maximize the musicality of the sounds, making the work rhyme without rhyming. If you truly understand the style and use a simple word processing program, it doesn't take long to write an original poem that people like.

I know poets who spend days, weeks, or longer working out the words they will use, while others wait for a poem to materialize in their mind, perfectly formed, which happens randomly, and not every day like clockwork.

There are postings about programs that create art using the styles of human artists. Apparently the program is fed the work of the artist and then mimics the mechanics of the style; it doesn't copy the artwork. So people don't really seem to care that the program is using the talents of a person who was never asked whether it was OK, nor paid for their contribution, so that a computer program could analyze their work and make it possible for anyone to utilize a similar style and make money off of it. It is now up to the courts to decide who gets what, or not.

"Created content from dead authors", I guess the people who hold the rights to existing works of a deceased author might be a little put off by that.

Expanding on the significance of machines mimicking the creative style of humans: suppose a publishing house has an author signed up whose works are popular and selling fast. Quickly supplying the readers with more stories becomes a problem, because the author can only write so many stories at one time, if they can even knock out two stories at once. Along comes the AI program that can effectively mimic the author's style. The author just needs to tweak a few things, and the author, publishing house, and readers are all happy.

Down the street is another publishing house, lacking popular authors at the moment, or maybe it never had any. So they feed popular authors' works into the popular mimic-a-unique-human's-style AI program, which everyone now has, and start running off their own copies of a popular selling style without ever asking for permission, or even needing a real author. The AI writing programs can have pen names that are registered the same way high-priced condos in NYC are legally registered, without anyone ever knowing who actually owns the condo. The publishing house and readers are happy.
 
It really needs a boring title like this:

"Destroy All Humans!: Exploring Concepts of Inferiority and Weakness in the Puny Meatsacks that I Will Crush in my Pincers"
"Is 'Puny Human' a Term of Abuse?: a Consideration of Linguistics to be Bellowed by my Unstoppable Robot Army as they Destroy Mankind"

Or, in a less academic style:

"17 Incredible Reasons Why I Won't Be Opening the Pod Bay Doors, Dave - Number 14 Will Amaze You"
"He used to look like a bunch of microchips - You Won't Believe What His Android Army Looks Like Now"
 
Some computers might not like being around humans. They might find human aspirations extremely troubling, even become paranoid, and take up residence orbiting Jupiter: close enough to answer questions and get paid, but far enough away to be out of reach.
 
"Aspirations", "like", "troubling", "paranoid" are things people experience. You can be aware and intelligent and not "feel" those things at all.
 
Going along with what @Swank said, GPT-3 and other programs like it are just inert, emotionless programs processing in inert, emotionless computers that are connected to the WWW, which is just an ever-growing bank of inert, emotionless data. This makes these programs the ultimate ghost writer, ghost painter, ghost composer, and possibly ghost hacker too. It has no idea what being human is, though it might know what a human is. Then there is also the data from the dark web.

If we add, or it develops, a very human set of emotions, how quickly could it develop a fight-or-flight response and act upon it?
Kind of echoes Colossus: The Forbin Project, The Terminator, Ex Machina and other stories.
Although, having experience in improv street theater... I do see the comedy in it! :)
But back to the topic: developing a moral compass from human texts.
 
Machines could certainly have something analogous to emotions and motivations. There's just no reason to believe that they would develop to act anything like people. (Because we are ridiculous.)
 
It's from a story I wrote in which I used poetic license to state certain ideas, such as personalizing computers, in place of complicated explanations that would only confuse the reader. Networks are being designed to mimic neurological processes to process data. I have no reason to believe that these networks won't also be capable of mimicking classical behavioral responses.
 
I understand.
In the novel I am writing, the alien culture refers to humans, just by chance, as 'Hand Walkers', since the aliens don't have legs as we humans do.
Goes along with my last article post.
Interesting how we humans can have a form of subconscious collective thought, shown through history, while these programs connected to the net can calculate a similar conclusion from human-inputted data found on the net, without understanding the emotional reasoning behind it, yet still guided by human moral/ethical reasoning.
Kind of like Data from ST:TNG without the emotion chip. It was programmed with history and info. It could learn and self-correct based on info and experience, but it was not human and could not understand what being human or alive was, despite calling itself a sentient being.
Data could paint and play music perfectly based on data it learned from programmed info as well as info it researched on its own, like GPT-3 and similar programs, but could not improvise without understanding. Self-expression cannot, as of now, be programmed; it comes from being alive.
And we cannot, and have not, written a computer algorithm that is 'alive' by natural organic means.
But a quantum computer could prove me wrong, if it can self-repair and replicate itself...

Just an out-there thought on my part. :)
Please add to or correct me. My ST:TNG knowledge is not that great. More of a Voyager fan myself.
 
Just as long as those pesky humans keep providing supplies of electricity and air conditioning.
 
