AI is outperforming humans in both IQ and creativity in 2021

Are we ready as a forum to handle an AI subforum? We've got enough active threads on the subject, and at the current rate of increase we'll have ten times as many soon.
Maybe... I just hope it doesn't get banned too.
 
So much nonsense in that article, it's hard to know where to start.

“AI is already far more creative than any human. As of March 2021, just a single American language model (GPT-3) was producing an entire public library—80,000 books worth—of new content every day.”
— Dr Alan D. Thompson

Dr Alan Thompson needs to get himself a vocabulary. He clearly means productive, or prolific, but certainly not creative.

As for being 40% better than humans at trivia: I would think they should be! They have search engines and almost unlimited databases of Internet content bolted on.

A couple of days ago that chat bot tried to tell me that the number 83 is a palindrome. Intelligence? Really?
 
A couple of days ago that chat bot tried to tell me that the number 83 is a palindrome. Intelligence? Really?
ChatGPT and GPT-3 are different beasts.
ChatGPT is actually smaller.
GPT-3 scores 120 on IQ tests; ChatGPT scores about 82.

"ChatGPT is not just smaller (20 billion vs. 175 billion parameters) and therefore faster than GPT-3, but it is also more accurate than GPT-3 when solving conversational tasks—a perfect business case for a lower cost/better quality AI product."
...
So chatgpt is basically Forrest Gump with access to the internet and a lot of confidence.

 
I seem to be able to trip it up every single time I use it. Basically there is no intelligence behind it (artificial or otherwise). Just look at this for example:

Q: what are the next three numbers in this sequence: 2, 4, 3, 9, 4, 16, 5, 25, 6

A: The given sequence appears to be alternating between even squares and consecutive odd numbers. Using this pattern, the next three numbers in the sequence would be:
7, 36, 8


The correct answer is obviously 36, 7, 49. The explanation it gives makes no sense (the sequence is definitely not "alternating between even squares and consecutive odd numbers"). A pitiful performance considering the question was very easy.
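For what it's worth, the pattern is trivial to state in code: each integer n from 2 upward is simply followed by its square. A quick Python sketch:

```python
def sequence(count):
    """Generate the interleaved sequence n, n*n starting at n = 2."""
    out = []
    n = 2
    while len(out) < count:
        out.append(n)       # the integer itself
        out.append(n * n)   # followed by its square
        n += 1
    return out[:count]

print(sequence(12))  # [2, 4, 3, 9, 4, 16, 5, 25, 6, 36, 7, 49]
```

The three terms after the given nine are 36, 7, 49, as stated.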

This seems to be a fairly good language interpreter bolted onto a search engine. I think it is quite good at extracting the meaning from questions and finding the answers if they are already out there somewhere in the ether. Apart from that it's quite sucky.
 
Q: what are the next three numbers in this sequence: 2, 4, 3, 9, 4, 16, 5, 25, 6

I get a different answer... puzzling.
 
High performance on trivia is so important, isn't it?
You've put your finger on it there.

We humans spend hours on trivia (pub quizzes, TV game shows, threads in the playroom forum right here), and it is good memory training for the brain (stops us getting dementia). We also spend the whole of our young lives learning trivia (dates and names of battles, and "important, well-to-do" historical people, or the names of things, processes, old experiments, texts...) and later in work (the names of more processes, machines and commercial products). And in our spare time we are obsessed with neighbourhood gossip and the finer details of the lives of celebrities or reality TV contestants. It is all a trivial pursuit!

Is any of it important? Is it necessary when a search engine can find it much more quickly (at least, one not compromised by paid adverts)?

What we should be teaching and learning is not facts but skills. Children should still be taught Maths and English, but also be taught how to write, critical thinking, interviewing techniques, finance, random sampling and probability. They should be taught to play a musical instrument, and how to draw, paint, and to model in clay. When we get home from work, it is these creative things that enrich our lives. All children should be taught sports, not just those that will make the school team, and more variety. If it can be an Olympic sport then it could be taught in a school. Or else, skills like car-driving, knitting, dressmaking, bricklaying, hedgelaying, horse-riding, birdwatching, carpet-laying... I could go on for pages, but you get my point. There are so many skills that we are supposed to somehow pick up, which would be useful to us in life, but are not formally introduced.

Crucially, while great at high-performance trivia, a machine, connected to a search engine, dressed up to appear to be an artificial intelligence, is never going to learn those skills to the proficiency of a human.
 
I get a different answer... puzzling.
Another wrong answer it seems :giggle:

There is a nice answer to this issue in this article in the Grauniad about Crochet. (Everyone is at it using ChatGPT, it's the new craze of course)


Taking out the relevant bit in the article:

"ChatGPT is a large language model of artificial intelligence, meaning that it is trained on large databases of text to replicate human communication, anticipating which words are likely to come after each other. These skills do not translate easily to numbers. The result? ChatGPT is bad at math.

“It may strike us as ironic that a computer system would be bad at math and good at creativity, but it does speak to an important fact about generative AI systems in general: they don’t understand context,” Newman said. “They don’t know what words or numbers actually mean, they are simply predicting what should come next.”"
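Newman's "simply predicting what should come next" can be illustrated with a toy version: a bigram model that counts, in a tiny corpus, which word most often follows each word, then "predicts" by always picking the most frequent successor. (A hypothetical miniature, of course — real LLMs use neural networks over subword tokens — but the principle of next-token prediction is the same.)

```python
from collections import Counter, defaultdict

# Toy corpus; a real model trains on billions of words.
corpus = "the cat sat on the mat and the cat ate the fish".split()

# Count which word follows each word (a bigram table).
following = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    following[a][b] += 1

def predict_next(word):
    """Return the most frequent successor of `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" — the most common word after "the" here
```

Nothing in that table "knows" what a cat is; it is pure co-occurrence statistics, which is exactly why such a system can be fluent and still wrong.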

This is very similar, in my mind, to the prowess of computing in chess. A chessbot is merely looking at millions of permutations of moves off into the future, using an algorithm to pick the move that gives its internal 'score' of the position the best possible increase, so to speak. Now, basically that's how humans play chess too, to a degree (I'd argue that a face-to-face game between two humans also includes more slippery concepts like psychology, which a chessbot, currently of course, cannot know anything about).
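That "permutations plus a score" idea is essentially minimax search. A toy sketch over a hand-made game tree (the moves and scores here are invented purely for illustration; a real engine generates the tree from the position and prunes heavily):

```python
# Toy game tree: inner nodes map move -> subtree, leaves are position scores
# (positive is good for the maximising player). All values are made up.
tree = {
    "e4": {"e5": 3, "c5": -1},
    "d4": {"d5": 0, "Nf6": 5},
}

def minimax(node, maximising=True):
    """Best achievable score assuming both sides play optimally."""
    if isinstance(node, int):           # leaf: a scored position
        return node
    scores = (minimax(child, not maximising) for child in node.values())
    return max(scores) if maximising else min(scores)

def best_move(node):
    """Pick the move whose subtree minimax score is highest for us."""
    return max(node, key=lambda m: minimax(node[m], maximising=False))

print(best_move(tree))  # "d4": opponent can hold "e4" to -1, but "d4" to 0
```

The machine isn't "thinking about chess" any more than ChatGPT is thinking about words; it is maximising a number.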

I fully agree that my own chess ability to look into future permutations is very, very limited compared to even an average chessbot nowadays. But 'Hold my Beer'. I can also go off and: integrate mathematical functions, drive a car, create art using pencils, imagine a world that doesn't exist, admire the sky at sunset, be interested in an article on the 1922 film Haxan, understand the actual meaning behind some text, write my own chessbot program....

I'll be a bit more impressed on the day AI can be more than a one-trick pony.
 
This seems to be a fairly good language interpreter bolted on to a search engine. I think it is quite good at extracting the meaning from questions
I've seen the same impression expressed elsewhere, i.e. they're relatively impressed by the language interpretation, but far less so by the answers.

I get a different answer... puzzling.
Perhaps this is because you asked a different question.
 
I have been playing around with ChatGPT and there are ways to use the system to generate better results. For example, I decided to challenge it to list all the castles in the UK. GPT told me there were too many to list, so I asked it how many there were: it estimated approx 4,000 depending on the criteria.

I then asked it to give me a list of all castles beginning with the letter A, and then followed on by asking it for all castles beginning with A not listed in its previous answer. I continued with these questions, referencing its own answers, which seemed to force it to delve into other data sets.

In the end it had listed well over 500 castles, at which point I could dive into the particulars of each castle: location, age, historical significance, etc.

There's a saying in the tech world that Googling is an art; I honestly think interrogating ChatGPT is even more of an art. GPT cannot really infer what you want, it can only use word analysis to determine a statistically probable answer. It has no real understanding of what you want, just a likely output relating to your input.
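The "reference its own previous answers" trick works because a chat interface resends the whole conversation each turn, so the model can (statistically) avoid repeating itself. A sketch of that loop, with a stand-in ask() function in place of any real API (the function, the paging behaviour, and the five castle names used are purely illustrative):

```python
def ask(messages):
    """Stand-in for a chat-model call: it just pages through a fixed list,
    skipping any castle already mentioned anywhere in the conversation."""
    castles = ["Alnwick", "Arundel", "Appleby", "Auckland", "Abergavenny"]
    seen = " ".join(m["content"] for m in messages)
    fresh = [c for c in castles if c not in seen][:2]
    return ", ".join(fresh) if fresh else "No further castles found."

# The conversation history grows each turn; that history IS the "memory".
messages = [{"role": "user",
             "content": "List UK castles beginning with A."}]
for _ in range(3):
    answer = ask(messages)
    messages.append({"role": "assistant", "content": answer})
    messages.append({"role": "user",
                     "content": "List more castles beginning with A "
                                "not in your previous answers."})
    if answer.startswith("No further"):
        break
    print(answer)
```

Each follow-up only works because the earlier answers are fed back in; drop the history and the "new" lists would start repeating.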

It really is interesting to test the limits of it. Weirdly enough, at one point it seemed to get totally confused and just listed Edinburgh Castle over and over again in one set of results.
 
What we should be teaching and learning is not facts but skills. Children should still be taught Maths and English, but also be taught how to write, critical thinking, interviewing techniques, finance, randomn sampling and probability. They should be taught to play a musical instrument, and how to draw, paint, and to model in clay.
Couldn't agree more!
 
I have been playing around with ChatGPT and there are ways to use the system to generate better results. For example, I decided to challenge it to list me all castles in the UK. GPT told me there were too many to list, so I asked it how many there were - it estimated approx 4000 depending on the criterion.

It clearly Googled it, because if you ask Google that's the exact number it immediately tells you :ROFLMAO:

It seems to me, for the questions you were asking it, that ChatGPT is, in a way, just parsing the results of a Google search more nicely than the standard Google output. Which, if ChatGPT actually is able to extract details in a nicer format, would make it a nice tool.

However, having seen it tackle some more complex issues other than data collation (for example, discussing a physics problem), although it starts out getting it right, it can get pretty confused and go off on a tangent that makes it contradict itself and end up with a wrong answer. Therefore I have to take everything it comes out with, with a pinch of salt.
 
What we should be teaching and learning is not facts but skills.
Yes, exactly. I once asked my history teacher why we were studying the French Revolution instead of the American one. He said that no matter what we studied, it would be a small fraction of available knowledge. But he was teaching us HOW to study history in the process. That's the important thing.
 
There is a point where there ought to be some sort of goal. I personally enjoy how technology has put a variety of factoids at my fingertips. But enjoyment isn't material betterment. And as we have seen, being able to communicate more easily has NOT brought us all closer together.

So when we are offered tools that replace normal lines of communication or education: what is the thing that will make humanity better? Will AI help people with scientific energy or health research problems? Or will it just help validate the beliefs of some skinhead?
 
It clearly Googled it, because if you ask Google that's the exact number it immediately tells you :ROFLMAO:

ChatGPT just googles everything :ROFLMAO:

Although I also caught my 14 year old daughter using it to do her homework...

No, I absolutely agree. I have asked it some technical questions around NTFS permission structures and it just parrots standard answers (often not entirely correct). I was just making the observation that, like any tool, it has its limitations, some of which are really the limitations of the user.
 
