Best way to read someone's mind?

JonH

It seems to me there are several possible approaches, and we're a long way off from all of them.

The first I thought of is functional neuroimaging, an extension of fMRI, PET, EEG, etc. At the moment you can only get a very crude idea of what's going on. Will this get substantially better, or is it a dead-end technique?

Second, people will agree to have implanted chips with a neural interface. Prof. Warwick has been implanting simple chips into his body for some years; he connected himself to the Internet over a decade ago, and predicted we'll all have chips in our brains within 30 years. Personally, I think those are the same ever-receding 30 years in which commercial fusion is always predicted to arrive.

Third, prediction by observation. If you know someone really well, you often know what they're thinking; you can finish each other's sentences. Will machines be able to read all of us this way? With the advent of increasingly good sensors to monitor body movement, and better voice recognition and parsing, will what we do and say give away our thoughts with any level of reliability?

Any other approaches come to mind? [Just think into the interface.]
 
There's another possibility, a variant of your second, the implanted chips: a nanoswarm with neural sensors as part of the package would produce quite a lot of raw data.

Interpreting the raw data is, of course, quite another problem. It's quite likely that the coding of the data is different from one person to another, for example. There is no biological equivalent of ASCII or the JPEG standard.
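
Just to put "quite a lot of raw data" into rough numbers, here's a back-of-envelope sketch in Python. Every figure in it is an assumption plucked out of the air (a million sensing nodes, 1 kHz sampling, 16-bit samples); nobody knows what a real nanoswarm's specs would be.

```python
# Back-of-envelope estimate of the raw data from a hypothetical neural nanoswarm.
# Every figure below is an illustrative assumption, not a real specification.

sensors = 1_000_000        # assumed number of sensing nodes in the swarm
sample_rate_hz = 1_000     # assumed samples per second per node
bits_per_sample = 16       # assumed resolution per sample

bits_per_second = sensors * sample_rate_hz * bits_per_sample
gigabytes_per_hour = bits_per_second * 3600 / 8 / 1e9

print(f"Raw stream:   {bits_per_second / 1e9:.0f} Gbit/s")
print(f"Over an hour: {gigabytes_per_hour:,.0f} GB")
```

Terabytes per hour under those assumptions, in other words, which makes the interpretation problem look even harder than the collection problem.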
 
The first I thought of is functional neuroimaging, an extension of fMRI, PET, EEG, etc. At the moment you can only get a very crude idea of what's going on. Will this get substantially better, or is it a dead-end technique?

PET is physically limited to a certain resolution (a few mm, depending on the isotope you use) simply because that's how far the positrons it uses can travel before they annihilate, so it's unlikely it would ever work for this.

MR can be used on much smaller scales, although using that practically for medical use seems some way off yet. This might also interest you:

BBC News - Scientists 'read dreams' using brain scans

Not perfect yet, but I was honestly surprised it was that good. They also point out the same issue: such a technique would likely require a lot of analysis of the subject beforehand (i.e. you would need to "calibrate" a subject's thoughts against scan results before you could read them). Nevertheless, I'd say this is the most plausible approach, in the near future at least. It won't be magic, but it will still be pretty amazing; well, it already is pretty amazing, really.
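
For anyone wondering what that "calibration" amounts to in practice, here's a toy sketch in Python (using scikit-learn), and I stress it's only a cartoon of the real analysis: random noise stands in for voxel features and the category names are invented.

```python
# Toy sketch of per-subject "calibration": fit a decoder on scans recorded
# while the subject views known stimuli, then use it to label a new scan
# from the same subject. Random noise stands in for voxel features, and the
# category names are invented; a real pipeline would be far more involved.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n_voxels = 500
categories = ["face", "house", "tool"]

# Calibration session: scans paired with what the subject was shown.
X_calibration = rng.normal(size=(90, n_voxels))
y_calibration = rng.choice(categories, size=90)

decoder = LogisticRegression(max_iter=1000)
decoder.fit(X_calibration, y_calibration)   # "calibrate" this one subject

# Later session: an unlabeled scan from the *same* subject.
new_scan = rng.normal(size=(1, n_voxels))
print("Decoded as:", decoder.predict(new_scan)[0])

# A decoder calibrated on one person generally won't transfer to another,
# which is exactly the per-person-coding problem mentioned earlier.
```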

I doubt implants will really catch on for a long time. The catch with brain implants is that implanting them is, quite simply, brain surgery: it's invasive, it carries significant risks, and it's far more than most people would accept unless it was medically necessary; even then it's often only a second-line approach after medication has failed. As I understand it, even Prof Warwick only had his implants put into his peripheral nerves, not his brain.
 
I think a few people might go for implants when they become reasonably easy, but if the technique is to be general, rather than just for, say, patients, the implants would have to be the rule, not the exception. I do "like" Mirannan's idea of nanobot interfaces, as that could happen easily, painlessly and, for a dystopian feel, even without consent.

I reckon audio and video sensors with interpretation have the smoothest path forward. It's non-invasive, the equipment is already everywhere, and it "just" needs all the data-analysis work doing.

I suspect the individual calibration is already happening too, with bots reading our data footprints, supermarkets analysing our individual purchases, advertisers reading our cookies, etc, etc.

I'm not looking forward to the day when I'm walking down the high street and the sensor in the advertising hoarding outside Tesco's decides my blood-sugar level is a little low because it sees me licking my lips slightly. So it checks the database, decides I prefer plain chocolate, and flashes up an advert for Green & Black's, because it knows my wife sometimes buys that in the weekly shop.

Other than using my gait to decide my blood-sugar is low, I think all of the rest could be done right now. How far from that to real mind-reading?
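
Sketched as a toy program (with every threshold, cue and database record invented for the purpose), the hoarding's decision chain really isn't much more than this:

```python
# Toy version of the hoarding's decision chain described above:
# behavioural cues -> crude inference -> purchase-history lookup -> ad choice.
# Every threshold, cue and database record here is invented for illustration.

purchase_history = {
    "shopper_123": {
        "preferred_chocolate": "plain",
        "brand_in_weekly_shop": "Green & Black's",
    },
}

def infer_low_blood_sugar(gait_steadiness: float, lip_licks_per_minute: int) -> bool:
    """Crude (and frankly dubious) inference from body cues; thresholds made up."""
    return gait_steadiness < 0.6 or lip_licks_per_minute > 3

def choose_advert(shopper_id: str, gait_steadiness: float, lip_licks_per_minute: int) -> str:
    if not infer_low_blood_sugar(gait_steadiness, lip_licks_per_minute):
        return "generic advert"
    profile = purchase_history.get(shopper_id)
    if profile is None:
        return "generic chocolate advert"
    return (f"advert for {profile['brand_in_weekly_shop']} "
            f"({profile['preferred_chocolate']} chocolate)")

print(choose_advert("shopper_123", gait_steadiness=0.5, lip_licks_per_minute=4))
```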
 
Thanks for the thumbs-up, JonH. I do believe that nanotech (if it proves practical to the extent that some think) is an immensely powerful technology that could make the world either close to a paradise or a dystopian hell that makes 1984 seem like a paradise by comparison. After all, in 1984 thoughtcrime had to be expressed in some way; what if the authorities could tell what you were thinking and take steps?

I also think that the concept of privacy is going to be radically changed in the next 30 years or so. I'm not really sure what Joe (or Jane!) Average is realistically going to be able to do about it.
 
I don't think we're at Minority Report just yet, but the practicalities of privacy have already been changed. If you pick your nose on the bus in what someone decides is an amusing way, you can end up on national television.

The Regulation of Investigatory Powers Act (2000) gave all branches of government, including local, the right to snoop on us electronically, without a warrant, for almost any old reason (the list of permitted reasons is broad, culminating in "anything else the Home Secretary decides"). Last time I looked there were over half a million warrantless interception orders a year. (These aren't the ones the security services use, which still tend to have a warrant.)

Unfortunately for the government, the people they let do it are too incompetent to search effectively or analyse the data they get. The Communications Data Bill, which got shelved because the Conservatives couldn't get it past the Lib Dems, included provision for a centre of experts who would do it for them, and also extended the warrantless snooping to physical mail.

If I leave my flat and turn right, there's a private surveillance camera thirty yards down the road, and public ones at the intersection twenty yards further. If I turn left I only have to go twenty yards before passing the first private camera (which is actually outside a surveillance/security shop).

It's getting so a gentleman cracksman, such as myself, can hardly rob houses in peace!
 
And what would they end up using it for? More personalised ads on our social media platforms and e-mail pages. Our... silly rulers' continued obsession with playing God. Personally, I've found a great way to read someone's mind: pay attention. Perhaps Sherlock Holmes is a fictional character, and perhaps his character is written with gifts, but he is very observant. It is amazing how much we give away with body language and habits. If you ask a proper question or two, you can find out everything you want to know. And then there are those who are so willing to tell you Every. Single. Little. Thing. For those, all you have to say is, hello...
 
It's getting so a gentleman cracksman, such as myself, can hardly rob houses in peace!

:D

I think it's creepy when the same personalized ad follows me from site to site.

Google Glass is going to revolutionize a lot of things, including eye-witness testimony. It will see and record everything that you see, watch, read and say. All day. Every day. Best indicator of future behaviour is past behaviour, so nothing will know you as well as your Google Glass will.
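
If you wanted to make "past behaviour predicts future behaviour" concrete, the crudest possible version is a first-order Markov model over the lifelog. A toy sketch, with an invented activity log standing in for what the Glass would record:

```python
# Crudest possible take on "past behaviour predicts future behaviour":
# a first-order Markov model over a lifelogged activity stream of the kind
# an always-on wearable could accumulate. The log below is invented.
from collections import Counter, defaultdict

activity_log = ["wake", "coffee", "email", "commute", "work",
                "coffee", "email", "work", "lunch", "work",
                "commute", "dinner", "tv", "sleep",
                "wake", "coffee", "email", "commute", "work"]

transitions = defaultdict(Counter)
for current, following in zip(activity_log, activity_log[1:]):
    transitions[current][following] += 1

def predict_next(activity: str) -> str:
    """Most frequent follow-on activity observed so far."""
    if activity not in transitions:
        return "unknown"
    return transitions[activity].most_common(1)[0][0]

print("After 'coffee', most likely:", predict_next("coffee"))   # -> email
```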
 
Maybe they will use the advances in processing power now in demand for cryptocurrency mining to process all that raw data they (oh, that omnipresent "they") are and have been collecting.

(P.S. Google Glass only records when you tell it to. Not that it couldn't be tapped, and the tap recorded, like any other camera we keep connected to an internet port...)
 
