Scientists Can Now Officially Read Your Mind, UCLA Declares
Not that it takes a rocket scientist to figure out what men are thinking about 90 percent of the time.
But still, don't you find UCLA's revelation just a little scary?
The school's Laboratory of Integrative Neuroimaging Technology is boasting this week that it has pioneered "brain reading."
Yeah, they can read your mind.
Research presented at a neuroimaging workshop in Spain this month outlines the school's ability to predict ... what you're thinking.
This is so spooky.
Luckily, the work by lead author Ariana Anderson, postdoctoral fellow in the Integrative Neuroimaging Technology lab at UCLA, is very focused. It doesn't seem like they can randomly submit you to an MRI and figure out your dirty kinks just yet.
But what they did do is pretty amazing. After exposing smokers to videos designed either to induce cravings or to present a neutral state regarding cigarette use, researchers analyzed the subjects' MRI brain scan data. UCLA:
... Machine learning algorithms were able to anticipate changes in subjects' underlying neurocognitive structure, predicting with a high degree of accuracy (90 percent for some of the models tested) what they were watching and, as far as cravings were concerned, how they were reacting to what they viewed.
The process was compared to Google's predictive search capability, when the site guesses what you're going to search for even before you finish typing:
In essence, the algorithm was able to complete or "predict" the subjects' mental states and thought processes in much the same way that Internet search engines or texting programs on cell phones anticipate and complete a sentence or request before the user is finished typing.
Essentially, we were predicting and detecting what kind of videos people were watching and whether they were resisting their cravings.
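For the curious, the general technique here is a standard machine-learning classification problem: train a model on labeled brain-scan features, then predict the label ("craving" vs. "neutral" video) for scans it hasn't seen. The following is a minimal, hypothetical sketch using scikit-learn on synthetic data standing in for fMRI voxel features; it is not UCLA's actual pipeline, and all names and numbers here are illustrative.

```python
# Hypothetical sketch of brain-state classification, NOT the UCLA pipeline.
# Synthetic feature vectors stand in for real brain-scan data: "neutral"
# scans cluster around 0, "craving" scans are shifted slightly higher.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_scans, n_features = 200, 50  # illustrative scan counts / voxel features
neutral = rng.normal(0.0, 1.0, (n_scans, n_features))
craving = rng.normal(0.8, 1.0, (n_scans, n_features))

X = np.vstack([neutral, craving])
y = np.array([0] * n_scans + [1] * n_scans)  # 0 = neutral, 1 = craving

# Hold out a quarter of the scans to test on, just like evaluating
# whether the model can "read" a brain state it hasn't seen before.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

On cleanly separated synthetic data like this, accuracy comes out very high; real fMRI data is far noisier, which is what makes the reported 90 percent figure notable.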
Um. This doesn't apply to any kind of video, does it?
So what are we thinking now?
Yeah -- you're not so smart without those algorithms, are you?