May 30, 2008
WASHINGTON - A COMPUTER has been trained to 'read' people's minds by looking at scans of their brains as they thought about specific words, researchers said on Thursday.
They hope their study, published in the journal Science, might lead to better understanding of how and where the brain stores information.
This might lead to better treatments for language disorders and learning disabilities, said Mr Tom Mitchell of the Machine Learning Department at Carnegie Mellon University in Pittsburgh, who helped lead the study.
'The question we are trying to get at is one people have been thinking about for centuries, which is: How does the brain organize knowledge?' Mr Mitchell said in a telephone interview.
'It is only in the last 10 or 15 years that we have this way that we can study this question.'
Mr Mitchell's team used functional magnetic resonance imaging, a type of brain scan that can see real-time brain activity.
They calibrated the computer by having nine student volunteers think of 60 different words while imaging their brain activity.
'We gave instructions to people where we would tell them, 'We are going to show you words and we would like you, when you see this word, to think about its properties',' Mr Mitchell said.
They imaged each of the nine people thinking about the 60 different words, to create a kind of 'average' image of each word.
'If I show you the brain images for two words, the main thing you notice is that they look pretty much alike. If you look at them for a while you might see subtle differences,' Mr Mitchell said.
'We have the program calculate the mean brain activity over all of the words that somebody has looked at. That gives us the average when somebody thinks about a word, and then we subtract that average out from all those images,' Mr Mitchell added.
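[comment: A rough sketch in Python of the averaging and mean-subtraction step Mr Mitchell describes. The array sizes and variable names are my own illustration, not figures from the study.]

    import numpy as np

    # Hypothetical data for one volunteer: one fMRI image per word, each
    # flattened to a vector of voxel activations (rows = words, columns = voxels).
    # The counts below are illustrative, not taken from the paper.
    rng = np.random.default_rng(0)
    word_images = rng.random((60, 20000))

    # Mean brain activity over all of the words the volunteer looked at ...
    mean_image = word_images.mean(axis=0)

    # ... subtracted out of every image, leaving only the word-specific
    # deviations that distinguish one concept from another.
    centered_images = word_images - mean_image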
Then the test came.
'After we train on the other 58 words, we can say 'Here are two new words you have not seen, celery and airplane'.' The computer was asked to choose which brain image corresponded with which word.
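[comment: Roughly, that choice can be made by having the model produce a predicted activation image for each of the two held-out words and then assigning the two observed images to whichever pairing matches better. The sketch below uses cosine similarity as the matching score; that choice, and all of the names in it, are my own illustration rather than the paper's exact procedure.]

    import numpy as np

    def cosine(a, b):
        # Cosine similarity between two flattened voxel vectors.
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def match_two_words(pred_a, pred_b, obs_1, obs_2):
        """Assign two observed brain images to the two candidate words
        by keeping whichever pairing scores higher overall."""
        straight = cosine(pred_a, obs_1) + cosine(pred_b, obs_2)
        swapped = cosine(pred_a, obs_2) + cosine(pred_b, obs_1)
        if straight >= swapped:
            return {"word_a": "image 1", "word_b": "image 2"}
        return {"word_a": "image 2", "word_b": "image 1"}

    # e.g. with predicted images for 'celery' and 'airplane' and two observed
    # scans (all voxel vectors of equal length):
    # match_two_words(pred_celery, pred_airplane, scan_1, scan_2)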
The computer passed the test, correctly identifying which brain image had been recorded while the person thought about 'celery' and which while the person thought about 'airplane'. The next step is to study brain activity for phrases.
'If I say 'rabbit' or 'fast rabbit' or 'cuddly rabbit', those are very different ideas,' Mr Mitchell said. 'I want to basically use that as a kind of scaffolding for studying language processing in the brain.'
Mr Mitchell was surprised at how similar brain activity was among the nine volunteers, although the work was painstaking.
For an MRI scan to work well, the volunteer must lie very still for several minutes.
'It can be hard to focus,' Mr Mitchell said. 'Somewhere in the middle of that their stomach growls. And all of a sudden they think, 'I'm hungry - oops.' It's not a controllable experiment.' -- REUTERS
[comment: If this works, it may be a better basis for a universal translator.]