It’s not possible to know exactly what another person is thinking, but neuroscientists from UCSD and UCSF are on their way. They have created “glass brain” software that shows a person’s brain reacting to stimuli in real time.
The implications for virtual reality and digital communication are tremendous, according to Philip Rosedale, the founder of Second Life, who has been collaborating with the neuroscientists. “We’re trying to identify which critical factors can most help people feel like they’re face to face,” says Rosedale, whose new company, High Fidelity, is currently working on a next-generation virtual world.
The neuroscientists used an MRI to scan the brain of Rosedale’s wife, Yvette. Then, for the recent SXSW demo, they fitted her with a cap covered in electroencephalogram (EEG) electrodes, which record brain activity. Sitting beside Yvette, Rosedale donned an Oculus Rift headset, which allowed him to see a 3-D picture of her brain activity. “In the middle of the presentation, somebody said ‘tickle her.’ I wouldn’t dare; she’d kill me. But I put my hand on her side and squeezed, and you could see the motor cortex activity lighting up all of a sudden,” he says.
This moment was particularly powerful for Rosedale, because it neurologically demonstrated the physical intimacy between himself and his wife. For him, that’s exactly why the glass brain software could be so useful. “There’s this theory of the brain that says we’re all kind of dancing together,” he says. “When I talk and you nod, you’re following the rhythm of my voice and guessing when my sentences will end. That’s something that we may be able to see with the EEG.” Rosedale says cell phone communication is “so terrible” because the delay, however small, disrupts this human interplay. “On the phone, you often can’t make a response sound, like ‘mm-hm,’ close enough to the end of my sentence for me to feel it,” he says, “so one of the emotional elements of communication is lost.”
The glass brain software could similarly help improve the facial expressions and physical reactions of virtual reality avatars. “It will let us put a number on the quality of virtual communication and then compare that number to what happens face-to-face,” he explains. By hooking more subjects up to the glass brain and having them interact, “we can hopefully find out why video conferencing just doesn’t seem to work.”
Even if we will never know exactly what other people are thinking, Rosedale believes that watching a brain’s real-time response can lead to greater honesty. Right now, people won’t put on EEG caps for business meetings, but he says that in the future, technology will make brain activity transparent to everyone at the table. As evidence, he points to an iPhone app that can measure heart rate based on how the skin flushes. “What if I could show, based on what’s happening in your brain, that you wanted to interact with me in an intimate manner?” he asks. “You can lie on a phone call, but it will be harder to lie in virtual reality because of how your body and brain will be moving.”

Fast Company