Vanderbilt: Ultrasound helmet would make live images, brain-machine interface possible

Ultrasound technology for the brain could deliver real-time images during surgery, a clearer picture of which areas are stimulated by particular feelings or actions and, ultimately, an effective way for people to control software and robotics just by thinking, according to a Vanderbilt University article.
Medical doctors and scientists have spent decades hoping for such an advance, but it was impossible before now, said Brett Byram, assistant professor of biomedical engineering.

Ultrasound beams bounce around inside the skull, so no useful imagery can make it out, Byram, a BMES member, said in the article.

With his new $550,000 National Science Foundation Faculty Early Career Development grant, Byram plans to use machine learning that will gradually be able to account for distortion and deliver workable images, according to the article. Byram wants to integrate electroencephalogram technology so doctors could see not only brain perfusion—how blood flow correlates to changes in thought—but also areas of stimulation related to movement and emotion.

“The goal is to create a brain-machine interface using an ultrasound helmet and EEG,” Byram said in the article. “A lot of the technology we're using now wasn't available when people were working on this 20 or 30 years ago. Deep neural networks and machine learning have become popular, and our group is the first to show how you can use those for ultrasound beamforming.”
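The conventional technique that Byram's group augments with neural networks is delay-and-sum beamforming: each array element's received signal is time-shifted to align echoes from a chosen focal point, then the aligned signals are summed so echoes from that point add coherently while clutter cancels. The following is a minimal illustrative sketch of that baseline idea, not the group's method; the array geometry, pulse model, and all parameter values are assumptions chosen for the toy example.

```python
import math

C = 1540.0      # assumed speed of sound in tissue, m/s
FS = 40e6       # sampling rate, Hz (assumed)
N_ELEM = 16     # number of transducer elements (assumed)
PITCH = 3e-4    # element spacing, m (assumed)

# Element x-positions, centered on the array
elems = [(i - (N_ELEM - 1) / 2) * PITCH for i in range(N_ELEM)]

def pulse(t):
    """Gaussian-windowed sinusoid standing in for a received echo."""
    f0, sigma = 5e6, 0.2e-6
    return math.exp(-(t / sigma) ** 2) * math.cos(2 * math.pi * f0 * t)

def echo_time(ex, px, pz):
    """Two-way travel time: array center to point (px, pz), back to element at ex."""
    tx = math.hypot(px, pz) / C
    rx = math.hypot(px - ex, pz) / C
    return tx + rx

# Synthesize per-channel data for a single scatterer 20 mm deep on-axis
scat_x, scat_z = 0.0, 0.02
n_samp = 2400
channels = [[pulse(s / FS - echo_time(ex, scat_x, scat_z)) for s in range(n_samp)]
            for ex in elems]

def das(px, pz):
    """Delay-and-sum: align each channel to the echo time for (px, pz), then sum."""
    total = 0.0
    for ex, ch in zip(elems, channels):
        s = echo_time(ex, px, pz) * FS          # fractional sample index
        i = int(s)
        if 0 <= i < n_samp - 1:
            frac = s - i
            total += ch[i] * (1 - frac) + ch[i + 1] * frac  # linear interpolation
    return total

# Output is large at the true scatterer location, small a few mm away,
# because only the correct delays make the channels sum coherently.
on_target = das(0.0, 0.02)
off_target = das(0.004, 0.02)
```

A learned beamformer, as described in the article, replaces or refines the fixed delay-and-sum step with a trained network, which is what lets it gradually compensate for the skull-induced distortion that breaks the simple geometric delay model above.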
