Among the buzzy pet projects and aspirations you sometimes hear tech moguls like Elon Musk and Mark Zuckerberg talk about, outside of their companies' core missions, is the development of a dystopian-sounding direct link between computers and the human brain. The creepiness factor notwithstanding, they'll regale audiences and journalists with what a near-magical state that would be: no cumbersome obstacle (specifically, a phone) sitting between your mind and whatever action you're trying to take or thought you're trying to communicate. Especially given the massive Facebook data leak that was disclosed over the weekend, I'm not sure how many of us would want to give Zuckerberg a direct hook-up to our brains, but it's all nevertheless indicative of growing scientific interest in essentially "reading" more of people's brains, for a whole host of reasons.
The latest news from scientists at Caltech, for example, shows just how exciting the advancements in this area can be, and with the help of a familiar piece of technology, no less. A collaboration at Caltech has resulted in what the university calls a "minimally invasive BMI" (brain-machine interface) that promises the ability to "read" brain activity corresponding to movements the subject is planning. "Using functional ultrasound technology, it can accurately map brain activity from precise regions deep within the brain at a resolution of 100 micrometers," a Caltech announcement explains.
Sumner Norman, a postdoctoral fellow in the school's Andersen lab and co-first author on a study about this technique (published Monday in the journal Neuron), said in the announcement: "Functional ultrasound is an incredibly exciting new method to record detailed brain activity without damaging brain tissue. We pushed the limits of ultrasound neuroimaging, and were thrilled that it could predict movement. What's most exciting is that fUS is a young technique with huge potential — this is just our first step in bringing high performance, less invasive BMI to more people."
A collaboration at #Caltech has developed a new type of minimally invasive brain-machine interface using functional ultrasound technology to read out brain activity corresponding to the planning of movement. https://t.co/j6H3g17Kyl
— Caltech (@Caltech) March 23, 2021
The way this works: an ultrasound probe emits pulses of high-frequency sound, then measures how those sound waves echo back through something, like the human body. In the brain, the sound waves reflect off red blood cells, and, much like the siren of a passing ambulance, the pitch of those echoes rises as the cells move toward the source of the ultrasound waves and falls as they move away. Measuring that shift, as a consequence, lets researchers track changes in blood flow in the brain.
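That rising-and-falling pitch is the classic Doppler effect. As a rough sketch of the underlying relationship (not anything taken from the Caltech paper, and with made-up illustrative numbers for the probe frequency, shift, and beam angle), the standard Doppler ultrasound equation lets you back out blood velocity from the measured frequency shift:

```python
import math

# Doppler ultrasound relationship (illustrative sketch):
#   delta_f = 2 * f0 * v * cos(theta) / c
# Solving for v gives the red-blood-cell velocity from the echo's frequency shift.

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, a typical value for soft tissue

def blood_velocity(f0_hz: float, delta_f_hz: float, angle_deg: float) -> float:
    """Estimate red-blood-cell velocity (m/s) from a measured Doppler shift.

    f0_hz:      transmitted ultrasound frequency
    delta_f_hz: measured frequency shift (positive = cells moving toward probe)
    angle_deg:  angle between the ultrasound beam and the flow direction
    """
    return (delta_f_hz * SPEED_OF_SOUND_TISSUE) / (
        2.0 * f0_hz * math.cos(math.radians(angle_deg))
    )

# Hypothetical numbers: a 15 MHz probe, a 1 kHz echo shift, beam at 30 degrees
v = blood_velocity(15e6, 1000.0, 30.0)
print(f"{v * 100:.2f} cm/s")
```

A shift of just a kilohertz against a megahertz-range carrier corresponds to blood moving a few centimeters per second, which is why the technique is sensitive enough to pick up the small flow changes that accompany neural activity.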
Mikhail Shapiro, professor of chemical engineering and Heritage Medical Research Institute Investigator, who's also an affiliated faculty member with the Chen Institute, explained: "A key question in this work was — if we have a technique like functional ultrasound that gives us high-resolution images of the brain's blood flow dynamics in space and over time, is there enough information from that imaging to decode something useful about behavior?
"The answer is yes. This technique produced detailed images of the dynamics of neural signals in our target region that could not be seen with other non-invasive techniques like fMRI. We produced a level of detail approaching electrophysiology, but with a far less invasive procedure."