What is BCI?
BCI stands for Brain-Computer Interface, an area of active, ongoing research. In this era of rapid technological development, accessibility is often taken for granted: developers make unwarranted assumptions about who their target audience is and what it can do.
Why is accessibility important?
The fields of Augmented, Virtual, and Mixed Reality are one example of technologies with tremendous potential for growth. However, application developers often assume that the user will be comfortable wearing a headset and using hand-held controllers to interact with the environment.
As an example, patients with Autism Spectrum Disorder (ASD) or Cerebral Palsy (CP) may be unable to benefit from such technologies: the headset can be uncomfortable to wear, and hand-held controllers demand muscle control they may not have. Hence, although such technologies have great potential for healthcare and therapeutic applications, the lack of accessible interactivity in the game environment leads developers to focus on the general populace, leaving the specialized-care population in the dark.
Where has BCI been used?
What can be done in such cases, you ask? Well, BCI is the answer. Using Electroencephalography (EEG) electrodes, it is possible to read inputs directly from the brain and use them to perform certain day-to-day tasks. Although this technology is in its infancy, some success has been achieved in certain use-cases.
One example is making boolean (Yes/No) decisions using left/right eye blinks. Blinks are among the easiest response signals to track, which makes them popular in BCI applications. But success has also been achieved at detecting signals for more advanced tasks, such as the intent to use motor functions of the body, or even tracking the emotional state of the user.
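To give a rough sense of why blinks are so easy to pick up: a blink produces a large, sharp deflection on frontal EEG channels that even plain amplitude thresholding can detect. The sketch below is a toy illustration on synthetic data — the threshold and refractory values are made up, and real systems use multiple channels (and more than one electrode to tell left from right blinks) plus far more robust artifact detection:

```python
import numpy as np

def detect_blinks(eeg, fs, threshold=100.0, refractory=0.3):
    """Flag blink-like artifacts in a single frontal EEG channel by
    amplitude thresholding. `eeg` is in microvolts, `fs` in Hz.
    Returns detected blink onset times in seconds; `refractory`
    suppresses double-counting one blink as several events."""
    onsets = []
    last = -np.inf
    for n, amplitude in enumerate(np.abs(eeg)):
        t = n / fs
        if amplitude > threshold and t - last >= refractory:
            onsets.append(t)
            last = t
    return onsets

# Synthetic 2-second trace: low-amplitude background noise plus two
# large deflections standing in for blinks at t = 0.5 s and 1.4 s.
fs = 256
rng = np.random.default_rng(0)
trace = rng.normal(0, 5, 2 * fs)
trace[int(0.5 * fs)] += 200
trace[int(1.4 * fs)] += 200
onsets = detect_blinks(trace, fs)
```

With the two injected spikes, `onsets` recovers both blink times to within a few milliseconds, which is all a Yes/No interface needs.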
The misconception about BCI
Before delving deeper into these examples, I would like to make one thing very clear. When BCI is introduced, most people picture some kind of dystopian future where electrodes are screwed into your skull. This is far from the truth: the examples we discussed earlier are examples of non-invasive BCI, not invasive BCI.
Ok, so lots of new technical terms here. I know I mentioned cool applications without explaining how BCI is possible in the first place, but that curiosity is what brought you this far into the article, isn't it? So I would like to take a step back here and explain how BCI is really possible.
What does my brain look like?
First, let’s have a brief biology lesson about the human brain. I promise it will be brief.
The regions of your brain can be divided based on multiple factors, but one common categorization is gray matter versus white matter.
Neurons, or brain cells, consist of two major parts: the cell body (with its dendrites) and the axon.
The cell bodies of neurons lie in the gray matter, a thin region right below the human skull.
The axons, on the other hand, go deep into the inner sections of your brain called the white matter. This is a much larger region of your brain compared to the gray matter.
The dendrites serve as the "inputs" to a neuron, the axon carries its "output", and the cell body is where the "processing" happens.
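This input-process-output flow is often summarized with the textbook leaky integrate-and-fire model: current arrives via the dendrites, the cell body integrates it as a membrane potential, and once a threshold is crossed a spike travels down the axon. Here is a minimal sketch — all constants are illustrative simplifications, not physiological values, and this is a teaching model, not how BCI devices model the brain:

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=0.02,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron. Input current (dendrites) is
    integrated into a membrane potential (cell body); when it crosses
    `v_thresh`, a spike is emitted (axon) and the potential resets.
    Returns the spike times in seconds."""
    v = v_rest
    spike_times = []
    for n, i_in in enumerate(input_current):
        # Leak toward resting potential, plus the incoming drive.
        v += (-(v - v_rest) + i_in) * (dt / tau)
        if v >= v_thresh:
            spike_times.append(n * dt)
            v = v_reset
    return spike_times

# A constant drive above threshold yields a regular spike train:
# one second of input sampled at dt = 1 ms.
drive = np.full(1000, 1.5)
spike_times = simulate_lif(drive)
```

The stronger the input drive, the faster the potential climbs and the higher the firing rate — the basic rate coding that EEG activity ultimately reflects in aggregate.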
Ok, so now that we have a general idea about the regions of our brain, let’s continue our discussion about BCI.
Ok, so how is BCI really possible?
BCI devices are divided into 3 major categories: Invasive BCI, partially-invasive BCI, and non-invasive BCI. Consider the term “invasive” to correspond to the question, “Is surgery required?”.
So it’s all about the placement of the electrodes, which from now on I will refer to as sensors.
The sensors can be placed in 3 areas in relation to your brain. So let’s discuss the 3 categories in some detail now with respect to the placement of these sensors.
Invasive BCI
- Here, the sensors are placed in the gray matter of your brain.
- Since this area lies beneath the human skull, surgery is required to place the sensors in this region.
- As you have rightly guessed, these are high-risk surgeries and can be fatal if not conducted with surgical precision. Elon Musk’s Neuralink is one of many companies focusing on this area.
- Despite the risks, sensors placed in such locations are the most effective at detecting noise-free brain signals, and hence such companies argue that the boons far exceed the banes.
- No commercialization of these products has happened yet and this field is mostly limited to clinical trials-based research work.
Partially-invasive BCI
- Here, the sensors are placed inside your skull but not within the gray matter.
- Although surgery is required for their placement, the operation is quicker, and because the sensors do not interact with the neurons directly, we call this approach partially-invasive.
- These surgeries are not as high risk. One example of sensors used in this category is electrocorticography (ECoG).
- Here too, the sensors get a better read on brain waves than non-invasive ones, but they are not as effective as those placed during invasive BCI.
- Some commercialization of such products has happened but not much success has been achieved at targeting the general populace.
Non-invasive BCI
- Here, the sensors are placed outside your skull, on your scalp.
- No surgery is required for their placement, hence no operating time is involved.
- An example of sensors used here is electroencephalography (EEG). These have the lowest-risk since no surgery is required.
- These sensors are the least effective at detecting and reading brain waves, since the skull and the scalp (including hair, if not removed) attenuate the signals and add noise.
- The commercialization of such products started only recently. Because of their limited effectiveness in research settings, they are mostly marketed as meditation products for the general populace. One such popular product is the Muse series of headbands.
Any success using BCI?
Now you may be wondering: did the applications we discussed earlier fall under invasive BCI? Fear not; you will be happy to know that all the use-cases we discussed fall under non-invasive BCI. So hopefully, that horrific picture you imagined earlier has disappeared from your mind.
Let’s discuss the other use-cases in some detail. Non-invasive BCI devices have succeeded in tracking limited motor-function signals from the human brain, and users have been able to control a Lego Mindstorms robot by simply “thinking” about moving it.
Wow, that’s cool! And that’s only the beginning.
The alpha, beta, gamma, theta, and delta brain waves can be detected using non-invasive BCI and used to track the mental and emotional states of the user. These states include light sleep, deep sleep, calmness, alertness, and anger.
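For a sense of how these bands are extracted in practice: each band is just a frequency range (commonly delta ≈ 0.5–4 Hz, theta 4–8 Hz, alpha 8–13 Hz, beta 13–30 Hz, gamma 30 Hz and up), and a simple feature is the average spectral power inside each range. The sketch below computes FFT-based band powers for a synthetic “relaxed” signal dominated by a 10 Hz alpha rhythm; real emotion tracking calibrates per user and feeds such features into trained classifiers rather than reading raw band powers directly:

```python
import numpy as np

# Canonical EEG frequency bands in Hz (boundaries vary slightly
# across the literature).
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg, fs):
    """Mean spectral power of `eeg` (sampled at `fs` Hz) in each band,
    computed from a plain FFT power spectrum."""
    freqs = np.fft.rfftfreq(eeg.size, d=1 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / eeg.size
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].mean())
            for name, (lo, hi) in BANDS.items()}

# Synthetic "relaxed" recording: 10 seconds of a strong 10 Hz (alpha)
# oscillation plus background noise.
fs = 256
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
signal = 20 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 2, t.size)
powers = band_powers(signal, fs)
```

Here the alpha band dominates, which a simple meditation app might map to “calm” — the kind of inference consumer headbands make from exactly these features.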
Wow! There is actually a way to know the emotional state of a user even if they maintain a poker face and nothing can be discerned from their expression!
Coming back to accessibility
So now, making use of BCI-based controllers, even users who are physically handicapped or lack fine motor control over their arms (as with cerebral palsy) can play interactive games by using their brains as the controller.
People with ASD (autism) often have trouble discerning or differentiating between their emotional states and can feel overwhelmed under certain environmental conditions. They can be aided by therapeutic games that help them recognize their current emotional state by reading it directly from the brain via BCI.
This is only the beginning for this new and emerging technology! There is huge scope for improvement here, and the rewards are plentiful in the form of numerous use-cases, especially ones focusing on accessibility.
Concluding this article…
I would like to end this article by saying that much more awareness about BCI, and about the neuroethics surrounding these applications, needs to be raised among the general populace. Still, the boons far exceed the banes, and this technology is worth discussing in our day-to-day conversations. So here’s to a brighter and more accessible future for the entire world!