Neurotech: Brainwave Joysticks & Automated Mindfulness
By Sean Bruce, Owner & Operator of GameplanVFX
Developer-ready technologies are available to healthcare innovators
Neuroscience is about understanding how the mind interacts with the brain. The patterns our minds make with a palette of electrons on a canvas of intertwining tissue. The pulse of our emotions. The tremors of our thoughts.
Electroencephalography (EEG) readings are captured by placing sensors on specific regions of the scalp and connecting them to a computer. The output records neural oscillations, more commonly known as “brainwaves”.
Brainwaves are categorized by their frequency in hertz (Hz) and linked to various brain functions like sleep, concentration, and anxiety. Mu waves, for example, range from 7.5 Hz to 12.5 Hz and are associated with the part of the brain that controls voluntary movement, making them ideal for Brain-Computer Interfaces (BCI).
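To make the band categories concrete, here is a minimal sketch that maps a frequency in Hz to its conventional band name. The exact cutoffs vary between sources; the values below are common textbook conventions, not a clinical standard.

```python
# Illustrative sketch: conventional EEG band boundaries (approximate;
# definitions differ slightly between sources).

def classify_band(freq_hz: float) -> str:
    """Return the conventional brainwave band for a frequency in Hz."""
    if freq_hz < 4:
        return "delta"      # deep sleep
    if freq_hz < 8:
        return "theta"      # drowsiness, deep meditation
    if freq_hz < 13:
        return "alpha/mu"   # relaxed wakefulness; mu over the motor cortex
    if freq_hz < 30:
        return "beta"       # active concentration
    return "gamma"          # high-level cognitive processing

print(classify_band(10.0))  # a 10 Hz oscillation falls in the alpha/mu range
```

Note that the mu rhythm mentioned above (7.5–12.5 Hz) overlaps the alpha range; it is distinguished by where on the scalp it is recorded, not by frequency alone.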
In 2012, developers Anton Lodder, Sartaj Dua, and Zeeshan Razvi used mental commands to pilot an EEG-controlled wheelchair at McMaster University.
Before the wheelchair can be operated, the user must train the software to record and detect the commands issued by the brain. When the user thinks “left”, the software takes a snapshot of their brainwaves, like a thumb pressed in ink onto paper. That imprinted data is then compared against the live EEG feed. When the computer detects a matching neural oscillation pattern, the wheelchair becomes a telepath, able to move in the direction the rider is thinking.
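The train-then-match loop described above can be sketched as a simple template-matching routine. The McMaster team's actual implementation is not documented here; the template dictionary, window length, and 0.8 threshold below are all illustrative assumptions.

```python
import numpy as np

def normalized_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two equal-length EEG windows (z-normalized dot product)."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.dot(a, b) / len(a))

def detect_command(live_window, templates, threshold=0.8):
    """Compare the live EEG window against each trained snapshot;
    return the best-matching command if it clears the threshold, else None."""
    scores = {cmd: normalized_correlation(live_window, t)
              for cmd, t in templates.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

# Training phase: snapshot captured while the user thinks "left"
rng = np.random.default_rng(0)
left_template = rng.standard_normal(256)
templates = {"left": left_template}

# A live window closely resembling the training imprint triggers the command
live = left_template + 0.1 * rng.standard_normal(256)
print(detect_command(live, templates))  # prints "left"
```

Real BCI pipelines use richer features than raw correlation (band power, spatial filters, trained classifiers), but the principle — record an imprint, then watch the live feed for a match — is the same.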
“By learning to self-regulate EEG signals, participants aim at changing their brain patterns in response to feedbacks, so that voluntary manipulation of brain activity can be used to control external devices such as computers, switches, or wheelchairs.”
(Shuet Ying (Sofina) Chan, “Neurofeedback: Challenges, Applications, and Opportunities for Education”)
The technology provides a new way to interact with machines and devices. For those with disabilities, it may offer more independence by facilitating access to smart home functions, self-driving vehicles, or the keys on a piano. It’s a world of possibilities.
However, differentiating a complex mental command from the static soup of one’s entire thought ecosystem can be challenging. Inputting the Konami code is vastly different from having Siri or Alexa regulate the temperature in a house.
Harder still is decoding the human condition and deciphering our emotions. Brain-computer interfaces are able to move matter, but what happens when we are moved? How can we convey our feelings when there are no words to describe them?
How to get a blackbelt in “chill”
Montreal-based Beam Me Up (BMU) Labs is using technology to help us keep calm and carry on.
BMU has developed “The Virtual Sophrologist,” a virtual reality experience that combines fantastic worlds with the mindfulness methodologies of Sophrology in a series of sessions that help track and alleviate stress. During these sessions, participants’ brainwaves are monitored and the data is used to determine an anxiety score, a depression score, and the amount of time it takes to go from pulling your hair out to “serenity now.”
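How might software turn brainwaves into a calm-versus-stressed score? BMU Labs' actual scoring is not described in detail, but a common neurofeedback proxy for relaxation is the ratio of alpha-band power (8–13 Hz, prominent when calm) to beta-band power (13–30 Hz, prominent when alert or anxious). The sketch below computes that ratio with a plain FFT; the band edges and the "calm" threshold are illustrative, not BMU's formula.

```python
import numpy as np

def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Total spectral power of `signal` between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return float(spectrum[mask].sum())

def relaxation_index(signal: np.ndarray, fs: float) -> float:
    """Alpha/beta power ratio; higher suggests a calmer state."""
    alpha = band_power(signal, fs, 8, 13)
    beta = band_power(signal, fs, 13, 30)
    return alpha / (beta + 1e-12)

# A synthetic "calm" signal dominated by a 10 Hz alpha rhythm
fs = 256
t = np.arange(fs) / fs
calm = np.sin(2 * np.pi * 10 * t) + 0.2 * np.sin(2 * np.pi * 20 * t)
print(relaxation_index(calm, fs) > 1.0)  # prints True: alpha dominates
```

Tracking this index over a session would also give the third metric the article mentions: how long it takes the reading to climb from "pulling your hair out" to "serenity now."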
“The results demonstrate that our system, Virtual Sophrologist, can help users’ effort to relieve stress, anxiety, even pain and maladies.”
(Guoxin Gu and Claude Frasson, “Virtual Sophrologist: A Virtual Reality Neurofeedback Relaxation Training System”)
Sophrology is a school of thought that pairs meditation, influenced by the yogis of India, with a graduated system of levels guiding one to understand the connection between one’s being and one’s body. It was founded in 1960 by the late Alfonso Caycedo, a Colombian professor who specialized in psychiatry and neurology.
Caycedo’s teachings are continued by his daughter, Natalia Caycedo, president of Sofrocay, the Académie Internationale de Sophrologie Caycédienne.
“Être & Mieux Être” or “Be & Well-being” is their slogan and BMU Labs is implementing VR to help you do just that with a tool to master your mind and emotions.
Evoked Potential and Reactive Meditative Environments
Evoked potential as defined on Wikipedia is:
“an electrical potential recorded from the nervous system of a human or other animal following presentation of a stimulus.”
With the arrival of next generation VR head-mounted displays, increased resolution, mobility, and an expanding developer community are thinning the line between the real and the unreal. The tools to create an environment and user experience specific to an area of study are readily available.
Unity, once considered game development software, is capable of deploying nearly any kind of digital experience to popular VR hardware like Oculus and Vive. From a mental healthcare perspective, VR is an audio-visual stimuli machine.
“It expands the bounds of possible evoked potential (EP) experiments by providing complex, dynamic environments in order to study cognition without sacrificing environmental control. VR also serves as a safe dynamic testbed for brain-computer interface (BCI) research.”
(Jessica D. Bayliss and Dana H. Ballard (2000), “Recognizing Evoked Potentials in a Virtual Environment”)
An immersive VR experience can remove you from your surroundings, from this very dimension, and place you in one of tall maples and whispering pines (#shinrinyoku).
How so? With terraforming at the touch of a button and content creation software that uses math to make land, sea, and even ozone.
What if artificial intelligence could detect your anxiety and recalibrate the world around you? Procedurally create a happy place? A stress-free simulacrum? Biomes from your biorhythms? Would that place resemble one of your memories?
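A speculative sketch of "biomes from your biorhythms": take a normalized stress score (0 = calm, 1 = anxious) and map it to parameters a procedural world generator might consume. The parameter names and mappings below are invented for illustration, not drawn from any shipping product.

```python
def environment_params(stress: float) -> dict:
    """Interpolate scene parameters so a calmer reading yields a
    warmer, stiller, denser forest. All mappings are illustrative."""
    stress = min(max(stress, 0.0), 1.0)                 # clamp to [0, 1]
    calm = 1.0 - stress
    return {
        "foliage_density": round(0.2 + 0.8 * calm, 2),  # more trees when calm
        "wind_speed": round(10.0 * stress, 2),          # stiller air when calm
        "color_warmth": round(calm, 2),                 # warmer palette when calm
        "ambient_volume": round(0.3 + 0.4 * calm, 2),   # gentle forest sounds
    }

print(environment_params(0.1))  # a low-stress reading: dense, warm, nearly still
```

Fed with a live relaxation score from an EEG headset, a loop like this could nudge the virtual world toward calm in real time, closing the neurofeedback circuit the article describes.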
Neurotech is available to developers and healthcare innovators via commercial products such as Emotiv’s Epoc+ headset and Neurable’s devices, both of which include Unity support.