A crowd of highly intelligent geeks edged Georgetown students out of their study spots in the Healey Family Student Center last weekend. Hoya Hacks, a full-scale hackathon, was in full motion. Students, planted across every imaginable table and seat, outnumbered typical weekend study crowds by at least four to one. Laptop screens glowed across the landscape at a similar scale, boasting dizzying windows of code or half-built apps instead of the usual government papers or Excel spreadsheets.
Of course, plenty of code flew around: It glued together applications, websites and hardware interfaces on the fly. More exciting, though, was the hardware made available for the competitors to use. "Hardware" loosely describes any physical piece of technology that is not just data floating around in the cloud. Circuit boards, laptops and digital beacons were available to all the hackers. Students could check out an Oculus Rift, a $300 top-of-the-market virtual reality headset, like a library book.
What really grabbed my attention, though, were the Muse headbands. Created by the Canadian company InteraXon, the Muse headband, which was originally designed and marketed as a meditation and relaxation tool, reads brain waves. If you're like me, you might have known that was possible with modern technology, but not in a wearable, debatably fashionable form. Electroencephalography is regularly used to read patient brain activity in hospitals. However, EEG readings usually require dozens of electrodes gelled all over the scalp. While the Muse headband is by no means the first wearable or consumer-friendly EEG technology, and still has not dipped below $250 on Amazon Prime, it was the first of its kind to make it into my grubby little hands.
As you may have heard at one time or another, neurons communicate by passing chemical and electrical signals in a lightning-speed game of telephone.
But these signals don’t fit our typical notion of electricity, like current in a wire. Instead, ions, which are atoms with overall negative or positive charges, move in and out of the neuron’s axon, the long, tail-like part of the nerve cell. The “go” signal from the previous neuron triggers the first section of the axon to open channels that shuttle ions across the membrane, leaving the outside of that section with an overall negative charge. This charge change triggers the next section of the axon to move ions in turn, again creating a negative charge on the outside of that section, and so on.
The traveling wave of extracellular negative charge makes up the electrical signal that carries the message toward the next neuron. However, the drastic spike in negative charge also affects charged matter nearby that is not itself part of the neuron chain. Since like charges repel, the negative charge outside the neuron pushes nearby electrons farther away. Our brains usually have thousands of neurons firing at once, so a lot of electrons are displaced when we think. Because electrons are part of all matter, including skulls and skin, this electron movement can be detected by electrodes stuck to the scalp, or by a Muse headband.
Since EEG technology simply catches patterns of chaotically careening electrons, the information obtained is not very specific. Devices such as functional MRI machines, which track patterns of increased blood flow, and therefore activity, in the brain, give more spatially detailed pictures of what the brain is doing. However, EEG equipment detects actual neural activity in real time and is much cheaper and more portable than an fMRI.
The Muse headband, which is not at the top of the EEG quality production line, cannot be expected to be more than the meditation tool it was designed to be. However, at the hackathon, James Pavur (SFS ’17) and Casey Knerr (SFS ’17) adapted the Muse headband for a new use.
Their plan was to create a typing interface controlled directly by brain activity picked up by the Muse. With nothing more than their thoughts, users would be able to select a letter or symbol from a grid of characters and type it behind the cursor. Unfortunately, Pavur and Knerr were only able to reliably distinguish the brain activity associated with blinking, and with only one type of signal available, users could not toggle through the grid to select their preferred letters. As a workaround, they set up flashing highlights that scrolled across the rows and down the columns of the grid over time. Once a flash highlighted the desired character, the user could blink to select it as the next letter of their text.
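The scanning idea can be sketched in a few lines of code. This is a minimal illustration in the spirit of their project, not their actual implementation: the real Muse signal processing is omitted, and the `blink_at_step` callback stands in for a hypothetical EEG blink detector. One blink locks in a row as the highlight cycles through rows; a second blink picks a column within that row.

```python
# Hypothetical scanning keyboard driven by a single "blink" signal.
# The grid of selectable characters the highlight sweeps over.
GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ .,!",
]

def select_character(blink_at_step):
    """Scan rows, then columns within the chosen row.

    `blink_at_step(step)` returns True when the user blinks at that
    scan step (a stand-in for a real EEG blink classifier).
    Two blinks pick one character: the first locks a row, the second
    a column within that row.
    """
    step = 0
    # Phase 1: highlight each row in turn until a blink locks it in.
    row = None
    while row is None:
        candidate = step % len(GRID)
        if blink_at_step(step):
            row = candidate
        step += 1
    # Phase 2: highlight each column of that row until the next blink.
    while True:
        candidate = step % len(GRID[row])
        if blink_at_step(step):
            return GRID[row][candidate]
        step += 1

# Simulated user: blinks at step 2 (locks row 2, "MNOPQR"),
# then at step 5 (column 5 of that row).
blinks = {2, 5}
print(select_character(lambda s: s in blinks))  # prints "R"
```

The appeal of this design is that it asks almost nothing of the signal: any single on/off event the headband can reliably detect, a blink here, is enough to type arbitrary text, just more slowly than a full cursor would allow.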
Although it was conceptually simple and based on a still largely experimental device with many quirks, the project walked away with first prize. Just because an idea is easy in theory, of course, doesn’t mean it’s easy to implement. Such a device would enable people who are unable to use their hands to communicate with their eyelids alone. Technologies like this have been invented in the past, but Pavur and Knerr’s project used relatively cheap hardware and was developed in under 36 hours.
When I tried on the headband to type, the device was unable to detect my blinks. Based on my inability to code anything more functional than a pocket calculator, I would not be surprised if my brain was too weak to give off adequate signals! Nevertheless, Pavur assured me that it was probably just my hair in the way. Students like Pavur and Knerr were creating the future right here at Georgetown, and I could not even hope to interpret those impressive pools of code on their laptop screens.
Patrick Soltis is a sophomore in the College. INNOVATION SMACK TALK appears every Friday.