Brain Machine Interfaces - A farmer's definition

Thirty to fifty years from now, it is highly unlikely that you will read or listen to a post like this on your phone. You won’t find yesterday’s sports results by typing words into a Google search box, talk to your friends through your Apple AirPods, or stumble upon the newest meme on Twitter by scrolling through a feed filled with favourite buttons and retweet counts.

In the future, you will simply know the information you need the split second you want it.

Welcome to the world of Brain Machine Interfaces.

A farmer’s definition of Brain Machine Interfaces

You might have heard of Graphical User Interfaces (GUIs), Voice User Interfaces (VUIs) and perhaps even Command Line Interfaces (CLIs), but what is a Brain Machine Interface (also referred to as a Brain Computer Interface)?

I can’t remember when I Googled the question for the first time, but I do remember that my search results weren’t nearly as simple as my dad’s answer.

While we were on the farm a couple of months ago, I decided to ask him what a BMI is. I didn’t expect him to know, but he surprised me:

“Uh… it’s when your brain gets connected to the internet.”

[Illustration: living room scene with dad and me sitting on couches, asking him what a brain machine interface is]
A farmer’s definition of Brain Machine Interfaces

My dad’s answer stunned me for two reasons:

  1. Firstly, during all the deep and philosophical conversations my brother and I had about this seemingly nonsensical topic, my dad was quietly listening in the background, and he still managed to reduce it to a one-liner.
  2. Secondly, without even knowing it, he highlighted a major dichotomy: with BMIs, we are about to strip away the layers between the most open system we’ve ever created, the internet, and the most private one, the brain.

This second point is particularly important because regardless of the definition we give BMIs, they are ultimately paving the way for a shift in thought patterns, culture, relationships, identity and society as a whole.

But to understand why, we’ll have to shift gears for a moment and travel through the evolution of traditional interfaces.

The evolution of interfaces

The most familiar user interface today is the Graphical User Interface (GUI). It’s what you see while you are reading this post. In its simplest form, the GUI is a middleman between you and a machine, enabling you to talk to it and to others via pictures, shapes and text.

But GUIs haven’t always been around. Before them, we had only Batch and Command-Line Interfaces. The way you interact with them might be vastly different, but their definition is basically the same as a GUI’s: they all act as middlemen that allow machines and humans to talk to each other.

Now, if we travel back in time far enough, you’ll see why and how they came about.

Are you ready to time travel? Let’s go.

Once upon a time, 75 000 years ago…

…there was no internet, and language hadn’t been invented yet. Pause for a second and imagine that: no talking…

If you can’t quite imagine that, let me introduce you to Bak, the leader of his tribe.

[Illustration: Bak standing on a rock under a tree, introducing himself]
Say hello to Bak 👋

For most purposes, Bak is completely happy without words, but when he urgently needs food from his friend on the other side of the river, not having the words to ask for help becomes a problem. What does “ugh ugh, ah ah” mean anyway!?

[Illustration: river scene where Bak asks his friend on the other side of the river for help]
Bak says: “Ugh ugh, ah ah!” (Google translation: “I need food!”)

Because he hasn’t yet developed the words to ask his friend for help (and because Google Translate doesn’t exist yet), his best option is to swim over to the other side and get the food himself. The worst-case scenario is that Bak never makes it across. Scrumptious for the crocodile, but not so pleasant for Bak.

[Illustration: river scene with a crocodile chasing a man]
Aaaah! (Bak is in trouble)

Fortunately, something incredible happened: humans invented language. This not only extended Bak’s life, because he could now ask for food when he needed it, but also became the first scalable interface and the foundation of all the interfaces that followed, including GUIs, CLIs and Voice Interfaces.

From language to GUI land

Even though language helped save millions of lives, it still had two major problems:

  1. Air, the very first channel through which language had to travel, didn’t do well over vast distances. Bak could easily shout for help across a narrow river, but if he happened to be on one bank of the Amazon River, even his loudest voice wouldn’t carry across to reach his friend.
  2. Sharing Bak’s close-encounter crocodile lesson with friends helped more people stay alive. But this only worked until the third generation forgot about the crocodiles.

These two problems didn’t hold us back. Over the millennia that followed, we invented better and better ways to write our knowledge and lessons down, initially on rock tablets and later on paper. But as Moses and the postman would have testified, carrying these on foot and over rugged terrain required a lot of effort.

Hmm… Moses is wondering if there isn’t an easier way. 🤔

Fortunately, industrialisation came along and enabled us to travel further and across the big blue seas to share our ideas with those on the other side of the world.

We kept pushing ourselves beyond the edge of the ocean with incredible inventions from the scientific revolution: first the telegraph, followed by the telephone, and later the radio in 1902. These interfaces were low-bandwidth and painfully slow (and initially deemed diabolical), but most people loved the way they brought everyone closer.

Yet we needed a faster channel.

And then BOOM! Air 2.0 was created: the Internet, Earth’s first digital big bang moment. It allowed us to send information almost instantaneously to every corner of the Earth (even to the other side of the Amazon River). Paper, one of the most ancient interfaces, got a whole new inter-facelift.

But similar to language, our first digital interfaces had several limitations. The Command Line, for instance, had a steep learning curve and required memorising dozens of commands just to say a little.

So we created the GUI, an interface through which we communicate with each other and with machines primarily via pictures and shapes, carefully placed on screens of all types and sizes.

One of the primary reasons the GUI was so much more successful than CLIs is that it had a restricted interaction vocabulary with only three primitives: pointing, clicking and dragging.
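
To make that vocabulary concrete, here is a minimal sketch of the three primitives as event handlers, written with Python’s built-in tkinter. The handler names and the drawing logic are purely illustrative; the point is only that an entire GUI boils down to reacting to these three events.

```python
# The GUI's three interaction primitives, expressed as event handlers.
# Purely illustrative: the handlers just retitle the window and doodle.
import tkinter as tk

root = tk.Tk()
canvas = tk.Canvas(root, width=320, height=200, bg="white")
canvas.pack()

def on_point(event):
    # Pointing: the cursor moves over the canvas.
    root.title(f"pointing at ({event.x}, {event.y})")

def on_click(event):
    # Clicking: a button press selects a spot (here, it draws a dot).
    canvas.create_oval(event.x - 3, event.y - 3, event.x + 3, event.y + 3)

def on_drag(event):
    # Dragging: moving the cursor with the button held down.
    canvas.create_line(event.x, event.y, event.x + 1, event.y)

canvas.bind("<Motion>", on_point)
canvas.bind("<Button-1>", on_click)
canvas.bind("<B1-Motion>", on_drag)

root.mainloop()
```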

It was easier than the CLI, but it was still slow to move your cursor around to point and click on 40px buttons, or to type a very long message to a special friend with two thumbs.

And so a whole new breed of design jobs was born. Some call themselves UX designers; others, principal UI architects; and still others, design unicorns. Regardless of job titles, we’re all ultimately doing the same thing: finding the best possible way to ensure that the pictures, shapes and words used on a screen accurately translate the experiences, goals and mental models of the people who use our products.

Phew… no easy task at all.

Our hunger for something better drove us to create even more interfaces that would make our middlemen feel more natural. Nowadays it’s quite clear (or perhaps audible) that Voice Interfaces are the latest craze, appearing in more and more devices.

But, as Tim Urban pointed out in his Neuralink essay (thanks for the inspiration, Tim!), all of these interface types are merely “a bunch of brain communication hacks using different kinds of middlemen.” Or, “the era of indirect brain communication”.

If you haven’t fully grasped this, I’ll put it differently:

All interfaces up until now have been communication hacks to translate and share one human’s thoughts, experiences and impressions with another human or machine.

[Illustration: a brain indirectly connected to other brains via interfaces such as paper, a computer, smart glasses and a telegraph]
The era of brain communication hacks

This is an important fact. BMIs are nothing like their predecessors. Even though they contain the word “interface”, they are a different type of interface, in a class of their own, and they represent a whole new paradigm.

BMIs mark the era of direct brain-to-brain and brain-to-machine communication.

[Illustration: multiple brains directly connected to each other via a brain machine interface]
The era of direct brain-to-brain and brain-to-machine communication

On a technical level, BMIs connect brains and machines directly to each other, via the internet and without it, removing all the middlemen that previously added extra noise and latency to communication. If you read between the l̶i̶n̶e̶s̶ arrows, it becomes evident that in the future, the world might operate as one big brain. Like this.

[Illustration: one big brain]
In the future, the world will operate as one big brain

But because human brains are mostly private and the internet is mostly open, we have a dichotomy that makes BMIs so much more than internet devices.

It is for this reason that I believe BMIs are Earth’s second digital big bang moment, causing a shift in thought patterns, culture, relationships, identity and society as a whole.

We’ve made a giant leap from isolated brains to indirect brain communication, and we’re now making a colossal jump to an era of direct brain communication.

[Illustration: timeline with three brain communication eras, from isolated brains to indirect brain communication to direct brain communication]
The three eras of brain communication, as outlined by Tim Urban

What this will look like a couple of decades from today is anyone’s guess. The best we can do at the moment is look at the current technology and how it’s shaping us today.

BMIs made practical

Today, BMIs are predominantly used in the healthcare sector, with a couple of players focusing on gaming or sports performance.

You’ll notice from the examples below that the applications are still very niche and rudimentary. Nevertheless, this is the stuff paradigm shifts are made of and what inspires us to go beyond today.

1. Healthcare

BMIs in the healthcare sector focus primarily on helping people with disabilities such as paralysis, cerebral palsy, dementia, vision impairment and hearing impairment.

Here is one example of a man who controls a wheelchair using only his thoughts:

So what’s happening here? On a fundamental level, a hardware device reads the signals from his brain and sends them to a computer, where the messages get translated into a command for the wheelchair to move in a specific direction.
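
Purely as an illustration, here is roughly what that read → translate → command loop could look like in code. Everything below is an assumption on my part: the function names, the simulated signal and the two-band rule are stand-ins, not how any real wheelchair BMI is implemented.

```python
# A toy sketch of the BMI pipeline described above: read brain signals,
# translate them, then issue a wheelchair command. All names, thresholds
# and the simulated signal are hypothetical stand-ins, not a real device API.
import numpy as np

SAMPLE_RATE = 250  # samples per second, a typical consumer EEG rate
WINDOW_SIZE = 250  # one second of signal per decision

def read_brain_signals() -> np.ndarray:
    """Stand-in for a headset driver: returns one window of 'EEG' data."""
    return np.random.randn(WINDOW_SIZE)  # simulated noise for this sketch

def band_power(signal: np.ndarray, low_hz: float, high_hz: float) -> float:
    """Average spectral power of the signal between two frequencies."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / SAMPLE_RATE)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return float(power[band].mean())

def translate_to_command(signal: np.ndarray) -> str:
    """Toy 'classifier': compares two frequency bands and picks a command."""
    mu = band_power(signal, 8, 12)     # mu rhythm, linked to motor imagery
    beta = band_power(signal, 13, 30)  # beta band
    return "FORWARD" if mu > 2 * beta else "STOP"

if __name__ == "__main__":
    window = read_brain_signals()
    print(translate_to_command(window))  # the wheelchair would act on this
```

A real system streams data continuously, runs a classifier calibrated per user and wraps every command in safety checks, but the overall shape of the loop stays the same.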

It’s elementary, but it’s probably not hard to see how full-on bionic augmentation for the masses would be the ultimate dream come true.

Besides robotics, here are some startups focusing on other healthcare solutions:

2. Learning & Sports performance

A couple of BMI companies also claim that you can improve your learning and sports performance through what they call neural priming.

Halo Sport explains it like this:

[The device] works by applying a small electric current to the area of the brain that controls movement, putting it into a state of hyper learning.

The same principle applies whether you are studying for a history exam or training for the Olympics: you prepare your brain for faster learning by increasing its neural plasticity.

Even though it’s still in its infancy, this level of neural stimulation is just the very first step toward sending information directly to your brain.

I don’t think it’s too far-fetched to imagine devices that would ultimately stimulate senses such as smell and taste.

3. Gaming

BMI gaming devices enable gamers to control a game using only their thoughts. This bypasses the muscles and peripheral nerves, so a joystick no longer has to be hand-controlled.

As you can probably start to see, these devices not only speed up reaction time but also have the potential to become a significant component of immersive virtual reality experiences. It makes me wonder how many Black Mirror episodes, including the Striking Vipers one, will become a reality…

Redefining humanity

Even though these examples currently function as three mostly separate application areas, over time the lines between them will become less pronounced.

Eventually, our brains will be seamlessly integrated with the internet, and we’ll have bionic limbs, hold telepathic conversations and get paid for jobs in vivid virtual worlds.

Solving the technical problems to get there is the easy part.

The more challenging and important problem to solve is this: how do we redefine culture, relationships, identity and society in a world of BMIs so that it’s a better version of what we have today?

This is no easy feat, and the shadows of humanity’s past might not be very encouraging. But if we look beyond the grim media headlines and the biggest historical atrocities, it might become evident that the world is already better than it was 75 000 years ago.

Today, the world is in a strange space. It might feel as though we are back to where we were before we invented language (we’re all in our own ‘caves’, physically isolated from one another), yet it’s times like these that remind us how the interfaces we’ve created over the past few millennia have moved us closer to each other and allowed us to do remarkable things.

These interfaces help us see each other, ask for help and fight viruses that are scarier than crocodiles.

It is undeniable that bigger and more complex human problems will arise in the future, and even though we don’t know what they will look like, we do know that we’ll have to stand even closer to each other to ensure we get to the other side of the river.

As a tool, BMIs promise to help us do this, but as fellow humans and designers, only we can redefine humanity together.

Thank you to…

  • My great friend Jaco, who created these beautiful illustrations to help navigate the story of brain machine interfaces (and Bak, of course). You can compliment him on LinkedIn, Twitter, Facebook, or Instagram.
  • Tim Urban, the rock star from Wait But Why, whose work on Neuralink I’m indebted to. If you haven’t seen his work yet, you seriously need to check him out.
  • My brother, for the endless deep and philosophical conversations we had to help shape this story.
  • My dad, who inspired me to write this post. Thanks, dad!

• • •

Photo by NASA