Can a Digital Reality Be Jacked Directly Into Your Brain?

A young man in a gray flannel robe sits calmly at a table in front of a featureless black box. He is wearing a cap that looks like it is made of gauze strips. A bundle of wires comes out of the back of his head. He is waiting for something.

A researcher in a white lab coat walks up to the table and stands silently for a moment. The man looks at the box. Nothing happens at first. Then the man blinks and flushes slightly. The researcher asks what happened.


“Just for the first second,” he says, “I saw an eye—an eye and a mouth.”

The researcher swaps the box for a different object. This time it’s an orange soccer ball. There is another pulse, and again something has clearly happened inside the man’s head. “How do I explain it?” he says. “Like before, I see an eye—an eye and a mouth, sideways.”

The man, you see, is a cyborg. His fusiform gyri, curved ridges that run along the lower part of the brain on each side, are studded with electrodes. His doctors implanted them because they hoped they would help pinpoint the cause of the man’s seizures. But the electrodes also offer a rare opportunity: not only to read signals from the brain, but to write them in. A team of neuroscientists led by MIT’s Nancy Kanwisher is investigating the so-called fusiform face area, which activates when a person sees a face. Their question: What if they reverse the flow and deliberately activate the area? What will the man see?

You don’t need to be a cyborg to know that you should never trust your lying mind. It hides from you, for example, the fact that all your perceptions are delayed. Converting photons into vision, air-pressure fluctuations into sound, aerosolized molecules into odors: all of it takes time, as your imperfect sensory organs receive signals, translate them into the brain’s language, and send them across a bushy network of nerve cells that crunch the incoming data. The process isn’t instantaneous, but you never notice the synaptic zaps, the electrochemical fizz your brain makes along the way. The truth is, your experience is a theater, and you are both the director and the audience.

You see, or think you perceive, things that aren’t “really there” all the time, things that exist nowhere but in your head. That is what dreams are. That’s what psychedelic drugs do. That’s what happens when you imagine your aunt’s face, the smell of your first car, the taste of a strawberry.

From this point of view, planting a sensory experience, a perception, in someone’s mind is not really that hard. I did it to you in the first few paragraphs of this story. I described how the cyborg was dressed, gave you a sense of what the room looked like, told you the soccer ball was orange. You saw it, or some version of it, in your mind’s eye. You heard, in your mind’s ear, the research subject talking to the scientists (though in real life the conversation was in Japanese). That’s all well and literary. But it would be better to take a more direct route. The brain is the salty glop that turns sensory information into mind; if you could harness that ability directly, you could build a whole world in there, a simulation indistinguishable from reality.

The Kanwisher experiment didn’t do that, not by a long shot. But it does suggest the potential, the power, of jacking directly into the brain. When you watch video of the tests, the most remarkable thing is how mild the man’s reactions are. When the scientists hit the juice, he feels no pain. The box with the eyes doesn’t scare or unsettle him; in fact, he seems more surprised when it vanishes. He may not experience what he sees as fully real. (At one point, Kanwisher told me, the volunteer asked, “Am I just imagining things?”) But there is something real about it. The spritz of electrical impulses in his fusiform gyri has shown him not just a face; it has injected into his mind the ineffable feeling of a face.

The idea of uploading a synthetic experience into the mind has been a load-bearing member of science fiction for at least 75 years. The Matrix, sure, but also the devices of many of Philip K. Dick’s stories, cyberspace, the Metaverse, the 1983 film Brainstorm, the (underrated) superconducting quantum interference device in the 1995 movie Strange Days. But in real life (that’s what this is, right?), we’re a long way from a data port in the back of every neck. Neuroscientists can decode the signals coming out of brains well enough to move a cursor or a robotic arm, though they can’t yet match the fluid elegance of biological connections. Sending signals in is even more complicated.



Neurosurgeons are pretty good at placing electrodes. The problem is knowing where to put them in all those occult thickets of nerves. A small clump of cells may handle some portion of a given task, but clumps talk to each other, and it is the forming and re-forming of these networks that powers cognition. If you’re trying to trick the brain into interpreting a manufactured input as reality, you have to understand what individual neurons do, what large gobs of many neurons do, and how they all relate to one another.

It can be frustratingly specific. Sixteen years ago, Christof Koch, chief scientist at the Allen Institute for Brain Science, helped run a famous study showing that individual neurons in a part of the brain called the medial temporal lobe fire in response to particular people, places, or things, like entries in a lexicon. One neuron, for example, fired when the person saw pictures of the actress Halle Berry. Another fired for various images of the actress Jennifer Aniston (but not for photos of her with Brad Pitt). “Neurons are the atoms of perception,” Koch says. “For a technology like The Matrix, you have to understand the trigger feature of each individual neuron, and a piece of brain the size of a grain of rice contains 50,000 to 100,000 neurons.” Without that catalog, you might be able to make someone “see a flash of light or motion,” but they’ll never “see Father Christmas.”

Well, flashes of light are a start. You can do a lot with flashes of light. In a lab at the Netherlands Institute for Neuroscience, Pieter Roelfsema and his team are using them to teach monkeys to read. Not literature, but enough to tell the letters of the alphabet apart. The researchers do it by stimulating a region called V1, part of the visual cortex, a patch of neurons at the back of every primate’s head. Send current through an electrode in V1 and the monkey will see a dot of light floating in space. Turn on the electrode next door, and a second dot appears beside the first. These are phosphenes, the phantom lights you see after a blow to the head, the little birds that fly around Wile E. Coyote after he hits a wall. (The illusions the Japanese patient saw were officially dubbed “facephenes.”)

Insert an array of electrodes into V1, Roelfsema says, and “you can use it like a matrix board. If you have 1,000 electrodes, you basically have 1,000 light bulbs that you can switch on in visual space.” The team could stimulate electrodes in the shape of an A or a B, and the monkeys could indicate that they noticed a difference.
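Roelfsema’s “matrix board” metaphor is easy to sketch in code. The toy model below assumes a 5×5 electrode grid in which each electrode, when driven, evokes one phosphene; a letter is then just the set of electrodes switched on at once. The letter patterns, grid size, and function names are invented for illustration, not taken from the actual experiment:

```python
# Toy model of a V1 electrode array used as a "matrix board":
# each stimulated electrode evokes one phosphene (a dot of light),
# so a letter is a set of electrodes driven simultaneously.
# The 5x5 patterns below are invented for illustration.

LETTERS = {
    "A": [
        " ### ",
        "#   #",
        "#####",
        "#   #",
        "#   #",
    ],
    "B": [
        "#### ",
        "#   #",
        "#### ",
        "#   #",
        "#### ",
    ],
}

def electrodes_for(letter):
    """Return the set of (row, col) electrodes to stimulate for a letter."""
    grid = LETTERS[letter]
    return {(r, c)
            for r, row in enumerate(grid)
            for c, ch in enumerate(row)
            if ch == "#"}

def render(active, rows=5, cols=5):
    """Simulate the pattern of phosphenes the subject would report seeing."""
    return "\n".join(
        "".join("#" if (r, c) in active else " " for c in range(cols))
        for r in range(rows)
    )

if __name__ == "__main__":
    for letter in "AB":
        print(render(electrodes_for(letter)))
        print()
```

With 1,000 electrodes instead of 25, the same scheme gives a far finer dot matrix, which is the point of Roelfsema’s light-bulb analogy.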

You can imagine how, eventually, a blind person might see with this technology: Connect an electrode array in V1 to a camera facing the outside world, and process the footage into a pointillist picture of reality. It might look like bitmapped Minecraft, but the brain is very good at adapting to new kinds of sensory data.
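The camera-to-cortex pipeline can be sketched too. This toy, assuming an 8×8 electrode array and an invented 16×16 grayscale frame, simply block-averages the image and fires any electrode whose patch is bright enough; a real prosthesis would need gaze tracking, per-electrode calibration, and far more besides:

```python
# Sketch of turning a camera frame into a coarse phosphene "bitmap":
# downsample a grayscale frame into one cell per electrode and fire
# the electrodes whose patch of the image is bright enough.
# Frame, grid size, and threshold are invented for illustration.

def frame_to_electrodes(frame, grid=8, threshold=128):
    """Average the frame over grid x grid patches and return the set
    of (row, col) electrodes to stimulate."""
    h, w = len(frame), len(frame[0])
    ph, pw = h // grid, w // grid  # pixels per electrode patch
    active = set()
    for r in range(grid):
        for c in range(grid):
            patch = [frame[y][x]
                     for y in range(r * ph, (r + 1) * ph)
                     for x in range(c * pw, (c + 1) * pw)]
            if sum(patch) / len(patch) >= threshold:
                active.add((r, c))
    return active

# A fake 16x16 frame: a bright vertical bar in a dark scene.
frame = [[255 if 6 <= x < 10 else 0 for x in range(16)] for _ in range(16)]
print(sorted(frame_to_electrodes(frame)))
```

Run on the fake frame, the bright bar survives the downsampling as two active columns of electrodes: a pointillist, Minecraft-ish version of the scene.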

Still, to get enough dots to make lines and shapes and other useful percepts, you need lots and lots of electrodes, aimed very precisely. That’s true for any electrode-based approach to sending meaningful signals into the brain, not just luminous phosphene shapes. Whatever the percepts are, they are inherently specific. Stimulate the tissue a little too much, Koch says, and “you get chaos.” You also have to get your timing right. Perception and sensation are like a piano sonata; the notes must sound in a particular order for the harmony to work. Get the timing wrong and shapes made of adjacent electrical pings don’t look like shapes; they look like one big blur, or like nothing at all.

Part of the reason it’s so hard to get the brain’s where and when right is that recording neural activity generates data that isn’t a huge help if you’re trying to induce neural activity. “There is a fundamental asymmetry between brain reading and brain writing,” says Jack Gallant, a neuroscientist at UC Berkeley. The signals you see when a brain is at work are not the thoughts themselves; they are the output the brain emits while thinking. Researchers get a small slice of data about the overall state of the brain as a concept crosses the finish line, but sending that data back in doesn’t re-create the whole race: the sensing, perceiving, recognizing, and all the other successive steps of cognition. True, Kanwisher’s team lit up a big face-recognition region of the brain and got someone to see a face, sort of. That’s a sensation, but not a perception, not the experience of a particular face. A Jennifer Aniston neuron gets excited when the person sees Jennifer Aniston; no one knows whether stimulating a Jennifer Aniston neuron can make a person see Jennifer Aniston.

