brain implants

“Silicon-Neural interface enables technology to be implanted directly into the brain!
Direct brain to brain communications via LEO satellites, PDA implanted directly in the head, artificial eyes and other breakthroughs now possible . . .”

Future Fantastic, 1999


In the late 20th century, you might have viewed this exhibit on a primitive computer screen (at the time seen as a major advance over cellulose) or in physically superfluous spaces called museums. For educational purposes, we have chosen to replicate that uncomfortable sensory experience here. While some post-humans seem to think they were born with their brain implants in place, most of us know better. The redness in the eyes, the lower-back pain, the light headache that you are currently experiencing are all sensorial illusions spawned by the interaction between the “biological” human brain and its “post-biological” implants.

Unlike the crude relics of 20th century science fiction, the microcomputer exhibited here (not unlike the one implanted in your head) looks organic. Down to the last atom, it seems something only nature could devise. But this chunk of synthetic post-human “brain” was designed in a laboratory. It may not look like much, but this amorphous mass is the springboard for the leap to the level beyond human.

Implants like this one have taken the digital revolution of the late 20th century to its logical next step. Instead of yesteryear’s crude array of digital peripherals (personal digital assistants, digital audio tapes, video players, desktop computers, telephones and the like), this tiny implant performs a singular, all-encompassing function. This new brain-enhancement device holds all the information these primitive peripherals once did, and more.

Data from the Encyclopaedia Britannica, which rare antique collectors hold on their shelves in molecular form, is now only a thought away. The memory enhancement capabilities of modern-day implants make our human ancestors look like their primate forebears in comparison. In the human era, individuals carried cumbersome peripherals such as tape recorders to record their conversations with one another; post-human technology allows that aural information to be recorded into artificial memory implants for later recollection. In an affront to the aesthetic tastes of the late 20th and early 21st centuries, individuals once carried portable “telephone” devices to communicate over long distances. Current implants allow individuals to communicate directly with low-orbiting satellites (via what was once called “wireless” telecommunications technology) and thus, telepathically, with post-humans anywhere on Earth.

In addition to such information, current implants are able to impart sensory experiences formerly reserved for “real life” encounters. For instance, while you may not have the physical ability to visit the Pacific Ocean, implants interacting with the human nervous system allow you to hear and see the waves, smell the salt air, feel the sand under your feet, all from your static, landlocked location.

While our simulation of the irritating 20th century computer interface may inhibit further exploration, we encourage you to visit the rest of the exhibit. It is, after all, an important link to post-human nature.

Our thanks go out to the folks at Brain-tech, pioneers in bringing affordable high-tech medical technology and cerebral peripherals to consumers at large. What started out as high-priced, early-Millennium medical prostheses for the neurologically impaired, and later as “brain upgrades” for the rich and powerful, is now available to everyone. Low-cost mass production and the destruction of the pharmaceutical-industrial complex have made affordable brain implants a thing of the present.

If our evolutionary forebears, the humans, were defined by their own self-awareness, by their ability to reason, we post-humans are defined by a more evolved capacity for reason, unhindered by the finite restrictions of the primitive human brain. Evolution, Darwin taught us in the 19th century, was a natural occurrence. But in the 20th century, man began to wonder if technology couldn’t give natural evolution a much-needed push. In his need for survival in the 21st century, in the face of increased competition for diminishing resources, man was forced to fully exercise his capacity as a self-replicating machine. With natural evolution moving slower than the growth of technology, humans were forced to bring about their own evolution, their move to the level beyond human.

More than a conscious effort to arrive at artificial brain devices, today’s implant technology is the result of years of collaborative research. A marriage of minds, it represents a mind-meld between fields as diverse as computer science, neuroscience, and nanotechnology. Today’s neuroscientists cannot help but laud the legacy of their 20th and 21st century counterparts, who saw early on the applications of biomedical interactive technology. Researchers at Stanford University’s Center for Integrated Systems, for example, were among those who saw the potential benefits of implanting computer sensor systems into human bodies; using bipolar complementary metal oxide semiconductor (BiCMOS) technology, scientists implanted silicon wafers with embedded detectors which emitted signals that could later be treated as data. At the same time, scientists were working on a visual prosthesis for humans with injuries to the retina or other parts of the eye; a set of electrodes, mounted on a support around the optic nerve, stimulated the nerve and exchanged data with a tiny CCD camera, which converted the low-bandwidth data (100 pixels per image) to its biotechnical equivalent, finally stimulating the electrodes with electrical waveforms. The information conversion process was threefold: the camera image was first enhanced by means of histogram equalization; a segmentation by region growing was then performed; finally, resolution was reduced through a mean/dominant luminance algorithm. The final result was that the visually impaired could perceive light and images in real time.
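For the curious visitor, the threefold conversion process described above maps readily onto image-processing primitives. The sketch below is purely illustrative; the function names, the region-growing threshold, and the 10×10 electrode grid are our own assumptions, not a reconstruction of the historical Stanford code.

```python
# Illustrative sketch of the threefold conversion: (1) histogram
# equalization, (2) region-growing segmentation, (3) reduction to a
# 100-pixel electrode image by mean luminance. Parameters are assumptions.
import numpy as np

def histogram_equalize(img):
    """Spread grayscale values (0-255) across the full dynamic range."""
    hist, _ = np.histogram(img.flatten(), bins=256, range=(0, 256))
    cdf = hist.cumsum()
    cdf_norm = (cdf - cdf.min()) * 255 / (cdf.max() - cdf.min())
    return cdf_norm[img].astype(np.uint8)

def segment_by_region_growing(img, threshold=16):
    """Label 4-connected pixels whose values differ by < threshold."""
    labels = np.full(img.shape, -1, dtype=int)
    next_label = 0
    for seed in np.ndindex(img.shape):
        if labels[seed] != -1:
            continue
        stack, labels[seed] = [seed], next_label
        while stack:                      # flood-fill from the seed pixel
            y, x = stack.pop()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                        and labels[ny, nx] == -1
                        and abs(int(img[ny, nx]) - int(img[y, x])) < threshold):
                    labels[ny, nx] = next_label
                    stack.append((ny, nx))
        next_label += 1
    return labels

def reduce_to_grid(img, out_shape=(10, 10)):
    """Mean-luminance downsampling to a 10x10 (100-pixel) electrode grid."""
    h, w = img.shape
    gh, gw = out_shape
    out = np.empty(out_shape)
    for i in range(gh):
        for j in range(gw):
            out[i, j] = img[i * h // gh:(i + 1) * h // gh,
                            j * w // gw:(j + 1) * w // gw].mean()
    return out

# A stand-in for one CCD camera frame (random pixels for demonstration).
camera = np.random.default_rng(0).integers(0, 256, (60, 80)).astype(np.uint8)
enhanced = histogram_equalize(camera)
regions = segment_by_region_growing(enhanced)
electrode_image = reduce_to_grid(enhanced)   # 100 values, one per electrode
```

The final 10×10 array is what, in the prosthesis described above, would have driven the electrode waveforms around the optic nerve.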

In the now-defunct, turn-of-the-millennium United States of America, the National Institutes of Health and the National Institute of Neurological Disorders and Stroke began working to improve life for neurologically impaired patients through selective stimulation of the nervous system. The neural prostheses eventually developed were equipped with microelectrodes made to stimulate the visual cortex and the spinal cord. Such technology allowed, for example, quadriplegics to regain motor function and sexual activity. The key to success was the design of thin-film microelectrodes minute enough to stimulate small clusters of neurons.

Similar microelectrode stimulation prostheses were developed to aid the hearing impaired, as well as for restoration of other lost “senses.” Many neuroscientists agreed that consciousness was simply a question of neurons, yet the brain as a whole was still an enigma.

Altering body parts was one thing, humans thought at the time. But the brain? The human mind? Consciousness itself? Somehow, our ancestors thought, that was different. Nonetheless, in the late 1990s, some doctors were already using brain implants to stop epileptic seizures.

Researchers like Marvin Minsky and Hans Moravec, not to mention forward-thinking organizations like the Extropy Institute, however, aimed much higher. They suggested that the secret to success lay in demystifying the brain as just another organ, reducing consciousness to the synaptic interconnection of neurons, and then complementing it, much as a pacemaker did the heart or other Second Millennium artificial bio-aids did other organs.

Almost a century later, scientists had successfully developed a microfabricated silicon nervechip, surgically implanted into the human brain. This microcomputer, which was smaller than a human fingernail, was equipped with millions of microscopic electrodes engaging in electrical interaction with the great bundle of nerves known as the corpus callosum, the human brain’s largest data bus. These individual microelectrodes were a great advance over the electrodes of years prior, in that they were small enough to pinpoint and stimulate reduced numbers of cells, thus providing intracortical microstimulation and detecting and analyzing nerve activity. As with the visual prostheses of yesteryear, the chip served as a “translator” of the highly complex pattern of signals emitted between the corpus callosum and electrodes, providing the algorithm by which the computer signals could be converted to biotechnical signals, and vice versa. In such a manner, information stored as computer data could be translated into signals the human brain could process as normal “thought.” Instead of removing the implant with stored data, new data (both informational and sensorial) was digitally and wirelessly “injected” into the implant, later to be processed into biotechnical signals. These signals would go to the optic nerve or the spinal cord directly from the chip, thus bypassing the “real life” sensory experience. In other words, you could perfectly “see” or “hear” images or sounds that were not actually external, without using your eyes or your ears; perception was all in the brain.

Though they were seen as a revolutionary advance in their time, these thought-activated silicon implants were far from perfect. Oftentimes, the microelectrodes and neurons did not interact correctly. Implant patients usually had to undergo extensive psychiatric evaluation and therapy to ensure that their artificially enhanced brain was functioning properly, that the pre-op and post-op brain matter were interacting as intended. In many cases, surgeons had to go back in and remove the implant.

One of the main obstacles to progress in the advancement of such biological computers was the problem of the power/size ratio. While silicon computers, with their ever-increasing transistor densities, got smaller and smaller, they eventually ran up against the uncertainty principle. At such a small scale, the familiar laws of classical physics gave way to quantum mechanical laws.

Fortunately, in the latter half of the 21st century, nanotechnology reached the point where it could effectively interact with individual neurons and replicate them with atomic exactitude. The current post-human brain is a far cry from the silicon implants of the late 21st century.

In the mid-20th century, students of the logical theory of computers saw great possibilities in the emerging field of computer science. The problem, however, was size. Made much smaller, with millions of times as many elements, computers were eventually able to make judgments. In a crude allusion to the human brain, artificially intelligent devices learned to solve problems, selecting for their calculations the most efficient analytic methods, based in part on their own past experience. Such advances owed a debt to Richard P. Feynman, the father of nanotechnology, who in his classic 1959 talk to the American Physical Society, “There’s Plenty of Room at the Bottom,” foresaw the day when science would be able to construct and alter matter at the atomic level.

Fifty years later, Steven Tsai, a Stanford professor who won the Nobel Prize for physics, was already creating “molecular tweezers,” capable of moving single molecules or atoms. Once scientists were able to enter construction sites measured in nanometres, to alter conditions they could once only observe through an electron microscope, the possibilities were endless. The late 21st century saw the advent of molecular assemblers, nanocomputers and cell repair. Dealing with fine-grain structures like the brain, at the atomic level, finally became akin to dealing with any other organ or system.

On the eve of the 22nd century, once computer science was reduced to the atomic level, microscopic computers were designed to monitor the activity of individual neurons, and then to replicate or interact with the neural network at its own scale. This new technology seemingly eliminated the need for silicon chip implants. Following the 20th century ideas of robotics pioneer Hans Moravec, instead of one skilled brain surgeon cutting open the human skull to insert a silicon device, today billions of ultra-precise, non-destructive molecular computers are injected into the brain, thus interacting with individual neurons. Once a nanocomputer is able to predict the behavior patterns of an individual neuron, it kills that neuron and takes its place. By gradually replacing these impulse-producing cells, an operative replica of the original organic brain begins to take shape. In essence, data from the original brain is transferred to a high-performance, upgraded, denser one, created in situ from existing organic matter, thus eliminating the need for artificial implants.

Nanocomputers, in addition to artificially growing synapses to increase “wisdom” or “intelligence,” can also affect mood, much like 20th century psychopharmaceuticals did. In the case of data storage (such as the Encyclopaedia Britannica or the telephone book), the secret to improvement comes in the form of memory enhancement. Now that modern-day neuroscience understands the brain’s hundreds of specialized regions, biomanufacturers can design attachments for each. Using the data introduced through the body’s built-in “receptors” (eyes, ears, nose, etc.), cerebral implants allow the organic brain to function with the efficiency of a supercomputer. For example, if you want to listen to a particular musical work without carrying around some clunky peripheral, you can listen to it at ultra-high speed; your brain can then reconstruct it and play it back in “real time.” Data such as memories or hard-to-attain physical experiences, which are not introduced through these organic receptors, are simply injected to form part of the brain, as another component of memory.

As with any new technology, growth in the field of neural prosthetics and nanotechnology outstripped debate in the area of bioethics. While doctors saw the benefits of implants to neurologically challenged patients, they were not prepared for the use of technology for nonmedical ends. While the late 20th and early 21st century saw a plethora of anti-cloning legislation, the world’s wealthy were going to bioethics-free enclaves where they could replace deteriorating body parts with those from cultivated clones. But the Clone Wars of the late 22nd century ended the use of real human tissue. With the extra money diverted from cloning activity going to research, by the early 23rd century, nanotechnology was able to replace worn body parts with organic, atomically correct replicas, with a zero chance of rejection.

This technology was eventually applied to sections of the human brain, thus leading to further ethical dilemmas. For decades, post-human brains were reserved for the wealthy and the powerful, inevitably used at the expense of the less fortunate humans. Once more and more people got access to this new brain-enhancement technology (thus becoming post-humans), post-human elites began investing more and more in brain upgrades; the “arms race” of the 20th century saw its 23rd century parallel in the “brains race.”

Even once post-humans became the dominant species and brain implant technology was applied at birth throughout most of the world, new problems arose. Though the nation-state was phased out in the early part of the Third Millennium, certain economic and social power interests saw the continued need to control “the masses.” Until the development of foolproof jamming devices, certain cerebrally enhanced individuals enjoyed the ability to read (and control) the thoughts of others as they were transferred through the ether. Advertising agencies and corporations found new and nefarious ways to influence shopping habits.

The decades-long war between nanotechnology advocates and the residual defenders of silicon came to an end in the mid-23rd century. Major religious institutions accused scientists of “playing God,” and Vatican V denounced silicon implants as sinful. Shortly before the Vatican VI decree of 2243, which allowed Roman Catholics both brain implants and eternal life, nanotechnology reached the point of routine brain upgrades. Convents soon began to offer their nuns digital filters against “impure thoughts,” and wealthier Catholics were able to apply a de facto repellent against the Seven Deadly Sins (periodically deactivating the anti-lust feature for procreation purposes), thus assuring themselves a place in the Heavenly Kingdom.

By Brett Allan King

Copyright by Brett Allan King and/or publication in which story first appeared
Do not reprint without permission
