Sunday, May 1, 2011

Steps toward a Bionic Eye

Artificial retinas that allow the blind to see

By Jamie Horder
Scientific American
February 15, 2011


The human eye is a biological marvel. Charles Darwin considered it one of the biggest challenges to his theory of evolution, famously writing: “To suppose that the eye with all its inimitable contrivances for adjusting the focus to different distances, for admitting different amounts of light, and for the correction of spherical and chromatic aberration, could have been formed by natural selection, seems, I freely confess, absurd in the highest degree.” Of course, he went on to explain how natural selection could account for the eye, but we can see why he wrote these words under the heading of “Organs of Extreme Perfection and Complication.”

The complexity and perfection of the eye have meant that, to date, it’s been all but impossible to reproduce its function artificially. Artificial hearts, kidneys (albeit outside the body), and ears (cochlear implants) are all in widespread medical use -- but not eyes.

That might be about to change. In a remarkable achievement, a team of ophthalmologists and engineers has managed to partially restore vision to the blind, using an electronic device which acts as a replacement for the retina. The results are reported in a paper by Professor Eberhart Zrenner, Director of the Institute for Ophthalmic Research at the University Eye Hospital in Tuebingen, Germany.

The implant consists of a tiny panel, 3 by 3.1 mm in size, containing a 38 by 40 array of 1,500 light-sensitive microphotodiodes. These sensors detect light and control the output of a pulsed electrical current: the brighter the light, the stronger the resulting current. Each sensor has its own microelectrode, and these are placed in contact with nerve cells in the retina called bipolar cells, the first step on the pathway from the eye to the brain. The sensors therefore mimic the way the eye’s own photoreceptor cells normally function, turning light into a pattern of electrical impulses.
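The basic principle -- each photodiode in the grid mapping light intensity to the amplitude of a stimulation pulse at its own electrode -- can be sketched in a few lines of code. This is purely illustrative: the function name, gain, and current limit below are made-up parameters, not figures from the published device.

```python
def pulse_amplitude(light_intensity, gain=1.0, max_current=1.0):
    """Map a light reading (0.0 to 1.0) to a stimulation current amplitude.

    Brighter light yields a stronger pulse, clipped at a safe maximum.
    `gain` and `max_current` are hypothetical values for illustration.
    """
    return min(max_current, gain * light_intensity)

# A 38 x 40 grid of sensors, each driving its own microelectrode:
frame = [[0.0 for _ in range(40)] for _ in range(38)]
frame[10][20] = 0.8  # a bright spot somewhere in the visual field

# Each sensor's reading becomes a pulse amplitude at the matching electrode.
currents = [[pulse_amplitude(px) for px in row] for row in frame]
```

The point of the sketch is the one-to-one mapping: unlike a camera feeding a separate stimulator, every sensor directly drives its own electrode, which is why the device can sit in the retina where the photoreceptors used to be.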

The implant is not a complete artificial eye. It relies on an intact eyeball, an intact retina with functioning bipolar cells, and an optic nerve to convey the information to the brain. This means that the technology is only useful in forms of blindness caused by selective damage to photoreceptor cells.

However, such blindness is unfortunately common. Retinitis pigmentosa is a disease that causes progressive loss of vision, as the photoreceptor cells degenerate, and eventually die. There are many different forms of the disorder, each caused by mutations in a different gene. In some people, the loss of vision is gradual, and they remain able to see for most of their lives. In others, it rapidly leads to blindness. It’s estimated that about 400,000 Americans suffer some form of the disease.

Zrenner and his team implanted their device in three patients, all of whom had been born with normal vision, but had become almost totally blind due to retinal degeneration. Two of them suffered from retinitis pigmentosa, while the third had a similar disease.

The surgical procedure was, naturally, delicate. It involved inserting a metal tube behind and into one of the patient’s eyes, through which the implant was put into place. The chip is connected to a cable that provides it with power from an external battery. The cable also allows the patient to control the sensitivity of the electrodes – essentially, manually adjusting the “brightness” of the image to compensate for changes in the overall level of light. This is something the eye normally does so effortlessly that we’re rarely aware of it.
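What that manual adjustment computes is easy to picture: given the overall level of light hitting the sensors, pick a gain that brings the average reading into a useful range. The sketch below is an assumption about the idea, not the published control circuit -- in the real device the patient simply turns the sensitivity up or down by hand.

```python
def compensation_gain(readings, target_mean=0.5):
    """Return a gain factor that would bring the average sensor
    reading up (or down) to `target_mean`.

    Hypothetical illustration of brightness compensation; the actual
    implant leaves this adjustment to the patient.
    """
    mean = sum(readings) / len(readings)
    return target_mean / mean if mean > 0 else 1.0

dim_scene = [0.05, 0.10, 0.10, 0.05]   # low ambient light: gain goes up
bright_scene = [0.8, 0.9, 0.7, 0.8]    # strong ambient light: gain goes down
```

A healthy eye performs this adaptation continuously and automatically; with the implant, it becomes a dial the patient operates.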

So what happened? All three patients regained vision to some extent. Patient 2, a 44-year-old man with retinitis pigmentosa, experienced the most dramatic benefits. He began to lose his sight at the age of 16. The first problem he noticed was difficulty seeing at night, a common early symptom. By the time of the study, he was virtually blind, although he could still tell the direction from which a light was shining.

Thanks to the implant, he gained the ability to recognize everyday objects including spoons, bananas, and apples; he could read a clock; and he could read letters, albeit slowly, and they had to be printed extremely large (about 5-8 cm high).

Videos of his abilities and his reactions to his newfound sight are available online.

This subretinal implant is not the only “bionic eye” idea under development, however. Other researchers have been working on using an external camera which transmits information to a relay chip placed on the retina, the “epiretinal” approach.

However, Zrenner’s team argues that their subretinal implant technique has some important advantages. Epiretinal devices have to pre-process the image before sending it to the retina, and patients need time to learn how to process the information that their brain receives, because the camera isn’t able to provide an exact simulation of normal retina outputs.

Zrenner et al.’s subretinal method, however, took little “getting used to,” because the implant is such a close analogue of the healthy retina. They also note that epiretinal approaches have so far provided at most 60 pixels, as opposed to their 1,500.

Still, the technology has limitations. The image has no color, and it’s much less detailed than normal vision. The sensor has a resolution of 38 by 40 pixels, compared to the 960 by 640 resolution of an iPhone screen.
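The gap in resolution is worth making concrete with the article’s own numbers. (Note that 38 × 40 comes to 1,520, slightly above the quoted figure of 1,500 photodiodes, presumably because not every grid position carries a working sensor.)

```python
implant_pixels = 38 * 40   # the sensor grid described in the paper
iphone_pixels = 960 * 640  # iPhone 4 display, the article's comparison

print(implant_pixels)                    # 1520
print(iphone_pixels // implant_pixels)   # about 400x more pixels on the phone
```

So a 2011 phone screen has roughly four hundred times as many pixels as the implant -- a useful sense of how far the technology has to go, and of how remarkable it is that 1,500 pixels are enough to read large letters.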

Because the chip is so small, it covers only a small fraction of the normal retinal field. However, this is actually less of a problem than it might first appear, because all of our detailed vision takes place in a tiny part of the retina called the fovea. By placing the implant where the fovea used to be, the quality of the perceived images is maximized.

The chip also requires an external power supply, so patients need to carry the battery pack and control unit around with them. Finally, they have a fairly hefty wire coming out of the side of their head.

So, at the moment, science is very far from being able to fully restore vision, but it’s still an exciting step forward. Technical improvements are sure to bring higher-quality images in the future.

Other researchers are working on using gene therapy to cure the underlying molecular cause of the disease, preventing the photoreceptors from dying in the first place. This approach has shown promise in animal models, and the results of the first human trials of gene therapy in another genetic eye disease, Leber’s amaurosis, have recently appeared.

So whether this device will become widely used in the treatment of people with diseases like retinitis pigmentosa is unclear. But it joins other emerging technologies, from deep brain stimulation to brain-computer interfaces, which are blurring the boundaries between the nervous system and machines.

Jamie Horder is a postdoctoral neuroscientist working at the Institute of Psychiatry in London. His current research focuses on autism.