Archive for the ‘Cornell University’ Category

Mass killers like Elliot Rodger teach society all the wrong lessons about the connection between violence, mental illness and guns — and what we should do about it. One of the biggest misconceptions, pushed by our commentators and politicians, is that we can prevent these tragedies if we improve our mental health care system. It is a comforting notion, but nothing could be further from the truth.

And although the intense media attention might suggest otherwise, mass killings — when four or more people are killed at once — are very rare events. In 2012, they accounted for only about 0.15 percent of all homicides in the United States. Because of their horrific nature, however, they receive lurid media attention that distorts the public’s perception about the real risk posed by the mentally ill.

Anyone who watched Elliot Rodger’s chilling YouTube video, detailing his plan for murderous vengeance before he killed six people last week near Santa Barbara, Calif., would understandably conflate madness with violence. While it is true that most mass killers have a psychiatric illness, the vast majority of violent people are not mentally ill and most mentally ill people are not violent. Indeed, only about 4 percent of overall violence in the United States can be attributed to those with mental illness. Most homicides in the United States are committed by people without mental illness who use guns.

Mass killers are almost always young men who tend to be angry loners. They are often psychotic, seething with resentment and planning revenge for perceived slights and injuries. As a group, they tend to avoid contact with the mental health care system, so it’s tough to identify and help them. Even when they have received psychiatric evaluation and treatment, as in the case of Mr. Rodger and Adam Lanza, who killed 20 children and seven adults, including his mother, in Connecticut in 2012, we have to acknowledge that our current ability to predict who is likely to be violent is no better than chance.

Large epidemiologic studies show that psychiatric illness is a risk factor for violent behavior, but the risk is small and linked only to a few serious mental disorders. People with schizophrenia, major depression or bipolar disorder are two to three times as likely as those without these disorders to be violent. The actual lifetime prevalence of violence among people with serious mental illness is about 16 percent, compared with 7 percent among people who are not mentally ill.

What most people don’t know is that drug and alcohol abuse are far more powerful risk factors for violence than other psychiatric illnesses. Individuals who abuse drugs or alcohol but have no other psychiatric disorder are almost seven times more likely than those without substance abuse to act violently.

As a psychiatrist, I welcome calls from our politicians to improve our mental health care system. But even the best mental health care is unlikely to prevent these tragedies.

If we can’t reliably identify people who are at risk of committing violent acts, then how can we possibly prevent guns from falling into the hands of those who are likely to kill? Mr. Rodger had no problem legally buying guns because he had neither been institutionalized nor involuntarily hospitalized, either of which would generally have disqualified him from purchasing firearms.

Would lowering the threshold for involuntary psychiatric treatment, as some argue, be effective in preventing mass killings or homicide in general?

It’s doubtful.

The current guideline for psychiatric treatment over the objection of the patient is, in most states, imminent risk of harm to self or others. Short of issuing a direct threat of violence or appearing grossly disturbed, you will not receive involuntary treatment. When Mr. Rodger was interviewed by the police several weeks ago, after his mother expressed alarm about videos he had posted, he appeared calm and in control and was thus not apprehended. In other words, a normal-appearing killer who is quietly planning a massacre can easily evade detection.

In the wake of these horrific killings, it would be understandable if the public wanted to make it easier to force treatment on patients before a threat is issued. But that might simply discourage other mentally ill people from being candid and drive some of the sickest patients away from the mental health care system.

We have always had — and always will have — Adam Lanzas and Elliot Rodgers. The sobering fact is that there is little we can do to predict or change human behavior, particularly violence; it is a lot easier to control its expression, and to limit deadly means of self-expression. In every state, we should prevent individuals with a known history of serious psychiatric illness or substance abuse, both of which predict increased risk of violence, from owning or purchasing guns.

But until we make changes like that, the tragedy of mass killings will remain a part of American life.

Richard A. Friedman is a professor of clinical psychiatry and the director of the psychopharmacology clinic at the Weill Cornell Medical College.



Researchers at Weill Cornell Medical College have successfully tested their novel anti-cocaine vaccine in primates, bringing them closer to launching human clinical trials. Their study, published online by the journal Neuropsychopharmacology, used a radiological technique to demonstrate that the anti-cocaine vaccine prevented the drug from reaching the brain and producing a dopamine-induced high.

“The vaccine eats up the cocaine in the blood like a little Pac-man before it can reach the brain,” says the study’s lead investigator, Dr. Ronald G. Crystal, chairman of the Department of Genetic Medicine at Weill Cornell Medical College. “We believe this strategy is a win-win for those individuals, among the estimated 1.4 million cocaine users in the United States, who are committed to breaking their addiction to the drug,” he says. “Even if a person who receives the anti-cocaine vaccine falls off the wagon, cocaine will have no effect.”

Dr. Crystal says he expects to begin human testing of the anti-cocaine vaccine within a year.

Cocaine, a tiny molecule drug, works to produce feelings of pleasure because it blocks the recycling of dopamine — the so-called “pleasure” neurotransmitter — in two areas of the brain, the putamen in the forebrain and the caudate nucleus in the brain’s center. When dopamine accumulates at the nerve endings, “you get this massive flooding of dopamine and that is the feel good part of the cocaine high,” says Dr. Crystal.

The novel vaccine Dr. Crystal and his colleagues developed combines bits of the common cold virus with a particle that mimics the structure of cocaine. When the vaccine is injected into an animal, its body “sees” the cold virus and mounts an immune response against both the virus and the cocaine impersonator that is hooked to it. “The immune system learns to see cocaine as an intruder,” says Dr. Crystal. “Once immune cells are educated to regard cocaine as the enemy, it produces antibodies, from that moment on, against cocaine the moment the drug enters the body.”

In their first study in animals, the researchers injected billions of their viral concoction into laboratory mice, and found a strong immune response was generated against the vaccine. Also, when the scientists extracted the antibodies produced by the mice and put them in test tubes, the antibodies gobbled up cocaine. They also saw that mice that received both the vaccine and cocaine were much less hyperactive than untreated mice given cocaine.

In this study, the researchers sought to precisely define how effective the anti-cocaine vaccine is in non-human primates, which are closer in biology to humans than mice are. They developed a tool to measure how much cocaine attached to the dopamine transporter, which picks up dopamine in the synapse between neurons and carries it back to be recycled. If cocaine is in the brain, it binds to the transporter, effectively blocking the transporter from ferrying dopamine out of the synapse and keeping the neurotransmitter active to produce a drug high.

In the study, the researchers attached a short-lived isotope tracer to the dopamine transporter. The activity of the tracer could be seen using positron emission tomography (PET). The tool measured how much of the tracer attached to the dopamine transporter in the presence or absence of cocaine.

The PET studies showed no difference in the binding of the tracer to the dopamine transporter in vaccinated compared to unvaccinated animals if these two groups were not given cocaine. But when cocaine was given to the primates, there was a significant drop in activity of the tracer in non-vaccinated animals. That meant that without the vaccine, cocaine displaced the tracer in binding to the dopamine transporter.

Previous research had shown in humans that at least 47 percent of the dopamine transporter had to be occupied by cocaine in order to produce a drug high. The researchers found that in vaccinated primates, cocaine occupancy of the dopamine transporter was reduced to less than 20 percent.
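The occupancy figures above come from comparing tracer binding with and without cocaine: the more cocaine that reaches the transporter, the more tracer it displaces. A minimal sketch of that arithmetic, using illustrative binding values rather than the study's actual PET signals:

```python
def transporter_occupancy(binding_baseline, binding_with_cocaine):
    """Fraction of dopamine transporters occupied by cocaine,
    inferred from the drop in radiotracer binding seen on PET."""
    return (binding_baseline - binding_with_cocaine) / binding_baseline

# At least 47% occupancy is needed to produce a high (prior human data).
HIGH_THRESHOLD = 0.47

# Binding signals below are invented for illustration only.
unvaccinated = transporter_occupancy(binding_baseline=100.0, binding_with_cocaine=38.0)
vaccinated = transporter_occupancy(binding_baseline=100.0, binding_with_cocaine=85.0)

print(f"unvaccinated occupancy: {unvaccinated:.0%}")  # 62%, above the threshold
print(f"vaccinated occupancy:   {vaccinated:.0%}")    # 15%, below the threshold
```

With the vaccine soaking up cocaine in the blood, occupancy stays under the 47 percent threshold, which is the study's central claim.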

“This is a direct demonstration in a large animal, using nuclear medicine technology, that we can reduce the amount of cocaine that reaches the brain sufficiently so that it is below the threshold by which you get the high,” says Dr. Crystal.

When the vaccine is studied in humans, the non-toxic dopamine transporter tracer can be used to help study its effectiveness as well, he adds.

The researchers do not know how often the vaccine needs to be administered in humans to maintain its anti-cocaine effect. A single dose of the vaccine lasted 13 weeks in mice and seven weeks in non-human primates.

“An anti-cocaine vaccination will require booster shots in humans, but we don’t know yet how often these booster shots will be needed,” says Dr. Crystal. “I believe that for those people who desperately want to break their addiction, a series of vaccinations will help.”

Co-authors of the study include Dr. Anat Maoz, Dr. Martin J. Hicks, Dr. Shankar Vallabhajosula, Michael Synan, Dr. Paresh J. Kothari, Dr. Jonathan P. Dyke, Dr. Douglas J. Ballon, Dr. Stephen M. Kaminsky, Dr. Bishnu P. De and Dr. Jonathan B. Rosenberg from Weill Cornell Medical College; Dr. Diana Martinez from Columbia University; and Dr. George F. Koob and Dr. Kim D. Janda from The Scripps Research Institute.

The study was funded by grants from the National Institute on Drug Abuse (NIDA).

Thanks to Kebmodee and Dr. Rajadhyaksha for bringing this to the attention of the It’s Interesting community.

Structure of the CACNA1C gene product, a calcium channel named Cav1.2, one of four genes now found to be shared genetically among schizophrenia, bipolar disorder, autism, major depression and attention deficit hyperactivity disorder. Groundbreaking work on the role of this protein in anxiety and other forms of behavior related to mental illness has previously been established in the Rajadhyaksha laboratory at Weill Cornell Medical College.

From the New York Times:
The psychiatric illnesses seem very different — schizophrenia, bipolar disorder, autism, major depression and attention deficit hyperactivity disorder. Yet they share several genetic glitches that can nudge the brain along a path to mental illness, researchers report. Which disease, if any, develops is thought to depend on other genetic or environmental factors.

Their study, published online Wednesday in The Lancet, was based on an examination of genetic data from more than 60,000 people worldwide. Its authors say it is the largest genetic study yet of psychiatric disorders. The findings strengthen an emerging view of mental illness that aims to make diagnoses based on the genetic aberrations underlying diseases instead of on the disease symptoms.

Two of the aberrations discovered in the new study were in genes used in a major signaling system in the brain, giving clues to processes that might go awry and suggestions of how to treat the diseases.

“What we identified here is probably just the tip of an iceberg,” said Dr. Jordan Smoller, lead author of the paper and a professor of psychiatry at Harvard Medical School and Massachusetts General Hospital. “As these studies grow we expect to find additional genes that might overlap.”

The new study does not mean that the genetics of psychiatric disorders are simple. Researchers say there seem to be hundreds of genes involved and the gene variations discovered in the new study confer only a small risk of psychiatric disease.

Steven McCarroll, director of genetics for the Stanley Center for Psychiatric Research at the Broad Institute of Harvard and M.I.T., said it was significant that the researchers had found common genetic factors that pointed to a specific signaling system.

“It is very important that these were not just random hits on the dartboard of the genome,” said Dr. McCarroll, who was not involved in the new study.

The work began in 2007 when a large group of researchers began investigating genetic data generated by studies in 19 countries and including 33,332 people with psychiatric illnesses and 27,888 people free of the illnesses for comparison. The researchers studied scans of people’s DNA, looking for variations in any of several million places along the long stretch of genetic material containing three billion DNA letters. The question: Did people with psychiatric illnesses tend to have a distinctive DNA pattern in any of those locations?
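The question at the end of that paragraph is, at its core, a case-control association test repeated at millions of genomic positions. The consortium's actual pipeline is far more elaborate, but the basic statistic behind "does this DNA variant track with illness?" can be sketched as a Pearson chi-square on a 2x2 table (the variant counts below are invented; only the cohort sizes come from the article):

```python
def case_control_chi2(case_carriers, case_total, control_carriers, control_total):
    """Pearson chi-square for a 2x2 carrier-by-diagnosis table.
    A large value means carrier frequency differs between groups."""
    table = [
        [case_carriers, case_total - case_carriers],
        [control_carriers, control_total - control_carriers],
    ]
    row_sums = [sum(r) for r in table]
    col_sums = [sum(c) for c in zip(*table)]
    n = sum(row_sums)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_sums[i] * col_sums[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Cohort sizes from the study; carrier counts are hypothetical.
stat = case_control_chi2(case_carriers=12000, case_total=33332,
                         control_carriers=9200, control_total=27888)
print(f"chi-square = {stat:.1f}")
```

In a real genome-wide scan this statistic is computed at each tested position, and the significance threshold is corrected for the millions of tests, which is why only a handful of regions survive.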

Researchers had already seen some clues of overlapping genetic effects in identical twins. One twin might have schizophrenia while the other had bipolar disorder. About six years ago, around the time the new study began, researchers had examined the genes of a few rare families in which psychiatric disorders seemed especially prevalent. They found a few unusual disruptions of chromosomes that were linked to psychiatric illnesses. But what surprised them was that while one person with the aberration might get one disorder, a relative with the same mutation got a different one.

Jonathan Sebat, chief of the Beyster Center for Molecular Genomics of Neuropsychiatric Diseases at the University of California, San Diego, and one of the discoverers of this effect, said that work on these rare genetic aberrations had opened his eyes. “Two different diagnoses can have the same genetic risk factor,” he said.

In fact, the new paper reports, distinguishing psychiatric diseases by their symptoms has long been difficult. Autism, for example, was once called childhood schizophrenia. It was not until the 1970s that autism was distinguished as a separate disorder.

But Dr. Sebat, who did not work on the new study, said that until now it was not clear whether the rare families he and others had studied were an exception or whether they were pointing to a rule about multiple disorders arising from a single genetic glitch.

“No one had systematically looked at the common variations” in DNA, he said. “We didn’t know if this was particularly true for rare mutations or if it would be true for all genetic risk.” The new study, he said, “shows all genetic risk is of this nature.”

The new study found four DNA regions that conferred a small risk of psychiatric disorders. For two of them, it is not clear what genes are involved or what they do, Dr. Smoller said. The other two, though, involve genes that are part of calcium channels, which are used when neurons send signals in the brain.

“The calcium channel findings suggest that perhaps — and this is a big if — treatments to affect calcium channel functioning might have effects across a range of disorders,” Dr. Smoller said.

There are drugs on the market that block calcium channels — they are used to treat high blood pressure — and researchers had already postulated that they might be useful for bipolar disorder even before the current findings.

One investigator, Dr. Roy Perlis of Massachusetts General Hospital, just completed a small study of a calcium channel blocker in 10 people with bipolar disorder and is about to expand it to a large randomized clinical trial. He also wants to study the drug in people with schizophrenia, in light of the new findings. He cautions, though, that people should not rush out to take a calcium channel blocker on their own.

“We need to be sure it is safe and we need to be sure it works,” Dr. Perlis said.

Cross section of a mature maize leaf showing Kranz (German for wreath) anatomy around a large vein. The bundle sheath cells (lighter red) encircle the vascular core (light blue). Mesophyll cells (dark red) encircle the bundle sheath cells. The interaction and cooperation between the mesophyll and bundle sheath is essential for the C4 photosynthetic mechanism. (Credit: Thomas Slewinski)

With projections of 9.5 billion people by 2050, humankind faces the challenge of feeding modern diets to additional mouths while using the same amounts of water, fertilizer and arable land as today.

Cornell researchers have taken a leap toward meeting those needs by discovering a gene that could lead to new varieties of staple crops with 50 percent higher yields.

The gene, called Scarecrow, is the first discovered to control a special leaf structure, known as Kranz anatomy, which leads to more efficient photosynthesis. Plants photosynthesize using one of two methods: C3, a less efficient, ancient method found in most plants, including wheat and rice; and C4, a more efficient adaptation employed by grasses, maize, sorghum and sugarcane that is better suited to drought, intense sunlight, heat and low nitrogen.

“Researchers have been trying to find the underlying genetics of Kranz anatomy so we can engineer it into C3 crops,” said Thomas Slewinski, lead author of a paper that appeared online in November in the journal Plant and Cell Physiology. Slewinski is a postdoctoral researcher in the lab of senior author Robert Turgeon, professor of plant biology in the College of Arts and Sciences.

The finding “provides a clue as to how this whole anatomical key is regulated,” said Turgeon. “There’s still a lot to be learned, but now the barn door is open and you are going to see people working on this Scarecrow pathway.” The promise of transferring C4 mechanisms into C3 plants has been fervently pursued and funded on a global scale for decades, he added.

If C4 photosynthesis is successfully transferred to C3 plants through genetic engineering, farmers could grow wheat and rice in hotter, dryer environments with less fertilizer, while possibly increasing yields by half, the researchers said.

C3 photosynthesis originated at a time in Earth’s history when the atmosphere had a high proportion of carbon dioxide. C4 plants have independently evolved from C3 plants some 60 times at different times and places. The C4 adaptation involves Kranz anatomy in the leaves, which includes a layer of special bundle sheath cells surrounding the veins and an outer layer of cells called mesophyll. Bundle sheath cells and mesophyll cells cooperate in a two-step version of photosynthesis, using different kinds of chloroplasts.

By looking closely at plant evolution and anatomy, Slewinski recognized that the bundle sheath cells in leaves of C4 plants were similar to endodermal cells that surrounded vascular tissue in roots and stems.

Slewinski suspected that if C4 leaves shared endodermal genes with roots and stems, the genetics that controlled those cell types may also be shared. Slewinski looked for experimental maize lines with mutant Scarecrow genes, which he knew governed endodermal cells in roots. When the researchers grew those plants, they first identified problems in the roots, then checked for abnormalities in the bundle sheath. They found that the leaves of Scarecrow mutants had abnormal and proliferated bundle sheath cells and irregular veins.

In all plants, an enzyme called RuBisCo facilitates a reaction that captures carbon dioxide from the air, the first step in producing sucrose, the energy-rich product of photosynthesis that powers the plant. But in C3 plants RuBisCo also facilitates a competing reaction with oxygen, creating a byproduct that has to be degraded, at a cost of about 30 to 40 percent of overall efficiency. In C4 plants, carbon dioxide fixation takes place in two stages. The first step occurs in the mesophyll, and the product of this reaction is shuttled to the bundle sheath for the RuBisCo step. The RuBisCo step is very efficient because in the bundle sheath cells, the oxygen concentration is low and the carbon dioxide concentration is high. This eliminates the problem of the competing oxygen reaction, making the plant far more efficient.
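The photorespiration penalty described above can be put in rough numbers. The sketch below uses illustrative values, not measurements from the study, simply to show how a 30-to-40-percent loss in C3 plants translates into the "yields higher by half" figure quoted for C4:

```python
def effective_fixation(gross_fixation, photorespiration_loss):
    """Net carbon fixation after the competing RuBisCo-oxygen reaction.
    C4 anatomy drives the loss toward zero by concentrating CO2
    around RuBisCo in the bundle sheath."""
    return gross_fixation * (1.0 - photorespiration_loss)

# Illustrative: same gross fixation, C3 loses ~35% to photorespiration.
c3 = effective_fixation(gross_fixation=100.0, photorespiration_loss=0.35)
c4 = effective_fixation(gross_fixation=100.0, photorespiration_loss=0.0)

print(f"C3 net: {c3:.0f}, C4 net: {c4:.0f}")
print(f"relative gain from C4: {c4 / c3 - 1:.0%}")  # roughly 'half again' more
```

A mid-range 35 percent loss implies a C4 advantage of just over 50 percent, consistent with the yield increase the researchers project.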

The study was funded by the National Science Foundation and the U.S. Department of Agriculture.






A bit reminiscent of the Terminator T-1000, a new material created by Cornell researchers is so soft that it can flow like a liquid and then, strangely, return to its original shape.

Rather than liquid metal, it is a hydrogel, a mesh of organic molecules with many small empty spaces that can absorb water like a sponge. It qualifies as a “metamaterial” with properties not found in nature and may be the first organic metamaterial with mechanical meta-properties.

Hydrogels have already been considered for use in drug delivery — the spaces can be filled with drugs that release slowly as the gel biodegrades — and as frameworks for tissue rebuilding. The ability to form a gel into a desired shape further expands the possibilities. For example, a drug-infused gel could be formed to exactly fit the space inside a wound.

Dan Luo, professor of biological and environmental engineering, and colleagues describe their creation in the Dec. 2 issue of the journal Nature Nanotechnology.

The new hydrogel is made of synthetic DNA. In addition to being the stuff genes are made of, DNA can serve as a building block for self-assembling materials. Single strands of DNA will lock onto other single strands that have complementary coding, like tiny organic Legos. By synthesizing DNA with carefully arranged complementary sections, Luo’s research team previously created short strands that link into shapes such as crosses or Y’s, which in turn join at the ends to form meshlike structures, yielding the first successful all-DNA hydrogel. Trying a new approach, they mixed synthetic DNA with enzymes that cause DNA to self-replicate and to extend itself into long chains, to make a hydrogel without designed DNA linkages.

“During this process they entangle, and the entanglement produces a 3-D network,” Luo explained. But the result was not what they expected: The hydrogel they made flows like a liquid, but when placed in water returns to the shape of the container in which it was formed.

“This was not by design,” Luo said.

Examination under an electron microscope shows that the material is made up of a mass of tiny spherical “bird’s nests” of tangled DNA, about 1 micron (millionth of a meter) in diameter, further entangled to one another by longer DNA chains. It behaves something like a mass of rubber bands glued together: It has an inherent shape, but can be stretched and deformed.

Exactly how this works is “still being investigated,” the researchers said, but they theorize that the elastic forces holding the shape are so weak that a combination of surface tension and gravity overcomes them; the gel just sags into a loose blob. But when it is immersed in water, surface tension is nearly zero — there’s water inside and out — and buoyancy cancels gravity.

To demonstrate the effect, the researchers created hydrogels in molds shaped like the letters D, N and A. Poured out of the molds, the gels became amorphous liquids, but in water they morphed back into the letters. As a possible application, the team created a water-actuated switch. They made a short cylindrical gel infused with metal particles placed in an insulated tube between two electrical contacts. In liquid form the gel reaches both ends of the tube and forms a circuit. When water is added, the gel reverts to its shorter form that will not reach both ends. (The experiment is done with distilled water that does not conduct electricity.)

The DNA used in this work has a random sequence, and only occasional cross-linking was observed, Luo said. By designing the DNA to link in particular ways he hopes to be able to tune the properties of the new hydrogel.

The research has been partially supported by the U.S. Department of Agriculture and the Department of Defense.

Thanks to Dr. Rajadhyaksha for bringing this to the attention of the It’s Interesting community.

For the first time, researchers have used a specialized camera to measure pupillary changes in people watching erotic videos, the changes in pupil dilation revealing where the participant is located on the heterosexual-homosexual spectrum. The researchers at Cornell University who developed the technique say it provides an accurate method of gauging the precise sexual orientation of a subject. The work is detailed in the journal PLoS ONE.

Previously, researchers trying to assess sexual orientation simply asked people about their sexuality or used intrusive physiological measures, such as assessing their genital arousal.

“We wanted to find an alternative measure that would be an automatic indication of sexual orientation, but without being as invasive as previous measures. Pupillary responses are exactly that,” says lead researcher Gerulf Rieger. “With this new technology we are able to explore sexual orientation of people who would never participate in a study on genital arousal, such as people from traditional cultures. This will give us a much better understanding how sexuality is expressed across the planet.”

Experimenting with the technique, the researchers found heterosexual men showed strong pupillary responses to sexual videos of women and little response to videos of men. Heterosexual women, however, showed pupillary responses to both sexes. This result confirms previous research suggesting that women have a very different type of sexuality than men.

Interestingly, the new study sheds new light on the long-standing debate on male bisexuality. Previous notions were that most bisexual men do not base their sexual identity on their physiological sexual arousal but on romantic and identity issues. Contrary to this claim, bisexual men in the new study showed substantial pupil dilations to sexual videos of both men and women.

“We can now finally argue that a flexible sexual desire is not simply restricted to women – some men have it, too, and it is reflected in their pupils,” said co-researcher Ritch C. Savin-Williams. “In fact, not even a division into ‘straight,’ ‘bi,’ and ‘gay’ tells the full story. Men who identify as ‘mostly straight’ really exist both in their identity and their pupil response; they are more aroused to males than straight men, but much less so than both bisexual and gay men.”

Thanks to Dr. A.R. for bringing this to the attention of the It’s Interesting community.


Researchers report they have developed in mice what they believe might one day become a breakthrough for humans: a retinal prosthesis that could restore near-normal sight to those who have lost their vision.

That would be a welcome development for the roughly 25 million people worldwide who are blind because of retinal disease, most notably macular degeneration.

The notion of using prosthetics to combat blindness is not new, with prior efforts involving retinal electrode implantation and/or gene therapy restoring a limited ability to pick out spots and rough edges of light.

The current effort takes matters to a new level. The scientists fashioned a prosthetic system packed with computer chips that replicate the “neural impulse codes” the eye uses to transmit light signals to the brain.

“This is a unique approach that hasn’t really been explored before, and we’re really very excited about it,” said study author Sheila Nirenberg, a professor and computational neuroscientist in the department of physiology and biophysics at Weill Medical College of Cornell University in New York City. “I’ve actually been working on this for 10 years. And suddenly, after a lot of work, I knew immediately that I could make a prosthetic that would work, by making one that could take in images and process them into a code that the brain can understand.”

Nirenberg and her co-author Chethan Pandarinath (a former Cornell graduate student now conducting postdoctoral research at Stanford University School of Medicine) report their work in the Aug. 14 issue of Proceedings of the National Academy of Sciences. Their efforts were funded by the U.S. National Institutes of Health and Cornell University’s Institute for Computational Biomedicine.

The study authors explained that retinal diseases destroy the light-catching photoreceptor cells on the retina’s surface. Without those, the eye cannot convert light into neural signals that can be sent to the brain.

However, most of these patients retain the use of their retina’s “output cells” — called ganglion cells — whose job it is to actually send these impulses to the brain. The goal, therefore, would be to jumpstart these ganglion cells by using a light-catching device that could produce critical neural signaling.

But past efforts to implant electrodes directly into the eye have only achieved a small degree of ganglion stimulation, and alternate strategies using gene therapy to insert light-sensitive proteins directly into the retina have also fallen short, the researchers said.

Nirenberg theorized that stimulation alone wasn’t enough if the neural signals weren’t exact replicas of those the brain receives from a healthy retina.

“So, what we did is figure out this code, the right set of mathematical equations,” Nirenberg explained. And by incorporating the code right into their prosthetic device’s chip, she and Pandarinath generated the kind of electrical and light impulses that the brain understood.
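Nirenberg's actual equations are in the paper; the sketch below is a generic stand-in, not the authors' model. It shows the overall shape of such an encoder as a linear-nonlinear cascade: filter the image through a receptive field, rectify, and emit a spike count. The receptive field, gain and nonlinearity here are made-up placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

def ln_encoder(image, receptive_field, gain=5.0, baseline=0.1):
    """Toy linear-nonlinear encoder for one ganglion cell: weight the
    image by a receptive field, rectify, and return a firing rate."""
    drive = float(np.sum(image * receptive_field))  # linear stage
    return baseline + gain * max(drive, 0.0)        # rectifying nonlinearity

def spike_count(rate, duration=1.0):
    """Poisson spike count at the given rate -- the 'code' sent onward."""
    return int(rng.poisson(rate * duration))

# Crude center-surround receptive field on an 8x8 image patch.
image = rng.random((8, 8))
receptive_field = -np.ones((8, 8)) / 64.0
receptive_field[3:5, 3:5] += 0.5

rate = ln_encoder(image, receptive_field)
print("firing rate:", rate, "spikes in 1 s:", spike_count(rate))
```

The prosthetic's job, in this framing, is to compute the right rate for each ganglion cell fast enough that the resulting spike trains match what a healthy retina would have produced.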

The team also used gene therapy to hypersensitize the ganglion output cells and get them to deliver the visual message up the chain of command.

Behavioral tests were then conducted among blind mice given a code-outfitted retinal prosthetic and among those given a prosthetic that lacked the code in question.

The result: The code group fared dramatically better on visual tracking than the non-code group, with the former able to distinguish images nearly as well as mice with healthy retinas.

“Now we hope to move on to human trials as soon as possible,” said Nirenberg. “Of course, we have to conduct standard safety studies before we get there. And I would say that we’re looking at five to seven years before this is something that might be ready to go, in the best possible case. But we do hope to start clinical trials in the next one to two years.”

Results achieved in animal studies don’t necessarily translate to humans.

Dr. Alfred Sommer, a professor of ophthalmology at Johns Hopkins University in Baltimore and dean emeritus of Hopkins’ Bloomberg School of Public Health, urged caution about the findings.

“This could be revolutionary,” he said. “But I doubt it. It’s a very, very complicated business. And people have been working on it intensively and incrementally for the last 30 years.”

“The fact that they have done something that sounds a little bit better than the last set of results is great,” Sommer added.  “It’s terrific. But this approach is really in its infancy. And I guarantee that it will be a long time before they get to the point where they can really restore vision to people using prosthetics.”

Other advances may offer benefits in the meantime, he said. “We now have new therapies that we didn’t have even five years ago,” Sommer said. “So we may be reaching a state where the amount of people losing their sight will decline even as these new techniques for providing artificial vision improve. It may not be as sci-fi. But I think it’s infinitely more important at this stage.”

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.