
Shorter sleep duration among children was associated with increased risk for depression, anxiety, impulsive behavior and poor cognitive performance, according to study findings published in Molecular Psychiatry.

“Sleep disturbances are common among children and adolescents around the world, with approximately 60% of adolescents in the United States receiving less than 8 hours of sleep on school nights,” Jianfeng Feng, PhD, of the department of computer science at the University of Warwick in the UK, told Healio Psychiatry. “An important public health implication is that psychopathology in both children and their parents should be considered in relation to sleep problems in children. Further, we showed that brain structure is associated with sleep problems in children and that this is related to whether the child has depressive problems.”

According to Feng and colleagues, the present study is the first large-scale research effort to analyze how sleep duration in children relates to psychiatric problems, including depression, as well as to brain structure and cognition. They analyzed measures in these areas using data from the Adolescent Brain Cognitive Development Study, which included structural MRI data from 11,067 individuals aged 9 to 11 years.

The researchers found that depression, anxiety and impulsive behavior were negatively correlated with sleep duration. Dimensional psychopathology in participants’ parents was correlated with short sleep duration in the children. Feng and colleagues noted that the orbitofrontal cortex, prefrontal and temporal cortex, precuneus and supramarginal gyrus were brain areas in which higher volume was correlated with longer sleep duration. According to longitudinal data analysis, psychiatric problems, particularly depressive problems, were significantly associated with short sleep duration 1 year later. Moreover, they found that depressive problems significantly mediated these brain regions’ effect on sleep. Higher volume of the prefrontal cortex, temporal cortex and medial orbitofrontal cortex was associated with higher cognitive scores.
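To make “mediated” concrete, the sketch below runs a simple regression-based mediation analysis, asking whether a depression score carries part of the association between regional brain volume and sleep duration. It is an illustration only, using simulated data and placeholder variable names rather than the study’s actual ABCD measures or analysis code.

```python
# Illustrative regression-based mediation analysis (simulated data, not the
# study's code): does a depression score carry part of the association
# between regional brain volume and sleep duration?
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
brain_volume = rng.normal(size=n)                       # standardized regional volume
depression = -0.4 * brain_volume + rng.normal(size=n)   # hypothetical mediator
sleep_hours = 9 + 0.2 * brain_volume - 0.5 * depression + rng.normal(size=n)

# Total effect of volume on sleep.
total = sm.OLS(sleep_hours, sm.add_constant(brain_volume)).fit()

# Direct effect of volume on sleep, controlling for the mediator.
X = sm.add_constant(np.column_stack([brain_volume, depression]))
direct = sm.OLS(sleep_hours, X).fit()

# Effect of volume on the mediator.
mediator = sm.OLS(depression, sm.add_constant(brain_volume)).fit()

# Indirect (mediated) effect = a * b.
indirect = mediator.params[1] * direct.params[2]
print(f"total effect:    {total.params[1]:.2f}")
print(f"direct effect:   {direct.params[1]:.2f}")
print(f"indirect effect: {indirect:.2f}")
```

If the indirect effect accounts for much of the total effect, the mediator explains most of the link, which is the pattern the authors report for depressive problems.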

“Our findings showed that 53% of children received less than 9 hours of sleep per night,” Feng said. “More importantly, the behavior problems total score for children with less than 7 hours of sleep was 53% higher on average and the cognitive total score was 7.8% lower on average than for children with 9 to 11 hours of sleep. We hope this study attracts public attention to sleep problems in children and provides evidence for governments to develop advice about sleep for children.” – by Joe Gramigna

https://www.healio.com/psychiatry/depression/news/online/%7B7440e93a-fe6a-4154-88f4-a5858d16c4cb%7D/children-with-less-sleep-experience-increased-depression-anxiety-decreased-cognitive-performance

A promising molecule has offered hope for a new treatment that could stop or slow Parkinson’s, something no treatment can currently do.

Researchers from the University of Helsinki found that the molecule BT13 has the potential both to boost levels of dopamine, the chemical that is lost in Parkinson’s, and to protect the dopamine-producing brain cells from dying.

The results from the study, co-funded by Parkinson’s UK and published online today in the journal Movement Disorders, showed an increase in dopamine levels in the brains of mice following the injection of the molecule. BT13 also activated a specific receptor in the mouse brains to protect the cells.

Typically, by the time people are diagnosed with Parkinson’s, they have already lost 70-80 per cent of their dopamine-producing cells, which are involved in coordinating movement.

While current treatments mask the symptoms, nothing can slow the condition’s progression or prevent more brain cells from being lost, and as dopamine levels continue to fall, symptoms get worse and new symptoms can appear.

Researchers are now working on improving the properties of BT13 to make it more effective as a potential treatment which, if successful, could benefit the 145,000 people living with Parkinson’s in the UK.

The study builds on previous research on another molecule that targets the same receptors in the brain, glial cell line-derived neurotrophic factor (GDNF), an experimental treatment for Parkinson’s which was the subject of a BBC documentary in February 2019. While the results were not clear cut, GDNF has shown promise to restore damaged cells in Parkinson’s.

However, the GDNF protein requires complex surgery to deliver the treatment to the brain because it’s a large molecule that cannot cross the blood-brain barrier – a protective barrier that prevents some drugs from getting into the brain.

BT13, a smaller molecule, is able to cross the blood-brain barrier – and therefore could be more easily administered as a treatment, if shown to be beneficial in further clinical trials.

Professor David Dexter, Deputy Director of Research at Parkinson’s UK, said:

“People with Parkinson’s desperately need a new treatment that can stop the condition in its tracks, instead of just masking the symptoms.

“One of the biggest challenges for Parkinson’s research is how to get drugs past the blood-brain barrier, so the exciting discovery of BT13 has opened up a new avenue for research to explore, and the molecule holds great promise as a way to slow or stop Parkinson’s.

“More research is needed to turn BT13 into a treatment to be tested in clinical trials, to see if it really could transform the lives of people living with Parkinson’s.”

Dr Yulia Sidorova, lead researcher on the study, said: “We are constantly working on improving the effectiveness of BT13. We are now testing a series of similar BT13 compounds, which were predicted by a computer program to have even better characteristics.

“Our ultimate goal is to progress these compounds to clinical trials in a few coming years.”

Molecule offers hope for halting Parkinson’s


Dr. Anjali Rajadhyaksha
Professor of Neuroscience in Pediatrics
Associate Dean of Program Development
Weill Cornell Graduate School


Dr. Francis Lee
Psychiatry/Pharmacology; Chair and Psychiatrist-in-Chief
Mortimer D. Sackler, M.D. Professor in Psychiatry, Weill Cornell Medicine


Dr. Caitlin Burgdorf

A common variation in a human gene that affects the brain’s reward processing circuit increases vulnerability to the rewarding effects of the main psychoactive ingredient of cannabis in adolescent females, but not males, according to preclinical research by Weill Cornell Medicine investigators. As adolescence represents a highly sensitive period of brain development with the highest risk for initiating cannabis use, these findings in mice have important implications for understanding the influence of genetics on cannabis dependence in humans.

The brain’s endocannabinoid system regulates activity of cannabinoids that are normally produced by the body to influence brain development and regulate mood, as well as those from external sources, such as the psychoactive ingredient THC, also known as Δ9-tetrahydrocannabinol, which is found in cannabis. An enzyme called fatty acid amide hydrolase (FAAH) breaks down a cannabinoid called anandamide that is naturally found in the brain and is most closely related to THC, helping to remove it from circulation.

In the study, published Feb. 12 in Science Advances, the investigators examined mice harboring a human gene variant that causes FAAH to degrade more easily, increasing overall anandamide levels in the brain. They discovered that the variant produced an overactive reward circuit in female, but not male, adolescent mice, which led to a higher preference for THC in the females. Previous clinical studies had linked this FAAH variant with increased risk for problem drug use, but none had specifically examined its mechanistic effect on cannabis dependence.

“Our study shows that a variant in the FAAH gene, which is found in about one-third of people, increases vulnerability to THC in females and has large-scale impact on brain regions and pathways responsible for processing reward,” said lead author Dr. Caitlin Burgdorf, a recent doctoral graduate from the Weill Cornell Graduate School of Medical Sciences. “Our findings suggest that genetics can be a contributing factor for increased susceptibility to cannabis dependence in select populations.”

The team found that female mice with the FAAH variant showed an increased preference for the environment in which they’d been exposed to THC over a neutral environment when they were exposed to the substance during adolescence, and the effect persisted into adulthood. However, if female mice with this variant were exposed to THC for the first time in adulthood, there was no increased preference for THC. These findings in mice parallel observations in humans that a select population of females are more sensitive to the effects of cannabis and demonstrate a quicker progression to cannabis dependence. “Our findings suggest that we have discovered a genetic factor to potentially identify subjects at risk for cannabis dependence,” said Dr. Burgdorf.

The investigators also found that the genetic variant led to increased neuronal connections and neural activity between two regions of the brain heavily implicated in reward behavior. Next, the team reversed the overactive reward circuit in female mice and found that decreasing circuit activity dampened the rewarding effects of THC.

As substance abuse disorders often emerge during adolescence, the investigators say these findings have significant implications for understanding the developmental and genetic risk factors for cannabis dependence in humans.

“Our study provides new insights into cannabis dependence and provides us with a circuit and molecular framework to further explore the mechanisms of cannabis dependence,” said co-senior author Dr. Anjali Rajadhyaksha, professor of neuroscience in pediatrics and associate professor of neuroscience in the Feil Family Brain and Mind Research Institute and a member of the Drukier Institute for Children’s Health at Weill Cornell Medicine.

Although genetic factors are increasingly found to be associated with risk for other types of addiction, very few studies have investigated genetic factors associated with increasing risk for cannabis dependence. “In the future, we could use the presence of this FAAH genetic variant to potentially predict if an individual is more likely to be vulnerable to cannabis dependence,” said co-senior author, Dr. Francis Lee, chair of the Department of Psychiatry at Weill Cornell Medicine and psychiatrist-in-chief at NewYork-Presbyterian/Weill Cornell Medical Center. “We are getting one step closer to understanding exactly how neurodevelopmental and genetic factors play interrelated roles to increase susceptibility for cannabis dependence.”

Additional authors on the study were Dr. Deqiang Jing, Ruirong Yang and Chienchum Huang from the Department of Psychiatry at Weill Cornell Medicine; Drs. Teresa A. Milner and Virginia M. Pickel from the Feil Family Brain and Mind Research Institute at Weill Cornell Medicine; Dr. Matthew N. Hill from the departments of Cell Biology and Anatomy and Psychiatry at the University of Calgary; and Dr. Ken Mackie from the Department of Psychological and Brain Sciences at Indiana University Bloomington.

This research was supported by the National Institutes of Health (Grants T32DA039080, R01DA08259, R01HL098351, R01HL136520, R01DA042943, R01NS052819, R01DA029122), Weill Cornell’s Mowrer Memorial Graduate Student Fellowship, NewYork-Presbyterian Youth Anxiety Center, the Pritzker Neuropsychiatric Disorders Research Consortium, the DeWitt-Wallace Fund of the New York Community Trust, and The Paul Fund.

https://news.weill.cornell.edu/news/2020/02/preclinical-study-links-human-gene-variant-to-thc-reward-in-adolescent-females


The Harvard Medical School researcher’s work on the genetic basis of protein coding and production led him to make groundbreaking discoveries in immunology, molecular biology, and cancer genetics.

by ASHLEY YEAGER

Harvard Medical School molecular geneticist Philip Leder died last week (February 2). He was 85.

Leder was revered for his work in molecular biology, immunology, and cancer genetics. His first scientific breakthrough came in the 1960s when he was working as a postdoc in geneticist Marshall Nirenberg’s lab at the National Institutes of Health (NIH). Together they developed a technique that confirmed that amino acids were encoded by sequences of three nucleotides and helped resolve which triplet codons correspond to which amino acids.

From there, Leder went on to determine the first complete sequence of a mammalian gene, develop the first recombinant DNA vector system safe for use in the lab, identify the structure of genes that encode antibody molecules, discover a gene that caused cancer, and develop the first mouse model of cancer.

“Phil Leder was special. Among great scientists, he was special, and among scientists, he was an icon,” David Livingston, a geneticist at Harvard who worked in Leder’s lab at NIH, tells The Scientist. “He was gifted. He was generous. He was a splendid person to listen to talk, to run experiments by, and be criticized by. He was a splendid human being on top of all of it.”

Leder was born on November 19, 1934 in Washington, DC, and grew up there. He attended Western High School, graduated in 1952, and went on to study at Harvard University. He interned at NIH as an undergraduate, working in biochemist Martha Vaughan’s lab in the National Heart Institute, which is now the National Heart, Lung, and Blood Institute. He finished his bachelor’s degree at Harvard in 1956 and stayed there for medical school, graduating in 1960.

After a two-year residency program at the University of Minnesota Hospitals, he returned to NIH to work with Nirenberg. Leder dove headfirst into the race to decipher the way genes encode proteins and helped to design a filtering instrument to rapidly test 45 amino acid samples simultaneously, instead of one at a time. Leder and Nirenberg could quickly tag amino acids with a radioactive label, bind them to triplet RNA sequences, and put them into the filtering instrument, which helped the team decode unknown amino acid codon sequences, well before other scientists could, according to a remembrance on Leder posted by NIH.

It was one of the most exciting times in Leder’s life, he said. “I would go to bed thinking about the next day’s experiments and then jump out of bed in the morning and rush to the laboratory,” he recalled in a 2012 interview with American Society for Biochemistry and Molecular Biology Today. “I stayed late at night. It was a lot of work, but the intellectual excitement was enormous.” The two published their work on the codons in 1964.

Leder’s “work w/Marshall Nirenberg set the stage for the revolution in molecular genetics,” NIH director Francis Collins wrote on Twitter last Friday (February 7).

In 1965, Leder joined the Weizmann Institute in Rehovot, Israel, as a visiting scientist and stayed until 1966. He then returned to the NIH, serving as a research medical officer in the National Cancer Institute from 1966 to 1969 before becoming head of the Section on Molecular Genetics in the Laboratory of Molecular Genetics at the National Institute of Child Health and Human Development. In 1972, he was promoted to director of the lab.

During this time and through the 1970s, he and his colleagues worked on deciphering the genetic sequence of alpha globin, a component of hemoglobin, the protein in red blood cells that carries oxygen to the body’s cells and tissues. His work also revealed important details about the genes that encode antibodies, showing that antibody synthesis is regulated not only by genetics but also by biochemical processes that ensure specificity for the right antigen presented by viruses, bacteria, or other invaders in the body.

What made Leder such an outstanding scientist, Livingston explains, was his immense rigor. Control experiments, for example, had to be “at least as incisive or demanding and rigorous as the actual experiments . . . to prove that nothing in the discovery experiment was an artifact,” he says. “And he had an immensely adventurous mind. No problem was beyond at least discussion,” which made Leder unique as a mentor. “In fact, his ability to mentor was internationally celebrated,” Livingston explains. “You could listen to his talks, and you knew he was a fantastic teacher because his mind was utterly clear.”

Leder joined Harvard Medical School (HMS) in 1980, founding its genetics department in 1981 and chairing the department for 25 years. His research there led to the discovery of a specific gene, MYC. With Harvard colleague Timothy Stewart, Leder began using a fine glass needle to insert the cancer-causing gene into mouse embryos just after fertilization, thereby creating OncoMouse, a genetic line of mice that were prone to developing the disease. The duo patented the animal in 1988, giving researchers an unprecedented tool to study cancer and how to treat it.

His work at Harvard was not limited to his research. He made fundamental changes to hiring, instituting nationwide searches for new assistant professors in the genetics department, which increased the likelihood of hiring women, notes Jonathan Seidman, a geneticist at Harvard who worked in Leder’s lab at NIH in the 1970s. Leder also made sure the department didn’t get too big, Seidman says, and he insisted that if faculty were on different floors, spiral staircases—rather than drab stairwells—would connect them, making it easy for researchers to communicate and collaborate.

Leder’s “contributions to science and to HMS cannot be overstated, and he will never be forgotten,” George Daley, Harvard’s dean of the faculty of medicine, wrote to colleagues on February 4.

For his work, Leder was honored with the Albert Lasker Award for Basic Medical Research, the US National Medal of Science, the Heineken Prize from the Royal Netherlands Academy of Arts and Sciences, and the William Allan Medal from the American Society of Human Genetics. He was a member of the National Academy of Sciences, a Fellow of the American Association for the Advancement of Science, and a Howard Hughes Medical Institute investigator.

Surviving him are his wife, Aya Leder, his children, Micki, Tani, and Ben, his daughters-in-law, Karen Leder and Mary Leder, and his grandchildren, Jacob, David, Sarah, Eli, Alex, Matt, Amanda, and Annie.


https://www.the-scientist.com/news-opinion/philip-leder–who-deciphered-amino-acid-sequences–dies-67096

Researchers from Case Western Reserve University School of Medicine, University Hospitals Cleveland Medical Center (UH), Cleveland Clinic and Lifebanc (a Northeast Ohio organ-procurement organization) have developed a new way to preserve donated kidneys, a method that could increase the number and quality of kidneys available for transplant, saving more people with end-stage renal disease, more commonly known as “kidney failure.”

The team identified a drug, ethyl nitrite, that could be added to the preservation fluid to generate tiny molecules called S-nitrosothiols (SNOs), which regulate tissue-oxygen delivery. This, in turn, restored flow-through and reduced resistance within the kidney. Higher flow-rates and lower resistance are associated with better kidney function after transplantation.

Their research was funded by a grant from the Roche Organ Transplant Research Foundation and recently published in Annals of Surgery.

The United States has one of the world’s highest incidences of end-stage renal disease, and the number of afflicted individuals continues to increase. The prevalence of end-stage renal disease more than doubled between 1990 and 2016, according to the Centers for Disease Control and Prevention.

The optimal treatment is a kidney transplant, but demand far exceeds supply. Additionally, donation rates for deceased donors have been static for several years, despite various public-education campaigns, resulting in fewer kidneys available for transplant. And while the proportion and number of living donors have increased, kidneys from living donors still make up only a small percentage of those recovered for transplant.

Increasing the number of kidneys available for transplant benefits patients by extending lifespans and enhancing quality of life, and it could also reduce medical costs (a transplant is cheaper than ongoing dialysis). To help improve outcomes for kidney transplant patients, the team explored ways to extend the viability of donated kidneys.

Improvements in surgical techniques and immunosuppression therapies have made kidney transplants a relatively common procedure. However, less attention has been paid to maintaining/improving kidney function during the kidney-transport phase.

“We addressed this latter point through developing enhanced preservation methods,” said senior author James Reynolds, professor of Anesthesiology and Perioperative Medicine at Case Western Reserve School of Medicine and a member of the Harrington Discovery Institute at UH.

For decades, procured kidneys were simply flushed with preservation solution and then transported in ice-filled coolers to the recipient’s hospital. But advances in pumping technology slowly changed the field toward active storage, the preferred method for conveying the organ from donor to recipient.

“However, while 85% of kidneys are now pumped, up to 20% of kidneys are determined to be unsuitable for transplant during the storage phase,” said Kenneth Chavin, professor of surgery at the School of Medicine, chief of hepatobiliary and transplant surgery and director of the UH Transplant Institute.

“For several years, our team has directed research efforts toward understanding and improving the body’s response to medical manipulation,” Reynolds said. “Organ-donor physiology and ‘transport status’ fit well within this metric. We identified a therapy that might improve kidney perfusion, a significant factor in predicting how the organ will perform post-transplant.”

Previous work by Reynolds and long-time collaborator Jonathan Stamler, the Robert S. and Sylvia K. Reitman Family Foundation Distinguished Chair in Cardiovascular Innovation and president of the Harrington Discovery Institute, determined that brain death significantly reduces SNOs, which impairs blood-flow and tissue-oxygenation to the kidneys and other commonly transplanted organs. The loss of SNOs is not corrected by current preservation fluids, so impaired flow through the kidneys continues during storage and transport.

http://7thspace.com/headlines/1099047/novel_drug_therapy_shows_promise_for_quality__quantity_of_kidneys_available_for_transplant.html

By Jason Arunn Murugesu

An AI can predict from people’s brainwaves whether an antidepressant is likely to help them. The technique may offer a new approach to prescribing medicines for mental illnesses.

Antidepressants don’t always work, and we aren’t sure why. “We have a central problem in psychiatry because we characterise diseases by their end point, such as what behaviours they cause,” says Amit Etkin at Stanford University in California. “You tell me you’re depressed, and I don’t know any more than that. I don’t really know what’s going on in the brain and we prescribe medication on very little information.”

Etkin wanted to find out if a machine-learning algorithm could predict from the brain scans of people diagnosed with depression who was most likely to respond to treatment with the antidepressant sertraline. The drug is typically effective in only a third of the people who take it.

He and his team gathered electroencephalogram (EEG) recordings showing the brainwaves of 228 people aged between 18 and 65 with depression. These individuals had previously tried antidepressants, but weren’t on such drugs at the start of the study.

Roughly half the participants were given sertraline, while the rest got a placebo. The researchers then monitored the participants’ mood over eight weeks, measuring any changes using a depression rating scale.

Brain activity patterns
By comparing the EEG recordings of those who responded well to the drug with those who didn’t, the machine-learning algorithm was able to identify a specific pattern of brain activity linked with a higher likelihood of finding sertraline helpful.

The team then tested the algorithm on a different group of 279 people. Although only 41 per cent of participants overall responded well to sertraline, 76 per cent of those the algorithm predicted would benefit did so.
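For readers curious how such a predictor is put together, here is a rough sketch of the workflow: train a classifier on EEG-derived features, then compare the response rate among people it flags as likely responders with the overall rate. It uses simulated data and a generic logistic-regression model, not the authors’ actual EEG features or published algorithm.

```python
# Schematic of the "predict the responders" workflow with simulated data and
# a generic classifier -- not the authors' actual EEG features or model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_subjects, n_features = 500, 64            # e.g. one feature per EEG channel
X = rng.normal(size=(n_subjects, n_features))
# Simulate a treatment-response signal carried by a few channels.
responded = (X[:, :5].sum(axis=1) + rng.normal(size=n_subjects)) > 0.5

X_train, X_test, y_train, y_test = train_test_split(
    X, responded, test_size=0.5, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
flagged = model.predict(X_test)             # predicted likely responders

print(f"response rate overall:          {y_test.mean():.0%}")
print(f"response rate in flagged group: {y_test[flagged].mean():.0%}")
```

The gap between the two printed rates is the kind of improvement the study reports: 41 per cent responding overall versus 76 per cent among those the model picked out.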

Etkin has founded a company called Alto Neuroscience to develop the technology. He hopes it will lead to more efficient sertraline prescribing by giving doctors “the tools to make decisions about their patients using objective tests, decisions that they’re currently making by chance”.

This AI “could have potential future relevance to patients with depression”, says Christian Gluud at the Copenhagen Trial Unit in Denmark. But the results need to be replicated by other researchers “before any transfer to clinical practice can be considered”, he says.

Journal reference: Nature Biotechnology, DOI: 10.1038/s41587-019-0397-3

Read more: https://www.newscientist.com/article/2232792-brain-scans-can-help-predict-wholl-benefit-from-an-antidepressant/#ixzz6DeyTJYpK

By Jason Dorrier

It’s been over a decade since artificial retinas first began helping the blind see. But for many people whose blindness originates beyond the retina, the technology falls short. Which is why new research out of Spain skips the eye entirely, instead sending signals straight to the brain’s visual cortex.

Amazingly, 15 years after losing her sight, Bernardeta Gómez, who suffers from toxic optic neuropathy, used the experimental technology to recognize lights, letters, shapes, people—and even to play a basic video game sent directly to her brain via an implant.

According to MIT Technology Review, Gómez first began working with researchers in late 2018. Over the next six months, she spent four days a week dialing in the technology’s settings and testing its limits.

The system, developed by Eduardo Fernandez, director of neuroengineering at the University of Miguel Hernandez, works like this.

A camera embedded in a pair of thick, black-rimmed glasses records Gómez’s field of view and sends it to a computer. The computer translates the data into electrical impulses the brain can read and forwards it to a brain implant by way of a cable plugged into a port in the skull. The implant stimulates neurons in Gómez’s visual cortex, which her brain interprets as incoming sensory information. Gómez perceives a low-resolution depiction of her surroundings in the form of yellow dots and shapes called phosphenes which she’s learned to interpret as objects in the world around her.
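As a purely conceptual sketch of that camera-to-phosphene path (not the actual implant software), the snippet below downsamples a grayscale frame to the 10-by-10 grid mentioned later in this piece and thresholds each cell into an on/off phosphene.

```python
# Toy version of the camera-to-phosphene idea: average a grayscale frame over
# a coarse grid and switch each cell "on" if it is brighter than average.
# A conceptual sketch only, not the implant's real encoding software.
import numpy as np

def frame_to_phosphenes(frame: np.ndarray, grid: int = 10) -> np.ndarray:
    """Reduce a 2-D grayscale frame to a grid x grid on/off phosphene map."""
    h, w = frame.shape
    bh, bw = h // grid, w // grid
    # Mean brightness of each block of the (cropped) frame.
    blocks = frame[:bh * grid, :bw * grid].reshape(grid, bh, grid, bw)
    brightness = blocks.mean(axis=(1, 3))
    # Brighter-than-average cells become "on" phosphenes (stimulated electrodes).
    return (brightness > brightness.mean()).astype(int)

# Example: a fake 480x640 frame with a bright rectangle in the middle.
frame = np.zeros((480, 640))
frame[180:300, 240:400] = 1.0
print(frame_to_phosphenes(frame))
```

The real system does far more (electrode mapping, calibration per patient, safety limits on stimulation), but the basic idea is the same: a rich camera image is reduced to a very coarse pattern the visual cortex can be made to perceive.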

The technology itself is still very much in the early stages—Gómez is the first to test it—but the team aims to work with five more patients in the next few years. Eventually, Fernandez hopes their efforts can help return sight to many more of the world’s blind people.

A Brief History of Artificial Eyes

This isn’t the first time researchers have used technology to help the blind see again.

Roughly two decades ago, the Artificial Retina Project brought together a number of research institutions to develop a device for those suffering retina-destroying diseases. The work resulted in the Argus systems, which, like Fernandez’s system, use a camera mounted on glasses, a computer to translate sensory data, and an implant with an array of electrodes embedded in the retina (instead of the brain).

Over the course of about a decade, researchers developed the Argus I and Argus II systems, ran them through human trials, and gained approval in Europe (2011) and the US (2013) to sell their bionic eyes to eligible individuals.

According to MIT Technology Review, around 350 people use Argus II today, but the company marketing the devices, Second Sight, has pivoted from artificial retinas to the brain itself because far more people, like Gómez, suffer from damage to the neural pathways between eyes and brain.

Just last year, Second Sight was involved in research, along with UCLA and Baylor, testing a system that also skips the retina and sends visual information straight to the brain.

The system, called Orion, is similar to Argus II. A feed from a video camera mounted on dark glasses is converted to electric pulses sent to an implant that stimulates the brain. The device is wireless and includes a belt with a button to amplify dark objects in the sun or light objects in the dark. Like Fernandez’s system, the user sees a low-resolution pattern of phosphenes they interpret as objects.

“I’ll see little white dots on a black background, like looking up at the stars at night,” said Jason Esterhuizen, who was the second research subject to receive the device. “As a person walks toward me, I might see three little dots. As they move closer to me, more and more dots light up.”

Though the research is promising—it’s designated an FDA Breakthrough Device and is being trialed with six patients—Dr. Daniel Yoshor, study leader and neurosurgeon, cautioned the Guardian last year that it’s “still a long way from what we hope to achieve.”

The Road Ahead

Brain implants are far riskier than eye implants, and if the original Argus system is any indication, it may be years before these new devices are used widely beyond research.

Still, brain-machine interfaces (BMIs) are quickly advancing on a number of fronts.

The implant used in Fernandez’s research is a fairly common device called a Utah array. The square array is a few millimeters wide and contains 100 electrode spikes which are inserted into the brain. Each spike stimulates a few neurons. Similar implants have helped paralyzed folks control robotic arms and type messages with just their thoughts.

Though they’ve been the source of several BMI breakthroughs, the arrays aren’t perfect.

The electrodes damage surrounding brain tissue, scarring renders them useless all too quickly, and they only interact with a handful of neurons. The ideal device would be wireless, last decades in the brain—limiting the number of surgeries needed—and offer greater precision and resolution.

Fernandez believes his implant can be modified to last decades, and while the current maximum resolution is 10 by 10 pixels, he envisions one day implanting as many as six implants on each side of the brain to deliver a resolution of at least 60 by 60 pixels.

In addition, new technologies are in the works. Famously, Elon Musk’s company Neuralink is developing soft, thread-like electrodes that are deftly laced into brain tissue by a robot. Neuralink is aiming to include 3,000 electrodes on their device to chat up far more neurons than is currently possible (though it’s not clear whether there’s a limit to how many more neurons actually add value). Still other approaches, that are likely further out, do away with electrodes altogether, using light or chemicals to control gene-edited neurons.

Fernandez’s process also relies on more than just the hardware. The team used machine learning, for example, to write the software that translates visual information into neural code. This can be further refined, and in the coming years, as they work on the system as a whole, the components will no doubt improve in parallel.

But how quickly it all comes together in a product for wider use isn’t clear.

Fernandez is quick to dial back expectations—pointing out that these are still early experiments, and he doesn’t want to get anyone’s hopes up. Still, given the choice, Gómez said she’d have elected to keep the implant and wouldn’t think twice about installing version two.

“This is an exciting time in neuroscience and neurotechnology, and I feel that within my lifetime we can restore functional sight to the blind,” Yoshor said last year.

Blind Woman Sees With New Implant, Plays Video Game Sent Straight to Her Brain