Retinal screening for Alzheimer’s disease

A technology that originated at the University of Minnesota is well on its way to commercialization thanks to an investment award from the Alzheimer’s Drug Discovery Foundation (ADDF).

The investment of up to $500,000 was awarded through the ADDF’s Diagnostics Accelerator initiative. Toronto, Ontario-based RetiSpec licensed the technology, which harnesses hyperspectral imaging and machine learning, through the University of Minnesota’s Technology Commercialization program.
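
The article gives no detail about RetiSpec’s algorithms, but the basic pairing of hyperspectral imaging with machine learning can be illustrated with a small sketch: each scan is summarized as a reflectance spectrum across many wavelength bands, and a classifier is trained to separate amyloid-positive from amyloid-negative scans. Everything below (the band count, array shapes, random stand-in data, and the logistic-regression model) is a hypothetical illustration, not RetiSpec’s method.

```python
# Illustrative sketch only: a generic hyperspectral classifier, not RetiSpec's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical dataset: 200 retinal scans, each summarized as a mean reflectance
# spectrum over 60 wavelength bands, plus a binary amyloid-status label.
n_scans, n_bands = 200, 60
X = rng.normal(size=(n_scans, n_bands))   # stand-in for real spectra
y = rng.integers(0, 2, size=n_scans)      # stand-in for amyloid PET status

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# On real data the AUC would show how well spectral features separate the groups;
# on this random stand-in it should hover near 0.5 (chance).
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```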

“We are focused on bringing to market a noninvasive, easy-to-use, screening technology that can change when and how we detect Alzheimer’s disease at its earliest stages including before a patient presents with symptoms,” said Eliav Shaked, CEO of RetiSpec. “Early detection provides an important window of opportunity for timely therapeutic interventions that can slow or even prevent the progression of Alzheimer’s disease. ADDF’s investment represents another point of external validation of the promise of our technology.”

In preclinical studies and a pilot human study, the retinal imaging technology was effective in detecting small changes in biomarkers associated with elevated cerebral amyloid beta levels early in the disease process, including before the onset of clinical symptoms.

RetiSpec is currently collaborating with the Toronto Memory Program, Canada’s largest Alzheimer’s clinical trial site, to validate the accuracy and usability of the technology in patients.

“We believe that RetiSpec’s retinal scanner stands out and shows promise as a unique diagnostic tool among a range of technologies in development,” said Howard Fillit, MD, founding executive director and chief science officer of the ADDF. “The technology has the potential to facilitate early diagnosis, improve the lives of patients and their loved ones, and save the healthcare system money and resources. The technology will also be useful in making clinical trials for Alzheimer’s disease more efficient.”

https://www.mddionline.com/feast-your-eyes-new-technology-early-alzheimers-screening

Could an eye doctor diagnose Alzheimer’s before you have symptoms?

By Samiha Khanna

A quick eye exam might one day allow eye doctors to check up on both your eyeglasses prescription and your brain health.

A study of more than 200 people at the Duke Eye Center published March 11 in the journal Ophthalmology Retina suggests the loss of blood vessels in the retina could signal Alzheimer’s disease. Authors of the study include the Neurology Department’s James Burke, MD, PhD, and Cynthia Dunn, PA-C.

In people with healthy brains, microscopic blood vessels form a dense web at the back of the eye inside the retina, as seen in 133 participants in a control group.

In the eyes of 39 people with Alzheimer’s disease, that web was less dense and even sparse in places. The differences in density were statistically significant after researchers controlled for factors including age, sex, and level of education, said Duke ophthalmologist and retinal surgeon Sharon Fekrat, MD, the study’s senior author.

“We’re measuring blood vessels that can’t be seen during a regular eye exam and we’re doing that with relatively new noninvasive technology that takes high-resolution images of very small blood vessels within the retina in just a few minutes,” she said. “It’s possible that these changes in blood vessel density in the retina could mirror what’s going on in the tiny blood vessels in the brain, perhaps before we are able to detect any changes in cognition.”

The study found differences in the retinas of those with Alzheimer’s disease when compared to healthy people and to those with mild cognitive impairment, often a precursor to Alzheimer’s disease.

With nearly 6 million Americans living with Alzheimer’s disease and no viable treatments or noninvasive tools for early diagnosis, its burden on families and the economy is heavy. Scientists at Duke Eye Center and beyond have studied other changes in the retina that could signal trouble upstream in the brain, such as thinning of some of the retinal nerve layers.

“We know that there are changes that occur in the brain in the small blood vessels in people with Alzheimer’s disease, and because the retina is an extension of the brain, we wanted to investigate whether these changes could be detected in the retina using a new technology that is less invasive and easy to obtain,” said Dilraj S. Grewal, M.D., a Duke ophthalmologist and retinal surgeon and a lead author on the study. The Duke study used a noninvasive technology called optical coherence tomography angiography (OCTA). OCTA machines use light waves that reveal blood flow in every layer of the retina.

An OCTA scan could even reveal changes in tiny capillaries — most less than half the width of a human hair — before blood vessel changes show up on a brain scan such as an MRI or cerebral angiogram, which highlight only larger blood vessels. Such techniques to study the brain are invasive and costly.
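
The article does not describe the image processing behind these measurements, but the vessel-density metric it refers to can be illustrated simply: binarize an OCTA en-face angiogram into vessel and non-vessel pixels and report the vessel fraction. The sketch below is a minimal version of that idea, using a naive global threshold and synthetic images rather than the Duke group’s actual segmentation pipeline.

```python
# Minimal sketch: estimating retinal vessel density from an OCTA en-face image.
# This illustrates the general idea, not the Duke study's pipeline.
import numpy as np

def vessel_density(enface: np.ndarray, threshold: float = 0.5) -> float:
    """Fraction of pixels classified as vessel after a simple global threshold.

    `enface` is assumed to be a 2-D OCTA en-face angiogram scaled to [0, 1].
    Real pipelines use adaptive thresholding and artifact removal instead.
    """
    vessel_mask = enface >= threshold
    return float(vessel_mask.mean())

# Synthetic example: a dense web of "vessels" versus a sparser one.
rng = np.random.default_rng(1)
dense_web = (rng.random((304, 304)) < 0.45).astype(float)
sparse_web = (rng.random((304, 304)) < 0.30).astype(float)

print(f"control-like density:      {vessel_density(dense_web):.2f}")
print(f"Alzheimer's-like density:  {vessel_density(sparse_web):.2f}")
```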

“Ultimately, the goal would be to use this technology to detect Alzheimer’s early, before symptoms of memory loss are evident, and be able to monitor these changes over time in participants of clinical trials studying new Alzheimer’s treatments,” Fekrat said.

In addition to Fekrat and Grewal, study authors include Stephen P. Yoon, Atalie C. Thompson, Bryce W. Polascik, Cynthia Dunn and James R. Burke.

The research was supported by the National Institutes of Health (P30EY005722), the 2018 Unrestricted Grant from Research to Prevent Blindness, and the Karen L. Wrenn Alzheimer’s Disease Award.

https://neurology.duke.edu/about/news/could-eye-doctor-diagnose-alzheimer%E2%80%99s-you-have-symptoms

Scientists Find A Brain Circuit That Could Explain Seasonal Depression


Before light reaches these rods and cones in the retina, it passes through some specialized cells that send signals to brain areas that affect whether you feel happy or sad.

by Jon Hamilton

Just in time for the winter solstice, scientists may have figured out how short days can lead to dark moods.

Two recent studies suggest the culprit is a brain circuit that connects special light-sensing cells in the retina with brain areas that affect whether you are happy or sad.

When these cells detect shorter days, they appear to use this pathway to send signals to the brain that can make a person feel glum or even depressed.

“It’s very likely that things like seasonal affective disorder involve this pathway,” says Jerome Sanes, a professor of neuroscience at Brown University.

Sanes was part of a team that found evidence of the brain circuit in people. The scientists presented their research in November at the Society for Neuroscience meeting. The work hasn’t been published in a peer-reviewed journal yet, but the researchers plan to submit it.

A few weeks earlier, a different team published a study suggesting a very similar circuit in mice.

Together, the studies offer a strong argument that seasonal mood changes, which affect about 1 in 5 people, have a biological cause. The research also adds to the evidence supporting light therapy as an appropriate treatment.

“Now you have a circuit that you know your eye is influencing your brain to affect mood,” says Samer Hattar, an author of the mouse study and chief of the section on light and circadian rhythms at the National Institute of Mental Health. The finding is the result of a decades-long effort to understand the elusive link between light and mood. “It is the last piece of the puzzle,” Hattar says.

The research effort began in the early 2000s, when Hattar and David Berson, a professor of neuroscience at Brown University, were studying cells in the retina.

At the time, most scientists thought that when light struck the retina, only two kinds of cells responded: rods and cones. But Hattar and Berson thought there were other light-sensitive cells that hadn’t been identified.

“People used to laugh at us if we say there are other photoreceptors distinct from rods and cones in the retina,” Hattar says.

The skeptics stopped laughing when the team discovered a third kind of photoreceptor that contained a light-sensitive substance called melanopsin not found in rods and cones. (The full name of these cells, if you’re interested, is intrinsically photosensitive retinal ganglion cells, or ipRGCs.) These receptors responded to light but weren’t part of the visual system.

Instead, their most obvious function was keeping the brain’s internal clock in sync with changes in daylight. And many scientists assumed that this circadian function also explained seasonal depression.

“People thought that the only reason you get mood problems is because your clock is misaligned,” Hattar says.

Other potential explanations included speculation that reduced sunlight was triggering depression by changing levels of serotonin, which can affect mood, or melatonin, which plays a role in sleep patterns and mood. But the evidence for either of these possibilities has been weak.

Hattar and Berson were pretty sure there was a better reason. And, after years of searching, they found one.

In September, Hattar’s team published a study about mice suggesting a direct pathway between the third kind of photoreceptor in the retina and brain areas that affect mood.

When these cells were present, an artificially shortened cycle of light and dark caused a version of depression in a mouse. But when the team removed the cells with gene-editing tools, the mouse didn’t become depressed.

Sanes knew about the research, in part because he and Berson are neuroscientists at Brown. And he was so intrigued by the discovery of the new pathway between retina and brain in mice that he decided to see whether something similar was going on in human brains.

Sanes’ team put young adults in an MRI machine and measured their brain activity as they were exposed to different levels of light. This allowed the team to identify brain areas that seemed to be receiving signals from the photoreceptors Hattar and Berson had discovered.

Two of these areas were in the front of the brain. “It’s interesting because these areas seem to be the areas that have been shown in many studies to be involved in depression and other affective disorders,” Sanes says.

The areas also appeared to be part of the same circuit found in mice.

The finding needs to be confirmed. But Hattar is pretty confident that this circuit explains the link between light exposure and mood.

So now he’s trying to answer a new question: Why would evolution produce a brain that works this way?

“You will understand why you would need light to see,” he says, “but why do you need light to make you happy?”

Hattar hopes to find out. In the meantime, he has some advice for people who are feeling low: “Try to take your lunch outside. That will help you adjust your mood.”

https://www.npr.org/sections/health-shots/2018/12/21/678342879/scientists-find-a-brain-circuit-that-could-explain-seasonal-depression

Scientists Have Detected an Entirely New Visual Phenomenon in The Human Eye

by DAVID NIELD

New research suggests the human eye and brain are capable of seeing ghosted images, a new type of visual phenomenon that scientists previously thought could only be detected by a computer. It turns out our eyes are more powerful than we thought.

The discovery could teach us more about the inner workings of the eye and brain and how they process information, as well as changing our thinking on what we human beings can truly see of the world around us.

Ghost imaging was developed as a low-cost way to capture images using light outside the visible spectrum, and the patterns it produces are usually processed by software algorithms – but, surprisingly, our eyes have the same capability.

“Ghost-imaging with the eye opens up a number of completely novel applications such as extending human vision into invisible wavelength regimes in real-time, bypassing intermediary screens or computational steps,” write the researchers.

“Perhaps even more interesting are the opportunities that ghost imaging offers for exploring neurological processes.”

Ghost imaging works using a camera with a single pixel, rather than the millions of pixels used by the sensors inside today’s digital cameras and smartphones. When it comes to capturing light beyond the visible spectrum, it’s an even more cost-effective method.

These single pixel cameras capture light as it reflects from an object – by watching different random patterns of bouncing light, and crunching through some calculations, the camera can gradually build up a picture of something even with just one pixel.

In some setups, the single pixel camera is used in combination with a second light, modulated in response to the first, and beamed back on the original random patterns. The advantage is that fewer patterns are needed to produce an image.

In this case a second camera using some smart algorithms can pick up the image without having looked at the object at all – just by looking at the patterns being cast and the light being produced from them.

That’s the ghosted image that was previously thought to be visible only to computers running specialist software. However, the new study shows that human visual perception can make sense of these patterns, called Hadamard patterns.

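To get a feel for how a single intensity value per Hadamard pattern can be turned back into an image, here is a toy numerical sketch of computational ghost imaging. It illustrates the general reconstruction principle, not the study’s setup, which projected the patterns for human observers rather than for a computer.

```python
# Toy computational ghost-imaging demo with Hadamard patterns (illustrative only).
import numpy as np
from scipy.linalg import hadamard

side = 16                  # reconstruct a 16x16 image => 256 Hadamard patterns
n_pixels = side * side

# The "object": a simple bright square on a dark background.
obj = np.zeros((side, side))
obj[4:12, 4:12] = 1.0

H = hadamard(n_pixels)     # each row is a +1/-1 Hadamard pattern

# Single-pixel "bucket" measurements: one total-intensity value per pattern.
measurements = H @ obj.ravel()

# Reconstruction: weight each pattern by its measurement and sum.
# For orthogonal Hadamard patterns this recovers the object exactly (up to scale).
recon = (H.T @ measurements) / n_pixels
recon = recon.reshape(side, side)

print("max reconstruction error:", np.abs(recon - obj).max())
```

In the experiment, the idea is that the projector plays back the measurement-weighted patterns quickly enough that the eye’s own temporal integration performs a similar summation.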

It’s a little bit like when our eyes and brains look at a series of still images and treat them as a moving picture – the same sort of subconscious processing seems to be going on.

Of the four volunteers who took part in the study, all four could make out an image of Albert Einstein sticking out his tongue from the Hadamard patterns. Interestingly, though, the illusion only appeared when the patterns were projected quickly enough.

If the rate dropped below 200 patterns per 20 milliseconds, the image couldn’t be seen by the study participants.

As the researchers point out, this is potentially hugely exciting – it means we might be able to devise simple systems to see light outside the visible spectrum, with no computer processing required in the middle.

That’s all to come – and this is really preliminary stuff, so we can’t get too carried away. For now, the team of researchers is using the findings to explore more about how our visual systems work, and whether our eyes and brains have yet-undiscovered superpowers for looking at the world around us.

The research has yet to be peer-reviewed, but you can read it on the pre-print server arXiv.

https://www.sciencealert.com/human-eye-sees-ghosted-images-reflected-light

Alzheimer’s one day may be predicted during eye exam

By Jim Dryden

It may be possible in the future to screen patients for Alzheimer’s disease using an eye exam.

Using technology similar to what is found in many eye doctors’ offices, researchers at Washington University School of Medicine in St. Louis have detected evidence suggesting Alzheimer’s in older patients who had no symptoms of the disease.

Their study, involving 30 patients, is published Aug. 23 in the journal JAMA Ophthalmology.

“This technique has great potential to become a screening tool that helps decide who should undergo more expensive and invasive testing for Alzheimer’s disease prior to the appearance of clinical symptoms,” said the study’s first author, Bliss E. O’Bryhim, MD, PhD, a resident physician in the Department of Ophthalmology & Visual Sciences. “Our hope is to use this technique to understand who is accumulating abnormal proteins in the brain that may lead them to develop Alzheimer’s.”

Significant brain damage from Alzheimer’s disease can occur years before any symptoms such as memory loss and cognitive decline appear. Scientists estimate that Alzheimer’s-related plaques can build up in the brain two decades before the onset of symptoms, so researchers have been looking for ways to detect the disease sooner.

Physicians now use PET scans and lumbar punctures to help diagnose Alzheimer’s, but they are expensive and invasive.

In previous studies, researchers examining the eyes of people who had died from Alzheimer’s reported signs of thinning in the center of the retina and degradation of the optic nerve.

In the new study, the researchers used a noninvasive technique — called optical coherence tomography angiography — to examine the retinas in eyes of 30 study participants with an average age in the mid 70s, none of whom exhibited clinical symptoms of Alzheimer’s.

Those participants were patients in The Memory and Aging Project at Washington University’s Knight Alzheimer’s Disease Research Center. About half of those in the study had elevated levels of the Alzheimer’s proteins amyloid or tau as revealed by PET scans or cerebrospinal fluid, suggesting that although they didn’t have symptoms, they likely would develop Alzheimer’s. In the other subjects, PET scans and cerebrospinal fluid analyses were normal.

“In the patients with elevated levels of amyloid or tau, we detected significant thinning in the center of the retina,” said co-principal investigator Rajendra S. Apte, MD, PhD, the Paul A. Cibis Distinguished Professor of Ophthalmology and Visual Sciences. “All of us have a small area devoid of blood vessels in the center of our retinas that is responsible for our most precise vision. We found that this zone lacking blood vessels was significantly enlarged in people with preclinical Alzheimer’s disease.”

The eye test used in the study shines light into the eye, allowing a doctor to measure retinal thickness, as well as the thickness of fibers in the optic nerve. A form of that test is often available in ophthalmologists’ offices.

For this study, however, the researchers added a new component to the more common test: angiography, which allows doctors to distinguish red blood cells from other tissue in the retina.

“The angiography component allows us to look at blood-flow patterns,” said the other co-principal investigator, Gregory P. Van Stavern, MD, a professor of ophthalmology and visual sciences. “In the patients whose PET scans and cerebrospinal fluid showed preclinical Alzheimer’s, the area at the center of the retina without blood vessels was significantly larger, suggesting less blood flow.”

Added Apte: “The retina and central nervous system are so interconnected that changes in the brain could be reflected in cells in the retina.”

Of the patients studied, 17 had abnormal PET scans and/or lumbar punctures, and all of them also had retinal thinning and significant areas without blood vessels in the centers of their retinas. The retinas appeared normal in the patients whose PET scans and lumbar punctures were within the typical range.
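
The “areas without blood vessels” being measured here refer to the foveal avascular zone at the center of the retina. As a rough illustration of how such an area can be quantified (not the method used in the JAMA Ophthalmology study), the sketch below finds the vessel-free region containing the image center in a binary OCTA vessel map and converts its pixel count to square millimeters; the scan size and synthetic data are placeholders.

```python
# Illustrative sketch: estimating foveal avascular zone (FAZ) area from a binary
# OCTA vessel mask. Placeholder parameters, not the study's actual pipeline.
import numpy as np
from scipy import ndimage

def faz_area_mm2(vessel_mask: np.ndarray, scan_width_mm: float = 3.0) -> float:
    """Area (mm^2) of the avascular region containing the image center.

    `vessel_mask` is a square boolean array, True where a vessel was detected.
    """
    avascular = ~vessel_mask
    labels, _ = ndimage.label(avascular)             # connected vessel-free regions
    center_label = labels[labels.shape[0] // 2, labels.shape[1] // 2]
    if center_label == 0:                            # a vessel sits at the exact center
        return 0.0
    pixel_area = (scan_width_mm / vessel_mask.shape[0]) ** 2
    return float((labels == center_label).sum() * pixel_area)

# Synthetic example: a vessel-filled scan with a round avascular zone in the middle.
size = 304
yy, xx = np.mgrid[:size, :size]
radius_px = 35                                       # roughly 0.35 mm at 3 mm / 304 px
mask = np.ones((size, size), dtype=bool)
mask[(yy - size // 2) ** 2 + (xx - size // 2) ** 2 < radius_px ** 2] = False

print(f"estimated FAZ area: {faz_area_mm2(mask):.2f} mm^2")
```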

More studies in patients are needed to replicate the findings, Van Stavern said, but he noted that if changes detected with this eye test can be used as markers for Alzheimer’s risk, it may be possible one day to screen people as young as their 40s or 50s to see whether they are at risk for the disease.

“We know the pathology of Alzheimer’s disease starts to develop years before symptoms appear, but if we could use this eye test to notice when the pathology is beginning, it may be possible one day to start treatments sooner to delay further damage,” he said.

O’Bryhim BE, Apte RS, Kung N, Coble D, Van Stavern GP. Optical coherence tomography angiography findings in pre-clinical Alzheimer’s disease. JAMA Ophthalmology, Aug. 23, 2018.

Can Eyes Predict Parkinson’s Disease? Retinal thinning from dopamine loss may be an early disease marker.

by Judy George

Retinal thinning was linked to dopaminergic neuronal atrophy in a cross-sectional analysis, raising the possibility that it could be a way to detect pathologic changes in early Parkinson’s disease (PD) patients, researchers said.

Drug-naïve patients with early Parkinson’s showed retinal thinning as measured by optical coherence tomography (OCT) that correlated with both disease severity and nigral dopaminergic degeneration, reported Jee-Young Lee, MD, PhD, of the Seoul National University Boramae Medical Center, and colleagues in Neurology.

“Our study is the first to show a link between the thinning of the retina and a known sign of the progression of the disease — the loss of brain cells that produce dopamine,” Lee said in a statement.

“We also found the thinner the retina, the greater the severity of disease. These discoveries may mean that neurologists may eventually be able to use a simple eye scan to detect Parkinson’s disease in its earliest stages, before problems with movement begin.”

Retinal pathology has been tied to other neurodegenerative disorders including dementia. In previous studies, retinal nerve fiber layer thickness has been linked to Parkinson’s disease, and OCT is a potential PD biomarker.

The search for a definitive Parkinson’s biomarker has been extensive and includes clinical (anosmia; REM behavior disorder), genetic (GBA mutation; LRRK2 mutation), and biochemical (blood and cerebrospinal fluid) techniques, along with positron emission tomography (PET), magnetic resonance imaging (MRI), and single photon emission computed tomography (SPECT) imaging.

No biomarker has been validated for clinical practice, noted Jamie Adams, MD, of the University of Rochester Medical Center in New York, and Chiara La Morgia, MD, PhD, of the University of Bologna in Italy, in an accompanying editorial: “Because of the complexity of the disease, combining biomarkers from different categories is likely the best strategy to accurately predict PD status and progression.”

In this analysis, Lee and colleagues studied 49 early-stage, drug-naïve Parkinson’s patients without ophthalmologic disease, with an average age of 69, along with 54 age-matched controls.

The researchers used high-resolution OCT to measure retinal nerve fiber layer thickness, microperimetry to measure retinal function, and dopamine transporter analysis to measure N-(3-[18F]fluoropropyl)-2β-carbomethoxy-3β-(4-iodophenyl)nortropane (FP-CIT) uptake in the basal ganglia. Retinal layer thickness and volume were measured and compared in PD patients and controls.

Retinal thinning was found in the inferior and temporal perifoveal sectors of the PD patients, particularly the inner plexiform and ganglion cell layers, along with an association between retinal thinning and dopaminergic loss in the left substantia nigra. The team also reported an inverse association between inner retinal thickness in the inferior perifoveal sector and disease severity (Hoehn and Yahr stage), and a positive correlation between macular sensitivity and retinal layer thickness.
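
The associations described above are, at bottom, correlations between per-patient measurements. As an illustration of that kind of analysis (hypothetical numbers, not the authors’ data or their adjusted statistical models), the sketch below computes a Spearman correlation between stand-in retinal-thickness and dopamine-transporter-uptake values.

```python
# Illustrative sketch of the kind of association analysis described above:
# correlating retinal thickness with dopamine transporter (DAT) uptake.
# Hypothetical numbers, not data from the Neurology study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_patients = 49

# Stand-in measurements: normalized DAT uptake in the substantia nigra and inner
# retinal thickness (micrometers), constructed with a built-in positive link.
dat_uptake = rng.normal(loc=2.0, scale=0.4, size=n_patients)
retinal_thickness = 60 + 8 * (dat_uptake - 2.0) + rng.normal(scale=2.0, size=n_patients)

rho, p_value = stats.spearmanr(retinal_thickness, dat_uptake)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")
```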

“Overall, these data support the presence of an association between retinal thinning and dopaminergic loss in PD,” said Adams and La Morgia. “Inner retinal thinning in individuals with PD has been reported in previous studies, but this is the first study that demonstrates a correlation between inner retinal thinning and nigral dopaminergic loss.”

“These findings may point to a pathologic connection between the retina and basal ganglia in PD and are in line with previous studies reporting asymmetric retinal nerve fiber layer loss, more evident in the eye contralateral to the most affected body side.”

The results need to be interpreted with caution, Lee and co-authors noted. Retina analysis was limited to the macular area in this research. Studies with larger numbers of Parkinson’s patients are needed to confirm the findings. And this study was a cross-sectional analysis, so correlations between retinal changes and PD severity need to be established over time.

But if the findings are confirmed, “retina scans may not only allow earlier treatment of Parkinson’s disease, but more precise monitoring of treatments that could slow progression of the disease as well,” Lee said.

https://www.medpagetoday.com/neurology/parkinsonsdisease/74575

Screens are killing your eyeballs

Blue light’s rap sheet is growing ever longer. Researchers have connected the high-energy visible light, which emanates from both the sun and your cell phone (and just about every other digital device in our hands and on our bedside tables), to disruptions in the body’s circadian rhythms. And physicians have drawn attention to the relationship between our favorite devices and eye problems, ranging from everyday eye strain to glaucoma to macular degeneration.

Humans can see a thin spectrum of light, ranging from red to violet. Shorter wavelengths appear blue, while the longer ones appear red. What appears as white light, whether it’s from sunlight or screen time, actually includes almost every color in the spectrum. In a recent paper published in the journal Scientific Reports, researchers at the University of Toledo have begun to parse the process by which close or prolonged exposure to the 445-nanometer shortwave band called “blue light” can trigger irreversible damage in eye cells. The results could have profound consequences for consumer technology.
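
To put “high-energy visible light” in numbers: photon energy scales inversely with wavelength (E = hc/λ), so a 445-nanometer blue photon carries roughly half again as much energy as a 650-nanometer red one. A quick back-of-the-envelope check, with no claims about biological damage thresholds:

```python
# Quick check of why blue light is "high-energy": photon energy E = h * c / wavelength.
h = 6.626e-34      # Planck constant, J*s
c = 2.998e8        # speed of light, m/s

def photon_energy_ev(wavelength_nm: float) -> float:
    joules = h * c / (wavelength_nm * 1e-9)
    return joules / 1.602e-19          # convert joules to electronvolts

blue, red = 445.0, 650.0
print(f"445 nm photon: {photon_energy_ev(blue):.2f} eV")
print(f"650 nm photon: {photon_energy_ev(red):.2f} eV")
print(f"ratio: {photon_energy_ev(blue) / photon_energy_ev(red):.2f}x")
```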

“Photoreceptors are like the vehicle. Retinal is the gas,” says study author and chemistry professor Ajith Karunarathne. In the lab, when cells from the eye were exposed to blue light directly—in theory, mimicking what happens when we stare at our phone or computer screens—the high-intensity waves triggered a chemical reaction in the retinal molecules in the eye. The blue light causes the retinal to oxidize, creating “toxic chemical species,” according to Karunarathne. The retinal, energized by this particular band of light, kills the photoreceptor cells, which do not grow back once they are damaged. If retinal is the gas, Karunarathne says, then blue light is a dangerous spark.

Catastrophic damage to your vision is hardly guaranteed. But the experiment shows that blue light can kill photoreceptor cells. Murdering enough of them can lead to macular degeneration, an incurable disease that blurs or even eliminates vision.

Blue light occurs naturally in sunlight, which also contains other forms of visible light and ultraviolet and infrared rays. But, Karunarathne points out, we don’t spend that much time staring at the sun. As kids, most of us were taught it would fry our eyes. Digital devices, however, pose a bigger threat. The average American spends almost 11 hours a day in front of some type of screen, according to a 2016 Nielsen poll. Right now, reading this, you’re probably mainlining blue light.

When we stare straight at our screens—especially in the dark—we channel the light into a very small area inside our eyeball. “That can actually intensify the light emitted from the device many many fold,” Karunarathne says. “When you take a magnifying glass and hold it to the sun, you can see how intense the light at the focal point gets. You can burn something.”

Some user experience designers have been criticizing our reliance on blue light, including Amber Case, author of the book Calm Technology. On her Medium blog she documented the way blue light has become “the color of the future,” thanks in part to films like 1982’s Blade Runner. The environmentally-motivated switch from incandescent light bulbs to high-efficiency (and high-wattage) LED bulbs further pushed us into blue light’s path. But, Case writes, “[i]f pop culture has helped lead us into a blue-lit reality that’s hurting us so much, it can help lead us toward a new design aesthetic bathed in orange.”

The military, she notes, still uses red or orange light for many of its interfaces, including those in control rooms and cockpits. “They’re low-impact colors that are great for nighttime shifts,” she writes. They also eliminate blue light-induced “visual artifacts”—the sensation of being blinded by a bright screen in the dark—that often accompany blue light and can be hazardous in some scenarios.

Apple offers a “night shift” setting on its phones, which allows users to blot out the blue and filter their screens through a sunset hue. Aftermarket products designed to control the influx of blue light into our irises are also available, including desktop screen protectors. There are even blue light-filtering sunglasses marketed specifically to gamers. But as the damage done by blue light becomes clearer—just as our vision is getting blurrier—consumers may demand bigger changes.

Going forward, Karunarathne plans to stay in data-collection mode. “This is a new trend of looking at our devices,” he says. “It will take some time to see if and how much damage these devices can cause over time. When this new generation gets older, the question is, by that time, is the damage done?” But now that he appears to have identified a biochemical pathway for blue light damage, he’s also looking for new interventions. “Who knows. One day we might be able to develop eye drops, that if you know you are going to be exposed to intense light, you could use some of those… to reduce damage.”

https://www.popsci.com/screens-killing-eyes-blue-light?dom=currents&src=syn#page-3

New research shows that degenerative eye diseases are associated with risk of developing Alzheimer’s disease


Age-related macular degeneration, diabetic retinopathy and glaucoma were all associated with a higher risk of developing Alzheimer’s disease in a new study.

by Rich Haridy

A new study has found an interesting correlation between several degenerative eye diseases and the onset of Alzheimer’s disease. No mechanism explaining the connection has been proposed yet, but it is thought these eye conditions may help physicians identify patients at risk of developing Alzheimer’s before major symptoms appear.

The five-year study followed almost 4,000 patients over the age of 65, all without clinically diagnosed Alzheimer’s disease at the time of enrolment. After five years, 792 subjects were officially diagnosed with Alzheimer’s. The study found that subjects with age-related macular degeneration, diabetic retinopathy or glaucoma were 40 to 50 percent more likely to develop Alzheimer’s compared to patients without those specific conditions. No correlation between cataracts and an increased risk of Alzheimer’s was found.

“We don’t mean people with these eye conditions will get Alzheimer’s disease,” cautions Cecilia Lee, lead researcher on the study. “The main message from this study is that ophthalmologists should be more aware of the risks of developing dementia for people with these eye conditions and primary care doctors seeing patients with these eye conditions might be more careful on checking on possible dementia or memory loss.”

The researchers are clear that there are no definable causal connections between these eye conditions and Alzheimer’s at this stage, but the study does highlight the potential of using the eye as a way to better understand what is going on in the brain. Intriguingly, this isn’t the first bit of research that has found correlations between signs detected in the eye and the onset of Alzheimer’s disease.

Last year, a team from Cedars-Sinai Medical Center revealed that the same type of amyloid protein deposits found in the brain, and hypothesized as a major pathogenic cause of Alzheimer’s, can also be detected on the retina. That research suggested a possible investigational eye scan could become an effective early screening device for the disease.

While this new study does not at all cross over with last year’s research, and there is no implication that amyloid proteins play a part in these degenerative eye diseases, it does add to a fascinating growing body of work that highlights the eye’s role in helping offer a deeper insight into the cognitive health of our brain.

The research was published in the journal Alzheimer’s & Dementia.

https://newatlas.com/eye-disease-alzheimers-connection/55823/

Artificial Intelligence Can Predict Your Personality By Simply Tracking Your Eyes


Researchers have developed a new deep learning algorithm that can reveal your personality type, based on the Big Five personality trait model, by simply tracking eye movements.

It’s often been said that the eyes are the window to the soul, revealing what we think and how we feel. Now, new research reveals that your eyes may also be an indicator of your personality type, simply by the way they move.

Developed by the University of South Australia in partnership with the University of Stuttgart, Flinders University and the Max Planck Institute for Informatics in Germany, the research uses state-of-the-art machine-learning algorithms to demonstrate a link between personality and eye movements.

Findings show that people’s eye movements reveal whether they are sociable, conscientious or curious, with the algorithm software reliably recognising four of the Big Five personality traits: neuroticism, extroversion, agreeableness, and conscientiousness.

Researchers tracked the eye movements of 42 participants as they undertook everyday tasks around a university campus, and subsequently assessed their personality traits using well-established questionnaires.
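
The article doesn’t spell out the machine-learning pipeline, but the overall shape of such an approach (summarize each person’s eye-tracking record as a handful of features, then predict a questionnaire-derived trait score) can be sketched as follows. The feature names, the random-forest model, and all of the numbers here are hypothetical stand-ins, not the published method from the Frontiers in Human Neuroscience paper.

```python
# Illustrative sketch: predicting a personality-trait score from eye-movement
# features. All features and data here are hypothetical stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_people = 42

# Hypothetical per-person features: mean fixation duration (ms), fixation rate (/s),
# mean saccade amplitude (deg), blink rate (/min), pupil-diameter variability.
X = np.column_stack([
    rng.normal(250, 40, n_people),
    rng.normal(3.0, 0.5, n_people),
    rng.normal(5.0, 1.5, n_people),
    rng.normal(15, 5, n_people),
    rng.normal(0.3, 0.1, n_people),
])
y = rng.normal(3.0, 0.7, n_people)   # stand-in questionnaire score for one trait

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```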

UniSA’s Dr Tobias Loetscher says the study provides new links between previously under-investigated eye movements and personality traits and delivers important insights for emerging fields of social signal processing and social robotics.

“There’s certainly the potential for these findings to improve human-machine interactions,” Dr Loetscher says.

“People are always looking for improved, personalised services. However, today’s robots and computers are not socially aware, so they cannot adapt to non-verbal cues.

“This research provides opportunities to develop robots and computers so that they can become more natural, and better at interpreting human social signals.”

Dr Loetscher says the findings also provide an important bridge between tightly controlled laboratory studies and the study of natural eye movements in real-world environments.

“This research has tracked and measured the visual behaviour of people going about their everyday tasks, providing more natural responses than if they were in a lab.

“And thanks to our machine-learning approach, we not only validate the role of personality in explaining eye movement in everyday life, but also reveal new eye movement characteristics as predictors of personality traits.”

Original research (open access): “Eye Movements During Everyday Behavior Predict Personality Traits” by Sabrina Hoppe, Tobias Loetscher, Stephanie A. Morey and Andreas Bulling, Frontiers in Human Neuroscience, April 14, 2018. doi:10.3389/fnhum.2018.00105

Organic printing inks may restore sight to blind people


An array of semitransparent organic pixels on top of an ultrathin sheet of gold. Both the organic islands and the underlying gold are more than one hundred times thinner than a single neuron.

SUMMARY: A simple retinal prosthesis is under development. Fabricated from cheap and widely available organic pigments found in printing inks and cosmetics, it consists of tiny pixels, like those of a digital camera sensor, on a nanometric scale. Researchers hope that it can restore sight to blind people.

Researchers led by Eric Glowacki, principal investigator of the organic nanocrystals subgroup in the Laboratory of Organic Electronics, Linköping University, have developed a tiny, simple photoactive film that converts light impulses into electrical signals. These signals in turn stimulate neurons (nerve cells). The research group has chosen to focus on a particularly pressing application, artificial retinas that may in the future restore sight to blind people. The Swedish team, specializing in nanomaterials and electronic devices, worked together with researchers in Israel, Italy and Austria to optimise the technology. Experiments in vision restoration were carried out by the group of Yael Hanein at Tel Aviv University in Israel. Yael Hanein’s group is a world-leader in the interface between electronics and the nervous system.

The results have recently been published in the scientific journal Advanced Materials.

The retina consists of several thin layers of cells. Light-sensitive neurons in the back of the eye convert incident light to electric signals, while other cells process the nerve impulses and transmit them onwards along the optic nerve to an area of the brain known as the “visual cortex.” An artificial retina may be surgically implanted into the eye if a person’s sight has been lost as a consequence of the light-sensitive cells becoming degraded, thus failing to convert light into electric pulses.

The artificial retina consists of a thin circular film of photoactive material, and is similar to an individual pixel in a digital camera sensor. Each pixel is truly microscopic — it is about 100 times thinner than a single cell and has a diameter smaller than the diameter of a human hair. It consists of a pigment of semi-conducting nanocrystals. Such pigments are cheap and non-toxic, and are commonly used in commercial cosmetics and tattooing ink.

“We have optimised the photoactive film for near-infrared light, since biological tissues, such as bone, blood and skin, are most transparent at these wavelengths. This raises the possibility of other applications in humans in the future,” says Eric Glowacki.

He describes the artificial retina as a microscopic doughnut, with the crystal-containing pigment in the middle and a tiny metal ring around it. It acts without any external connectors, and the nerve cells are activated without a delay.

“The response time must be short if we are to gain control of the stimulation of nerve cells,” says David Rand, postdoctoral researcher at Tel Aviv University. “Here, the nerve cells are activated directly. We have shown that our device can be used to stimulate not only neurons in the brain but also neurons in non-functioning retinas.”

https://www.sciencedaily.com/releases/2018/05/180502104043.htm