Posts Tagged ‘eye’


by DAVID NIELD

New research suggests the human eye and brain are capable of seeing ghosted images, a new type of visual phenomenon that scientists previously thought could only be detected by a computer. It turns out our eyes are more powerful than we thought.

The discovery could teach us more about the inner workings of the eye and brain and how they process information, as well as changing our thinking on what we human beings can truly see of the world around us.

Ghost imaging was developed as a low-cost way of capturing light outside the visible spectrum, and the patterns it produces are usually processed by software algorithms – but, surprisingly, our eyes have the same capabilities.

“Ghost-imaging with the eye opens up a number of completely novel applications such as extending human vision into invisible wavelength regimes in real-time, bypassing intermediary screens or computational steps,” write the researchers.

“Perhaps even more interesting are the opportunities that ghost imaging offers for exploring neurological processes.”

Ghost imaging works using a camera with a single pixel, rather than the millions of pixels used by the sensors inside today’s digital cameras and smartphones. When it comes to capturing light beyond the visible spectrum, it’s an even more cost-effective method.

These single pixel cameras capture light as it reflects from an object – by watching different random patterns of bouncing light, and crunching through some calculations, the camera can gradually build up a picture of something even with just one pixel.

In some setups, the single pixel camera is used in combination with a second light, modulated in response to the first, and beamed back on the original random patterns. The advantage is that fewer patterns are needed to produce an image.

In this case a second camera using some smart algorithms can pick up the image without having looked at the object at all – just by looking at the patterns being cast and the light being produced from them.

That’s the ghosted image that was previously thought to be visible only to computers running specialist software. However, the new study shows that human visual perception can make sense of these patterns, known as Hadamard patterns.
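The reconstruction step that the software normally handles is simple enough to sketch. The Python snippet below is a minimal, illustrative example of computational ghost imaging with Hadamard patterns, not the researchers’ actual setup: one summed-intensity “measurement” is recorded per projected pattern, and a weighted sum of the patterns recovers the scene. The scene and image size are invented for the demonstration.

```python
import numpy as np
from scipy.linalg import hadamard

N = 32                        # toy image is N x N; N*N must be a power of two for scipy's hadamard()
H = hadamard(N * N)           # each row is one +/-1 Hadamard pattern (reshaped to N x N when projected)

# Stand-in scene (the paper used an image of Einstein); any N x N array works here.
scene = np.zeros((N, N))
scene[8:24, 10:22] = 1.0

# Single-pixel "detector": one total-intensity reading per projected pattern.
measurements = H @ scene.ravel()

# Reconstruction: Hadamard rows are orthogonal, so summing the patterns weighted by
# their measurements recovers the scene up to a known scale factor of N*N.
recovered = (H.T @ measurements).reshape(N, N) / (N * N)
print(np.allclose(recovered, scene))   # True
```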

This diagram from the research paper should give you an idea of what’s happening:

[Diagram from the research paper]

It’s a little bit like when our eyes and brains look at a series of still images and treat them as a moving picture – the same sort of subconscious processing seems to be going on.

All four volunteers who took part in the study could make out an image of Albert Einstein sticking out his tongue from the Hadamard patterns. Interestingly, though, the illusion only appeared when the patterns were projected quickly enough.

If the rate dropped below 200 patterns per 20 milliseconds, the image couldn’t be seen by the study participants.

As the researchers point out, this is potentially hugely exciting – it means we might be able to devise simple systems to see light outside the visible spectrum, with no computer processing required in the middle.

That’s all to come – and this is really preliminary stuff, so we can’t get too carried away. For now, the team of researchers is using the findings to explore more about how our visual systems work, and whether our eyes and brains have yet-undiscovered superpowers for looking at the world around us.

The research has yet to be peer-reviewed, but you can read it on the pre-print server arXiv.

https://www.sciencealert.com/human-eye-sees-ghosted-images-reflected-light


By Jim Dryden

It may be possible in the future to screen patients for Alzheimer’s disease using an eye exam.

Using technology similar to what is found in many eye doctors’ offices, researchers at Washington University School of Medicine in St. Louis have detected evidence suggesting Alzheimer’s in older patients who had no symptoms of the disease.

Their study, involving 30 patients, is published Aug. 23 in the journal JAMA Ophthalmology.

“This technique has great potential to become a screening tool that helps decide who should undergo more expensive and invasive testing for Alzheimer’s disease prior to the appearance of clinical symptoms,” said the study’s first author, Bliss E. O’Bryhim, MD, PhD, a resident physician in the Department of Ophthalmology & Visual Sciences. “Our hope is to use this technique to understand who is accumulating abnormal proteins in the brain that may lead them to develop Alzheimer’s.”

Significant brain damage from Alzheimer’s disease can occur years before any symptoms such as memory loss and cognitive decline appear. Scientists estimate that Alzheimer’s-related plaques can build up in the brain two decades before the onset of symptoms, so researchers have been looking for ways to detect the disease sooner.

Physicians now use PET scans and lumbar punctures to help diagnose Alzheimer’s, but they are expensive and invasive.

In previous studies, researchers examining the eyes of people who had died from Alzheimer’s have reported that the eyes of such patients showed signs of thinning in the center of the retina and degradation of the optic nerve.

In the new study, the researchers used a noninvasive technique — called optical coherence tomography angiography — to examine the retinas in eyes of 30 study participants with an average age in the mid 70s, none of whom exhibited clinical symptoms of Alzheimer’s.

Those participants were patients in The Memory and Aging Project at Washington University’s Knight Alzheimer’s Disease Research Center. About half of those in the study had elevated levels of the Alzheimer’s proteins amyloid or tau as revealed by PET scans or cerebrospinal fluid, suggesting that although they didn’t have symptoms, they likely would develop Alzheimer’s. In the other subjects, PET scans and cerebrospinal fluid analyses were normal.

“In the patients with elevated levels of amyloid or tau, we detected significant thinning in the center of the retina,” said co-principal investigator Rajendra S. Apte, MD, PhD, the Paul A. Cibis Distinguished Professor of Ophthalmology and Visual Sciences. “All of us have a small area devoid of blood vessels in the center of our retinas that is responsible for our most precise vision. We found that this zone lacking blood vessels was significantly enlarged in people with preclinical Alzheimer’s disease.”

The eye test used in the study shines light into the eye, allowing a doctor to measure retinal thickness, as well as the thickness of fibers in the optic nerve. A form of that test is often available in ophthalmologists’ offices.

For this study, however, the researchers added a new component to the more common test: angiography, which allows doctors to distinguish red blood cells from other tissue in the retina.

“The angiography component allows us to look at blood-flow patterns,” said the other co-principal investigator, Gregory P. Van Stavern, MD, a professor of ophthalmology and visual sciences. “In the patients whose PET scans and cerebrospinal fluid showed preclinical Alzheimer’s, the area at the center of the retina without blood vessels was significantly larger, suggesting less blood flow.”

Added Apte: “The retina and central nervous system are so interconnected that changes in the brain could be reflected in cells in the retina.”

Of the patients studied, 17 had abnormal PET scans and/or lumbar punctures, and all of them also had retinal thinning and significant areas without blood vessels in the centers of their retinas. The retinas appeared normal in the patients whose PET scans and lumbar punctures were within the typical range.
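The between-group comparison described above can be illustrated in a few lines of Python. The values below are synthetic placeholders for foveal avascular zone (FAZ) areas, not the study’s measurements; only the group sizes follow the article.

```python
# Illustrative only: invented FAZ areas (mm^2) for biomarker-positive vs. biomarker-negative
# participants, compared with a two-sample t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
faz_biomarker_positive = rng.normal(0.40, 0.08, 17)   # hypothetical enlarged FAZ areas
faz_biomarker_negative = rng.normal(0.30, 0.08, 13)   # hypothetical normal FAZ areas

t_stat, p_value = stats.ttest_ind(faz_biomarker_positive, faz_biomarker_negative)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```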

More studies in patients are needed to replicate the findings, Van Stavern said, but he noted that if changes detected with this eye test can be used as markers for Alzheimer’s risk, it may be possible one day to screen people as young as their 40s or 50s to see whether they are at risk for the disease.

“We know the pathology of Alzheimer’s disease starts to develop years before symptoms appear, but if we could use this eye test to notice when the pathology is beginning, it may be possible one day to start treatments sooner to delay further damage,” he said.

O’Bryhim BE, Apte RS, Kung N, Coble D, Van Stavern GP. Optical coherence tomography angiography findings in pre-clinical Alzheimer’s disease. JAMA Ophthalmology, Aug. 23, 2018.

https://source.wustl.edu/2018/08/alzheimers-one-day-may-be-predicted-during-eye-exam/

by Judy George

Retinal thinning was linked to dopaminergic neuronal atrophy in a cross-sectional analysis, raising the possibility that it could be a way to detect pathologic changes in early Parkinson’s disease (PD) patients, researchers said.

Drug-naïve patients with early Parkinson’s showed retinal thinning as measured by optical coherence tomography (OCT) that correlated with both disease severity and nigral dopaminergic degeneration, reported Jee-Young Lee, MD, PhD, of the Seoul National University Boramae Medical Center, and colleagues in Neurology.

“Our study is the first to show a link between the thinning of the retina and a known sign of the progression of the disease — the loss of brain cells that produce dopamine,” Lee said in a statement.

“We also found the thinner the retina, the greater the severity of disease. These discoveries may mean that neurologists may eventually be able to use a simple eye scan to detect Parkinson’s disease in its earliest stages, before problems with movement begin.”

Retinal pathology has been tied to other neurodegenerative disorders including dementia. In previous studies, retinal nerve fiber layer thickness has been linked to Parkinson’s disease, and OCT is a potential PD biomarker.

The search for a definitive Parkinson’s biomarker has been extensive and includes clinical (anosmia; REM behavior disorder), genetic (GBA mutation; LRRK2 mutation), and biochemical (blood and cerebrospinal fluid) techniques, along with positron emission tomography (PET), magnetic resonance imaging (MRI), and single photon emission computed tomography (SPECT) imaging.

No biomarker has been validated for clinical practice, noted Jamie Adams, MD, of the University of Rochester Medical Center in New York, and Chiara La Morgia, MD, PhD, of the University of Bologna in Italy, in an accompanying editorial: “Because of the complexity of the disease, combining biomarkers from different categories is likely the best strategy to accurately predict PD status and progression.”

In this analysis, Lee and colleagues studied 49 early-stage, drug-naïve Parkinson’s patients without ophthalmologic disease (average age 69), along with 54 age-matched controls.

The researchers used high-resolution OCT to measure retinal nerve fiber layer thickness, microperimetry to measure retinal function, and dopamine transporter analysis to measure N-(3-[18F]fluoropropyl)-2β-carbomethoxy-3β-(4-iodophenyl) nortropane uptake in the basal ganglia. Retinal layer thickness and volume were measured and compared in PD patients and controls.

Retinal thinning was found in the inferior and temporal perifoveal sectors of the PD patients, particularly the inner plexiform and ganglion cell layers, along with an association between retinal thinning and dopaminergic loss in the left substantia nigra. The team also reported an inverse association between inner retinal thickness in the inferior perifoveal sector and disease severity (Hoehn and Yahr stage), and a positive correlation between macular sensitivity and retinal layer thickness.
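The kind of correlation analysis the team reports can be sketched as follows. This is a Python illustration with invented data, not the study’s dataset; only the patient count matches the article.

```python
# Illustrative sketch of the reported correlations: inner retinal thickness vs. dopamine
# transporter (DAT) uptake, and thickness vs. Hoehn & Yahr stage. All values are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 49                                                          # PD patients in the study
retinal_thickness = rng.normal(70, 5, n)                        # hypothetical thickness, micrometres
dat_uptake = 0.05 * retinal_thickness + rng.normal(0, 0.2, n)   # hypothetical DAT uptake ratio
hy_stage = rng.integers(1, 4, n)                                # hypothetical H&Y stages 1-3

r_dat, p_dat = stats.pearsonr(retinal_thickness, dat_uptake)
rho_hy, p_hy = stats.spearmanr(retinal_thickness, hy_stage)
print(f"thickness vs DAT uptake: r = {r_dat:.2f} (p = {p_dat:.3f})")
print(f"thickness vs H&Y stage:  rho = {rho_hy:.2f} (p = {p_hy:.3f})")
```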

“Overall, these data support the presence of an association between retinal thinning and dopaminergic loss in PD,” said Adams and La Morgia. “Inner retinal thinning in individuals with PD has been reported in previous studies, but this is the first study that demonstrates a correlation between inner retinal thinning and nigral dopaminergic loss.”

“These findings may point to a pathologic connection between the retina and basal ganglia in PD and are in line with previous studies reporting asymmetric retinal nerve fiber layer loss, more evident in the eye contralateral to the most affected body side.”

The results need to be interpreted with caution, Lee and co-authors noted. Retina analysis was limited to the macular area in this research. Studies with larger numbers of Parkinson’s patients are needed to confirm the findings. And this study was a cross-sectional analysis, so correlations between retinal changes and PD severity need to be established over time.

But if the findings are confirmed, “retina scans may not only allow earlier treatment of Parkinson’s disease, but more precise monitoring of treatments that could slow progression of the disease as well,” Lee said.

https://www.medpagetoday.com/neurology/parkinsonsdisease/74575

Blue light’s rap sheet is growing ever longer. Researchers have connected the high-energy visible light, which emanates from both the sun and your cell phone (and just about every other digital device in our hands and on our bedside tables), to disruptions in the body’s circadian rhythms. And physicians have drawn attention to the relationship between our favorite devices and eye problems, ranging from everyday eye strain to glaucoma to macular degeneration.

Humans can see a thin spectrum of light, ranging from red to violet. Shorter wavelengths appear blue, while the longer ones appear red. What appears as white light, whether it’s from sunlight or screen time, actually includes almost every color in the spectrum. In a recent paper published in the journal Scientific Reports, researchers at the University of Toledo have begun to parse the process by which close or prolonged exposure to the short, roughly 445-nanometer wavelengths known as “blue light” can trigger irreversible damage in eye cells. The results could have profound consequences for consumer technology.
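The “high-energy” label follows directly from the photon-energy relation E = hc/λ: shorter wavelengths carry more energy per photon. A quick back-of-the-envelope calculation in Python (not from the paper) makes the point:

```python
# Photon energy E = h*c / wavelength; shorter (bluer) wavelengths mean more energy per photon.
PLANCK = 6.626e-34        # Planck constant, J*s
LIGHT_SPEED = 2.998e8     # speed of light, m/s
EV = 1.602e-19            # joules per electronvolt

for label, wavelength_nm in [("blue, 445 nm", 445), ("red, 650 nm", 650)]:
    energy_ev = PLANCK * LIGHT_SPEED / (wavelength_nm * 1e-9) / EV
    print(f"{label}: {energy_ev:.2f} eV per photon")   # ~2.79 eV vs ~1.91 eV
```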

“Photoreceptors are like the vehicle. Retinal is the gas,” says study author and chemistry professor Ajith Karunarathne. In the lab, when cells from the eye were exposed to blue light directly—in theory, mimicking what happens when we stare at our phone or computer screens—the high-intensity waves triggered a chemical reaction in the retinal molecules in the eye. The blue light causes the retinal to oxidize, creating “toxic chemical species,” according to Karunarathne. The retinal, energized by this particular band of light, kills the photoreceptor cells, which do not grow back once they are damaged. If retinal is the gas, Karunarathne says, then blue light is a dangerous spark.

Catastrophic damage to your vision is hardly guaranteed. But the experiment shows that blue light can kill photoreceptor cells. Murdering enough of them can lead to macular degeneration, an incurable disease that blurs or even eliminates vision.

Blue light occurs naturally in sunlight, which also contains other forms of visible light and ultraviolet and infrared rays. But, Karunarathne points out, we don’t spend that much time staring at the sun. As kids, most of us were taught it would fry our eyes. Digital devices, however, pose a bigger threat. The average American spends almost 11 hours a day in front of some type of screen, according to a 2016 Nielsen poll. Right now, reading this, you’re probably mainlining blue light.

When we stare straight at our screens—especially in the dark—we channel the light into a very small area inside our eyeball. “That can actually intensify the light emitted from the device many many fold,” Karunarathne says. “When you take a magnifying glass and hold it to the sun, you can see how intense the light at the focal point gets. You can burn something.”

Some user experience designers have been criticizing our reliance on blue light, including Amber Case, author of the book Calm Technology. On her Medium blog she documented the way blue light has become “the color of the future,” thanks in part to films like 1982’s Blade Runner. The environmentally motivated switch from incandescent light bulbs to high-efficiency, blue-rich LED bulbs further pushed us into blue light’s path. But, Case writes, “[i]f pop culture has helped lead us into a blue-lit reality that’s hurting us so much, it can help lead us toward a new design aesthetic bathed in orange.”

The military, she notes, still uses red or orange light for many of its interfaces, including those in control rooms and cockpits. “They’re low-impact colors that are great for nighttime shifts,” she writes. They also eliminate blue light-induced “visual artifacts”—the sensation of being blinded by a bright screen in the dark—that often accompany blue light and can be hazardous in some scenarios.

Apple offers a “night shift” setting on its phones, which allows users to blot out the blue and filter their screens through a sunset hue. Aftermarket products designed to control the influx of blue light into our irises are also available, including desktop screen protectors. There are even blue light-filtering sunglasses marketed specifically to gamers. But as the damage done by blue light becomes clearer—just as our vision is getting blurrier—consumers may demand bigger changes.

Going forward, Karunarathne plans to stay in data-collection mode. “This is a new trend of looking at our devices,” he says. “It will take some time to see if and how much damage these devices can cause over time. When this new generation gets older, the question is, by that time, is the damage done?” But now that he appears to have identified a biochemical pathway for blue light damage, he’s also looking for new interventions. “Who knows. One day we might be able to develop eye drops, that if you know you are going to be exposed to intense light, you could use some of those… to reduce damage.”

https://www.popsci.com/screens-killing-eyes-blue-light?dom=currents&src=syn#page-3


Age-related macular degeneration, diabetic retinopathy and glaucoma were all associated with a higher risk of developing Alzheimer’s disease in a new study.

by Rich Haridy

A new study has found an interesting correlation between several degenerative eye diseases and the onset of Alzheimer’s disease. No mechanism explaining the connection has been proposed at this stage but it is thought these eye conditions may help physicians identify patients at risk of developing Alzheimer’s at a stage before major symptoms appear.

The five-year study followed almost 4,000 patients over the age of 65, all without clinically diagnosed Alzheimer’s disease at the time of enrolment. After five years, 792 subjects were officially diagnosed with Alzheimer’s. The study found that subjects with age-related macular degeneration, diabetic retinopathy or glaucoma were 40 to 50 percent more likely to develop Alzheimer’s than patients without those specific conditions. No correlation between cataracts and an increased risk of Alzheimer’s was found.

“We don’t mean people with these eye conditions will get Alzheimer’s disease,” cautions Cecilia Lee, lead researcher on the study. “The main message from this study is that ophthalmologists should be more aware of the risks of developing dementia for people with these eye conditions and primary care doctors seeing patients with these eye conditions might be more careful on checking on possible dementia or memory loss.”

The researchers are clear that there are no definable causal connections between these eye conditions and Alzheimer’s at this stage, but the study does highlight the potential of using the eye as a way to better understand what is going on in the brain. Intriguingly, this isn’t the first bit of research that has found correlations between signs detected in the eye and the onset of Alzheimer’s disease.

Last year, a team from Cedars-Sinai Medical Center revealed that the same type of amyloid protein deposits found in the brain, and hypothesized as a major pathogenic cause of Alzheimer’s, can also be detected on the retina. That research suggested a possible investigational eye scan could become an effective early screening device for the disease.

While this new study does not at all cross over with last year’s research, and there is no implication that amyloid proteins play a part in these degenerative eye diseases, it does add to a fascinating growing body of work that highlights the eye’s role in helping offer a deeper insight into the cognitive health of our brain.

The research was published in the journal Alzheimer’s & Dementia.

https://newatlas.com/eye-disease-alzheimers-connection/55823/


Researchers have developed a new deep learning algorithm that can reveal your personality type, based on the Big Five personality trait model, by simply tracking eye movements.

It’s often been said that the eyes are the window to the soul, revealing what we think and how we feel. Now, new research reveals that your eyes may also be an indicator of your personality type, simply by the way they move.

Developed by the University of South Australia in partnership with the University of Stuttgart, Flinders University and the Max Planck Institute for Informatics in Germany, the research uses state-of-the-art machine-learning algorithms to demonstrate a link between personality and eye movements.

Findings show that people’s eye movements reveal whether they are sociable, conscientious or curious, with the algorithm software reliably recognising four of the Big Five personality traits: neuroticism, extroversion, agreeableness, and conscientiousness.

Researchers tracked the eye movements of 42 participants as they undertook everyday tasks around a university campus, and subsequently assessed their personality traits using well-established questionnaires.
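A prediction pipeline of this general shape can be sketched in a few lines of Python. The feature names, data, and classifier choice below are assumptions for illustration only; they are not the study’s actual features or model.

```python
# Minimal sketch (not the study's pipeline): predicting a binary personality label
# from per-participant gaze summary features. All data here are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_participants = 42

# Hypothetical per-participant features: mean fixation duration (ms), saccade rate (per s),
# mean pupil diameter (mm), blink rate (per min).
X = np.column_stack([
    rng.normal(250, 40, n_participants),
    rng.normal(3.0, 0.5, n_participants),
    rng.normal(4.0, 0.6, n_participants),
    rng.normal(15, 5, n_participants),
])
# Hypothetical labels: low/high score on one trait from a questionnaire median split.
y = rng.permutation(np.repeat([0, 1], n_participants // 2))

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```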

UniSA’s Dr Tobias Loetscher says the study provides new links between previously under-investigated eye movements and personality traits and delivers important insights for emerging fields of social signal processing and social robotics.

“There’s certainly the potential for these findings to improve human-machine interactions,” Dr Loetscher says.

“People are always looking for improved, personalised services. However, today’s robots and computers are not socially aware, so they cannot adapt to non-verbal cues.

“This research provides opportunities to develop robots and computers so that they can become more natural, and better at interpreting human social signals.”

Dr Loetscher says the findings also provide an important bridge between tightly controlled laboratory studies and the study of natural eye movements in real-world environments.

“This research has tracked and measured the visual behaviour of people going about their everyday tasks, providing more natural responses than if they were in a lab.

“And thanks to our machine-learning approach, we not only validate the role of personality in explaining eye movement in everyday life, but also reveal new eye movement characteristics as predictors of personality traits.”

Original Research: Open access. “Eye Movements During Everyday Behavior Predict Personality Traits” by Sabrina Hoppe, Tobias Loetscher, Stephanie A. Morey and Andreas Bulling. Frontiers in Human Neuroscience, published April 14, 2018. doi: 10.3389/fnhum.2018.00105

https://neurosciencenews.com/ai-personality-9621/


An array of semitransparent organic pixels on top of an ultrathin sheet of gold. Both the organic islands and the underlying gold are more than one hundred times thinner than a single neuron.

SUMMARY: A simple retinal prosthesis is under development. Fabricated using cheap and widely-available organic pigments used in printing inks and cosmetics, it consists of tiny pixels like a digital camera sensor on a nanometric scale. Researchers hope that it can restore sight to blind people.

Researchers led by Eric Glowacki, principal investigator of the organic nanocrystals subgroup in the Laboratory of Organic Electronics, Linköping University, have developed a tiny, simple photoactive film that converts light impulses into electrical signals. These signals in turn stimulate neurons (nerve cells). The research group has chosen to focus on a particularly pressing application, artificial retinas that may in the future restore sight to blind people. The Swedish team, specializing in nanomaterials and electronic devices, worked together with researchers in Israel, Italy and Austria to optimise the technology. Experiments in vision restoration were carried out by the group of Yael Hanein at Tel Aviv University in Israel. Yael Hanein’s group is a world-leader in the interface between electronics and the nervous system.

The results have recently been published in the scientific journal Advanced Materials.

The retina consists of several thin layers of cells. Light-sensitive neurons in the back of the eye convert incident light to electric signals, while other cells process the nerve impulses and transmit them onwards along the optic nerve to an area of the brain known as the “visual cortex.” An artificial retina may be surgically implanted into the eye if a person’s sight has been lost as a consequence of the light-sensitive cells becoming degraded, thus failing to convert light into electric pulses.

The artificial retina consists of a thin circular film of photoactive material, and is similar to an individual pixel in a digital camera sensor. Each pixel is truly microscopic — it is about 100 times thinner than a single cell and has a diameter smaller than the diameter of a human hair. It consists of a pigment of semi-conducting nanocrystals. Such pigments are cheap and non-toxic, and are commonly used in commercial cosmetics and tattooing ink.

“We have optimised the photoactive film for near-infrared light, since biological tissues, such as bone, blood and skin, are most transparent at these wavelengths. This raises the possibility of other applications in humans in the future,” says Eric Glowacki.

He describes the artificial retina as a microscopic doughnut, with the crystal-containing pigment in the middle and a tiny metal ring around it. It acts without any external connectors, and the nerve cells are activated without a delay.

“The response time must be short if we are to gain control of the stimulation of nerve cells,” says David Rand, postdoctoral researcher at Tel Aviv University. “Here, the nerve cells are activated directly. We have shown that our device can be used to stimulate not only neurons in the brain but also neurons in non-functioning retinas.”

https://www.sciencedaily.com/releases/2018/05/180502104043.htm