Posts Tagged ‘eye’


An array of semitransparent organic pixels on top of an ultrathin sheet of gold. Both the organic islands and the underlying gold are more than one hundred times thinner than a single neuron.

SUMMARY: A simple retinal prosthesis is under development. Fabricated from cheap and widely available organic pigments used in printing inks and cosmetics, it consists of tiny pixels, like those of a digital camera sensor but on a nanometric scale. Researchers hope that it can restore sight to blind people.

Researchers led by Eric Glowacki, principal investigator of the organic nanocrystals subgroup in the Laboratory of Organic Electronics, Linköping University, have developed a tiny, simple photoactive film that converts light impulses into electrical signals. These signals in turn stimulate neurons (nerve cells). The research group has chosen to focus on a particularly pressing application, artificial retinas that may in the future restore sight to blind people. The Swedish team, specializing in nanomaterials and electronic devices, worked together with researchers in Israel, Italy and Austria to optimise the technology. Experiments in vision restoration were carried out by the group of Yael Hanein at Tel Aviv University in Israel. Yael Hanein’s group is a world-leader in the interface between electronics and the nervous system.

The results have recently been published in the scientific journal Advanced Materials.

The retina consists of several thin layers of cells. Light-sensitive neurons in the back of the eye convert incident light to electric signals, while other cells process the nerve impulses and transmit them onwards along the optic nerve to an area of the brain known as the “visual cortex.” An artificial retina may be surgically implanted into the eye if a person’s sight has been lost as a consequence of the light-sensitive cells becoming degraded, thus failing to convert light into electric pulses.

The artificial retina consists of a thin circular film of photoactive material, and is similar to an individual pixel in a digital camera sensor. Each pixel is truly microscopic — it is about 100 times thinner than a single cell and has a diameter smaller than the diameter of a human hair. It consists of a pigment of semi-conducting nanocrystals. Such pigments are cheap and non-toxic, and are commonly used in commercial cosmetics and tattooing ink.

“We have optimised the photoactive film for near-infrared light, since biological tissues, such as bone, blood and skin, are most transparent at these wavelengths. This raises the possibility of other applications in humans in the future,” says Eric Glowacki.

He describes the artificial retina as a microscopic doughnut, with the crystal-containing pigment in the middle and a tiny metal ring around it. It acts without any external connectors, and the nerve cells are activated without a delay.

“The response time must be short if we are to gain control of the stimulation of nerve cells,” says David Rand, postdoctoral researcher at Tel Aviv University. “Here, the nerve cells are activated directly. We have shown that our device can be used to stimulate not only neurons in the brain but also neurons in non-functioning retinas.”

https://www.sciencedaily.com/releases/2018/05/180502104043.htm


by Lacy Cook

This praying mantis isn’t just wearing minuscule 3D glasses for the cute factor, but to help scientists learn more about 3D vision. A Newcastle University team discovered a novel form of 3D vision, or stereo vision, in the insects – and compared human and insect stereo vision for the very first time. Their findings could have implications for visual processing in robots.

Humans aren’t the only creatures with stereo vision, which “helps us work out the distances to the things we see,” according to the university. Cats, horses, monkeys, toads, and owls have it too – but the only insect we know about with 3D vision is the praying mantis. Six Newcastle University researchers obtained new insight into their robust stereo vision with the help of small 3D glasses temporarily attached to the insects with beeswax.

The researchers designed an insect 3D cinema, showing a praying mantis a film of prey. The insects would actually try to catch the prey because the illusion was so convincing. And the scientists were able to take their work to the next level, showing the mantises “complex dot-patterns used to investigate human 3D vision” so they could compare our 3D vision with an insect’s for the first time.

According to the university, humans see 3D in still images by matching details of the image each eye sees. “But mantises only attack moving prey so their 3D doesn’t need to work in still images. The team found mantises don’t bother about the details of the picture but just look for places where the picture is changing…Even if the scientists made the two eyes’ images completely different, mantises can still match up the places where things are changing. They did so even when humans couldn’t.”

The journal Current Biology published their work online last week. Lead author Vivek Nityananda, a behavioral ecologist, described the praying mantis’ stereo vision as “a completely new form of 3D vision.”

Future robots could benefit from these findings: instead of 3D vision based on complex human stereo vision, researchers might be able to take some tips from praying mantis stereo vision, which team member Ghaith Tarawneh said probably doesn’t require a lot of computer processing since insect brains are so small.

https://inhabitat.com/praying-mantises-wearing-tiny-glasses-help-researchers-discover-new-type-of-3d-vision/

Simply moving the eyes triggers the eardrums to move too, says a new study by Duke University neuroscientists.

The researchers found that keeping the head still but shifting the eyes to one side or the other sparks vibrations in the eardrums, even in the absence of any sounds.

Surprisingly, these eardrum vibrations start slightly before the eyes move, indicating that motion in the ears and the eyes is controlled by the same motor commands deep within the brain.

“It’s like the brain is saying, ‘I’m going to move the eyes, I better tell the eardrums, too,’” said Jennifer Groh, a professor in the departments of neurobiology and psychology and neuroscience at Duke.

The findings, which were replicated in both humans and rhesus monkeys, provide new insight into how the brain coordinates what we see and what we hear. They may also lead to a new understanding of hearing disorders, such as difficulty following a conversation in a crowded room.

The paper appeared Jan. 23 in Proceedings of the National Academy of Sciences.

It’s no secret that the eyes and ears work together to make sense of the sights and sounds around us. Most people find it easier to understand somebody if they are looking at them and watching their lips move. And in a famous illusion called the McGurk Effect, videos of lip cues dubbed with mismatched audio cause people to hear the wrong sound.

But researchers are still puzzling over where and how the brain combines these two very different types of sensory information.

“Our brains would like to match up what we see and what we hear according to where these stimuli are coming from, but the visual system and the auditory system figure out where stimuli are located in two completely different ways,” Groh said. “The eyes are giving you a camera-like snapshot of the visual scene, whereas for sounds, you have to calculate where they are coming from based on differences in timing and loudness across the two ears.”
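Groh’s point about timing differences can be made concrete with a toy calculation. The sketch below uses the standard Woodworth spherical-head approximation; the head radius and speed of sound are typical assumed values, not figures from the study:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)
HEAD_RADIUS = 0.0875    # m, a typical adult head radius (assumed)

def interaural_time_difference(azimuth_deg: float) -> float:
    """Approximate interaural time difference (seconds) for a sound
    source at the given azimuth, using the Woodworth spherical-head
    model: ITD = (a / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source 45 degrees off-centre arrives a few hundred microseconds
# earlier at the nearer ear -- one of the cues the brain uses to
# localise sounds, alongside the loudness difference between the ears.
itd = interaural_time_difference(45.0)
```

Sub-millisecond differences like this are what the auditory system must resolve, which is why localising a sound is a very different computation from reading a position off the retina’s camera-like snapshot.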

Because the eyes are usually darting about within the head, the visual and auditory worlds are constantly in flux with respect to one another, Groh added.

In an experiment designed by Kurtis Gruters, a former doctoral student in Groh’s lab and co-first author on the paper, 16 participants were asked to sit in a dark room and follow shifting LED lights with their eyes. Each participant also wore small microphones in their ear canals that were sensitive enough to pick up the slight vibrations created when the eardrum sways back and forth.

Though eardrums vibrate primarily in response to outside sounds, the brain can also control their movements using small bones in the middle ear and hair cells in the cochlea. These mechanisms help modulate the volume of sounds that ultimately reach the inner ear and brain, and produce small sounds known as otoacoustic emissions.

Gruters found that when the eyes moved, both eardrums moved in sync with one another, one side bulging inward at the same time the other side bulged outward. They continued to vibrate back and forth together until shortly after the eyes stopped moving. Eye movements in opposite directions produced opposite patterns of vibrations.

Larger eye movements also triggered bigger vibrations than smaller eye movements, the team found.

“The fact that these eardrum movements are encoding spatial information about eye movements means that they may be useful for helping our brains merge visual and auditory space,” said David Murphy, a doctoral student in Groh’s lab and co-first author on the paper. “It could also signify a marker of a healthy interaction between the auditory and visual systems.”

The team, which included Christopher Shera at the University of Southern California and David W. Smith of the University of Florida, is still investigating how these eardrum vibrations impact what we hear, and what role they may play in hearing disorders. In future experiments, they will look at whether up and down eye movements also cause unique signatures in eardrum vibrations.

“The eardrum movements literally contain information about what the eyes are doing,” Groh said. “This demonstrates that these two sensory pathways are coupled, and they are coupled at the earliest points.”

Cole Jensen, an undergraduate neuroscience major at Duke, also coauthored the new study.

CITATION: “The Eardrums Move When the Eyes Move: A Multisensory Effect on the Mechanics of Hearing,” K. G. Gruters, D. L. K. Murphy, Cole D. Jensen, D. W. Smith, C. A. Shera and J. M. Groh. Proceedings of the National Academy of Sciences, Jan. 23, 2018. DOI: 10.1073/pnas.1717948115

When people are awake, their pupils regularly change in size. Those changes are meaningful, reflecting shifting attention or vigilance, for example. Now, researchers reporting in Current Biology on January 18 have found in studies of mice that pupil size also fluctuates during sleep. They also show that pupil size is a reliable indicator of sleep states.

“We found that pupil size rhythmically fluctuates during sleep,” says Daniel Huber of the University of Geneva in Switzerland. “Intriguingly, these pupil fluctuations follow the sleep-related brain activity so closely that they can indicate with high accuracy the exact stage of sleep—the smaller the pupil, the deeper the sleep.”
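The relationship Huber describes — the smaller the pupil, the deeper the sleep — could be caricatured as a simple threshold classifier. The direction of the mapping follows the quote above; the cutoff values below are invented purely for illustration:

```python
def sleep_depth_from_pupil(pupil_fraction: float) -> str:
    """Toy classifier mapping relative pupil size (a fraction of the
    awake baseline, 0..1) to a sleep-depth label.  Only the direction
    of the mapping (smaller pupil = deeper sleep) comes from the
    study; these thresholds are assumptions for illustration."""
    if not 0.0 <= pupil_fraction <= 1.0:
        raise ValueError("pupil_fraction must lie between 0 and 1")
    if pupil_fraction < 0.3:
        return "deep sleep"
    if pupil_fraction < 0.7:
        return "intermediate sleep"
    return "light sleep or waking"
```

The study’s actual claim is stronger than a static lookup like this: the rhythmic fluctuations track brain activity continuously, stage by stage.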

Studies of pupil size during sleep had always been a challenge for an obvious reason: people and animals generally sleep with their eyes closed. Huber says that he and his colleagues were inspired to study pupil size in sleep after discovering that their laboratory mice sometimes sleep with their eyes open. They knew that pupil size varies strongly during wakefulness. What, they wondered, happened during sleep?

To investigate this question, they developed a novel optical pupil-tracking system for mice. The device includes an infrared light positioned close to the head of the animal. That invisible light travels through the skull and brain to illuminate the back of the eye. When the eyes are imaged with an infrared camera, the pupils appear as bright circles. Thanks to this new method, it was suddenly possible to track changes in pupil size accurately, particularly when the animals snoozed naturally with their eyelids open.

Their images show that mouse pupils rhythmically fluctuate during sleep and that those fluctuations are not at all random; they correlate with changes in sleep states.

Further experiments showed that changes in pupil size are not just a passive phenomenon, either. They are actively controlled by the parasympathetic autonomic nervous system. The evidence suggests that in mice, at least, pupils narrow in deep sleep to protect the animals from being woken by a sudden flash of light.

“The common saying that ‘the eyes are the window to the soul’ might even hold true behind closed eyelids during sleep,” Özge Yüzgeç, the student conducting the study, says. “The pupil continues to play an important role during sleep by blocking sensory input and thereby protecting the brain in periods of deep sleep, when memories should be consolidated.”

Huber says they would like to find out whether the findings hold in humans and whether their new method can be adapted in the sleep clinic. “Inferring brain activity by non-invasive pupil tracking might be an interesting alternative or complement to electrode recordings,” he says.

Reference:

Yüzgeç, Ö., Prsa, M., Zimmermann, R., & Huber, D. (2018). Pupil Size Coupling to Cortical States Protects the Stability of Deep Sleep via Parasympathetic Modulation. Current Biology. doi:10.1016/j.cub.2017.12.049

https://www.technologynetworks.com/neuroscience/news/pupil-size-couples-to-cortical-states-to-protect-deep-sleep-stability-296519

Last year, doctors of optometry detected more than 320,000 cases of diabetes. Imagine if they could make the same impact when it comes to exposing early signs of Alzheimer’s disease.

November is National Alzheimer’s Disease Awareness Month. An estimated 5.4 million Americans are affected by Alzheimer’s disease, according to the Centers for Disease Control and Prevention (CDC). Projections put the number at 13.8 million by 2050.

Maryke Nijhuis Neiberg, O.D., associate professor in the School of Optometry at Massachusetts College of Pharmacy and Health Sciences, in Worcester, Massachusetts, considers this an unrealized patient education opportunity for doctors of optometry.

“The earlier diagnoses give doctors and patients a better chance at managing the progressive brain disease and preserving the patient’s quality of life,” Dr. Neiberg says. “There has been some increase in Alzheimer’s awareness over the years, particularly in the eye community, but not enough yet.

“Alzheimer’s is a significant future public health issue,” she adds. “It is still a terminal disease.”

Early intervention

Much of the research on Alzheimer’s disease seeks to slow the disease’s progression. For instance, a study in Biological Psychiatry on Nov. 6 by researchers at the University of Iowa and the University of Texas Southwestern Medical Center in Dallas reports that there may be a new treatment that can slow the depression and cognitive decline associated with Alzheimer’s disease, without affecting amyloid plaque deposits or reactive glia in rats.

Among the early signs of Alzheimer’s, the researchers say, are anxiety, depression and irritability, appearing long before the devastating effects of memory loss.

“Thus, P7C3 compounds may form the basis for a new class of neuroprotective drugs for mitigating the symptoms in patients with Alzheimer’s disease by preserving neuronal cell survival, irrespective of other pathological events,” researchers say. “P7C3 compounds represent a novel route to treating depression, and new-onset depression in elderly patients may herald the development of Alzheimer’s disease with later cognitive impairments to follow.”

Another study in JAMA Ophthalmology in September by researchers at Stanford University and Veterans Affairs Palo Alto Health Care System linked visual impairment and cognition in older adults and also stressed the “potential importance” of vision screening in identifying these patients’ eye disease and cognitive deficits. The AOA strongly recommends comprehensive eye examinations and stresses the limitations of screenings.

Optometry’s role

According to the CDC:

The rate of Alzheimer’s jumped 50 percent between 1999 and 2014.

Americans fear losing their mental capacity more than losing their physical abilities.

More than $230 billion is estimated to have been spent in 2017 on health care, long-term care and hospice, as well as unpaid care, for people with Alzheimer’s and other dementias.

More large-scale research on Alzheimer’s needs to be done, but progress is being made. Dr. Neiberg pointed to research linking optical coherence tomography (OCT) of the macula to Alzheimer’s and Parkinson’s diseases.

“With the advent of OCT, we now know that the retinal ganglion cell layer thins and that the optic nerve cup-to-disc ratio increases in size, not unlike glaucoma,” Dr. Neiberg says. “Alzheimer’s produces visual field defects that are easily confused with glaucoma. What we need is large-scale research to determine how much of the normal tension glaucoma we diagnose and treat is ultimately related to Alzheimer’s disease.”

She adds, “The early perceptual changes that occur in early Alzheimer’s are startling and measurable. One of the earliest signs is a decline in the Benton Visual Retention Test, a test of visual memory. This test requires the duplication of shapes on paper with a pencil, and is scored.

“Research has shown that this test is able to predict high risk for Alzheimer’s 15 years before diagnosis,” she says. “It’s a simple test many developmental and pediatric optometrists already have on their shelves. If we combine that test and the ocular findings we see, we have a very strong indication that something is indeed amiss. Armed with this information, the patient can then consult with their primary care physician, initiate lifestyle modification and request a referral if necessary.”

There is no cure for Alzheimer’s disease. But doctors of optometry can engage patients in conversation about Alzheimer’s disease and how they can manage their own risk factors, including:

Smoking
Mid-life obesity
Sedentary lifestyle
High-cholesterol diet
Vascular disease (i.e., diabetes and hypertension)

“Lifestyle modification and early access to medication, which can delay the progression of dementia, might be enough to keep the disease at bay for longer,” Dr. Neiberg says. “We should include the Alzheimer’s disease connection when we educate our patients about lifestyle diseases.”

https://finchannel.com/society/health-beauty/69483-doctors-of-optometry-can-spot-early-signs-of-alzheimer-s-disease

Pupil dilation in reaction to negative emotional faces predicts risk for depression relapse, according to new research from Binghamton University, State University of New York.

Researchers at Binghamton University, led by PhD student Anastacia Kudinova, aimed to examine whether physiological reactivity to emotional stimuli, assessed via pupil dilation, served as a biological marker of risk for depression recurrence among individuals who are known to be at a higher risk due to having previous history of depression. Participants were 57 women with a history of major depressive disorder (MDD). The researchers recorded the change in pupil dilation in response to angry, happy, sad and neutral faces. The team found that women’s pupillary reactivity to negative (sad or angry faces) but not positive stimuli prospectively predicted MDD recurrence.

“The study focuses on trying to identify certain markers of depression risk using measures that are readily accessible, reliable and less expensive,” said Kudinova. “It is something we can put in any doctor’s office that gives us a quick and easy objective measure of risk.”

Additionally, the researchers found that both high and low reactivity to angry faces predicted risk for MDD recurrence. These findings suggest that disrupted physiological response to negative stimuli indexed via pupillary dilation could serve as a physiological marker of MDD risk, thus presenting clinicians with a convenient and inexpensive method to predict which of the at-risk women are more likely to experience depression recurrence.

“It’s a bit complicated because different patterns of findings were found for pupil reactivity to angry versus sad faces. Specifically, really high or really low pupil dilation to angry faces was associated with increased risk whereas only low dilation to sad faces was associated with risk (high dilation to sad faces was actually protective),” said Brandon Gibb, professor of psychology at Binghamton University and director of the Mood Disorders Institute and Center for Affective Science.

Other contributors to this research include Katie Burkhouse and Mary Woody, both PhD students; Max Owens, assistant professor of psychology at the University of South Florida, St. Petersburg; and Greg Siegle, associate professor of psychiatry at the University of Pittsburgh School of Medicine.

The paper, “Pupillary reactivity to negative stimuli prospectively predicts recurrence of major depressive disorder in women,” was published in Psychophysiology.

https://www.binghamton.edu/mpr/news-releases/news-release.html?id=2448

By Mo Costandi

It’s sometimes said that the eyes are windows into the soul, revealing deep emotions that we might otherwise want to hide. The eyes not only reflect what is happening in the brain but may also influence how we remember things and make decisions.

Our eyes are constantly moving, and while some of those movements are under conscious control, many of them occur subconsciously. When we read, for instance, we make a series of very quick eye movements called saccades that fixate rapidly on one word after another. When we enter a room, we make larger sweeping saccades as we gaze around. Then there are the small, involuntary eye movements we make as we walk, to compensate for the movement of our head and stabilise our view of the world. And, of course, our eyes dart around during the ‘rapid eye movement’ (REM) phase of sleep.

What is now becoming clear is that some of our eye movements may actually reveal our thought process.

Research published last year shows that pupil dilation is linked to the degree of uncertainty during decision-making: if somebody is less sure about their decision, they feel heightened arousal, which causes the pupils to dilate. This change in the eye may also reveal what a decision-maker is about to say: one group of researchers, for example, found that watching for dilation made it possible to predict when a cautious person used to saying ‘no’ was about to make the tricky decision to say ‘yes’.

Watching the eyes can even help predict what number a person has in mind. Tobias Loetscher and his colleagues at the University of Zurich recruited 12 volunteers and tracked their eye movements while they reeled off a list of 40 numbers.

They found that the direction and size of the participants’ eye movements accurately predicted whether the number they were about to say was bigger or smaller than the previous one – and by how much. Each volunteer’s gaze shifted up and to the right just before they said a bigger number, and down and to the left before a smaller one. The bigger the shift from one side to the other, the bigger the difference between the numbers.
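The reported relationship could be sketched as a trivial gaze decoder. Everything in this snippet — the sign convention and the equal weighting of the horizontal and vertical components — is an assumption for illustration, not the researchers’ actual analysis:

```python
def predict_next_number(dx: float, dy: float) -> str:
    """Predict whether the next spoken number will be bigger or
    smaller than the previous one, from the gaze shift preceding it.
    dx > 0 means the eyes moved right, dy > 0 means they moved up;
    in the study, up-and-right shifts preceded bigger numbers.
    The equal weighting of the two axes is assumed."""
    score = dx + dy
    if score > 0:
        return "bigger"
    if score < 0:
        return "smaller"
    return "undecided"
```

The study’s finding was also quantitative: the size of the shift tracked the size of the numerical difference, which a decoder like this one deliberately ignores.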

This suggests that we somehow link abstract number representations in the brain with movement in space. But the study does not tell us which comes first: whether thinking of a particular number causes changes in eye position, or whether the eye position influences our mental activity. In 2013, researchers in Sweden published evidence that it’s the latter that may be at work: eye movements may actually facilitate memory retrieval.

They recruited 24 students and asked each one to carefully examine a series of objects displayed in one corner of a computer screen. The participants were then told to listen to a series of statements about some of the objects they had seen, such as “The car was facing to the left,” and to indicate as quickly as possible whether each was true or false. Some participants were allowed to let their eyes roam about freely; others were asked to fix their gaze on a cross at the centre of the screen, or on the corner where the object had appeared, for example.

The researchers found that those who were allowed to move their eyes spontaneously during recall performed significantly better than those who fixed on the cross. Interestingly, though, participants who were told to fix their gaze in the corner of the screen in which objects had appeared earlier performed better than those told to fix their gaze in another corner. This suggests that the more closely the participants’ eye movements during information encoding corresponded with those that occurred during retrieval of the information, the better they were at remembering the objects. Perhaps that’s because eye movements help us to recall the spatial relationships between objects in the environment at the time of encoding.

These eye movements can occur unconsciously. “When people are looking at scenes they have encountered before, their eyes are frequently drawn to information they have already seen, even when they have no conscious memory of it,” says Roger Johansson, a psychologist at Lund University who led the study.

Watching eye movements can also be used to nudge people’s decisions. One recent study showed – maybe worryingly – that eye-tracking can be exploited to influence the moral decisions we take.

Researchers asked participants complex moral questions such as “Can murder ever be justified?” and then displayed, on a computer screen, alternative answers (“sometimes justifiable” or “never justifiable”). By tracking the participants’ eye movements, and removing the two answer options immediately after a participant had spent a certain amount of time gazing at one of the two options, the researchers found that they could nudge the participants to provide that particular option as their answer.
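The manipulation described here amounts to a dwell-time trigger: accumulate how long the participant has looked at each option and force the response the moment one option crosses a threshold. A minimal sketch of that loop follows; the sampling interval and threshold are assumed values, as the experiment’s actual timing parameters are not given in this article:

```python
DWELL_THRESHOLD = 0.75  # seconds of cumulative gaze; an assumed value

def run_trial(gaze_samples, dt=0.05):
    """Gaze-contingent interruption sketch.  `gaze_samples` is an
    iterable of per-sample gaze targets: 0 or 1 for one of the two
    answer options, or None when the gaze is elsewhere.  `dt` is the
    assumed sampling interval in seconds.  Returns the index of the
    option that first accumulates DWELL_THRESHOLD seconds of gaze
    (the moment the experimenters would demand an answer), or None
    if neither option reaches the threshold."""
    dwell = [0.0, 0.0]
    for target in gaze_samples:
        if target is not None:
            dwell[target] += dt
            if dwell[target] >= DWELL_THRESHOLD:
                return target  # interrupt and prompt the answer now
    return None
```

The trick, as Richardson notes below, is not feeding participants information but choosing the moment of interruption so that their own unfolding preference is captured mid-deliberation.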

“We didn’t give them any more information,” says neuroscientist Daniel Richardson of University College London, senior author of the study. “We simply waited for their own decision-making processes to unfold and interrupted them at exactly the right point. We made them change their minds just by controlling when they made the decision.”

Richardson adds that successful salespeople may have some insight into this, and use it to be more persuasive with clients. “We think of persuasive people as good talkers, but maybe they’re also observing the decision-making process,” he says. “Maybe good salespeople can spot the exact moment you’re wavering towards a certain choice, and then offer you a discount or change their pitch.”

The ubiquity of eye-tracking apps for smartphones and other hand-held devices raises the possibility of altering people’s decision-making process remotely. “If you’re shopping online,” Richardson says, “they might bias your decision by offering free shipping at the moment you shift your gaze to a particular product.”

Thus, eye movements can both reflect and influence higher mental functions such as memory and decision-making, and betray our thoughts, beliefs, and desires. This knowledge may give us ways of improving our mental functions – but it also leaves us vulnerable to subtle manipulation by other people.

“The eyes are like a window into our thought processes, and we just don’t appreciate how much information might be leaking out of them,” says Richardson. “They could potentially reveal things that a person might want to suppress, such as implicit racial bias.”

“I can see eye-tracking apps being used for, say, supportive technologies that figure out what phone function you need and then help out,” he adds, “but if they’re left on all the time they could be used to track all sorts of other things. This would provide much richer information, and raises the possibility of unwittingly sharing our thoughts with others.”

http://www.bbc.com/future/story/20150521-how-the-eyes-betray-your-thoughts