Posts Tagged ‘eye’

by Lacy Cook

This praying mantis isn’t just wearing minuscule 3D glasses for the cute factor, but to help scientists learn more about 3D vision. A Newcastle University team discovered a novel form of 3D vision, or stereo vision, in the insects – and compared human and insect stereo vision for the very first time. Their findings could have implications for visual processing in robots.

Humans aren’t the only creatures with stereo vision, which “helps us work out the distances to the things we see,” according to the university. Cats, horses, monkeys, toads, and owls have it too – but the only insect we know about with 3D vision is the praying mantis. Six Newcastle University researchers obtained new insight into their robust stereo vision with the help of small 3D glasses temporarily attached to the insects with beeswax.

The researchers designed an insect 3D cinema, showing a praying mantis a film of prey. The insects would actually try to catch the prey because the illusion was so convincing. And the scientists were able to take their work to the next level, showing the mantises “complex dot-patterns used to investigate human 3D vision” so they could compare our 3D vision with an insect’s for the first time.

According to the university, humans see 3D in still images by matching details of the image each eye sees. “But mantises only attack moving prey so their 3D doesn’t need to work in still images. The team found mantises don’t bother about the details of the picture but just look for places where the picture is changing…Even if the scientists made the two eyes’ images completely different, mantises can still match up the places where things are changing. They did so even when humans couldn’t.”
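The distinction can be pictured with a toy algorithm (a hedged sketch of the general idea, not the Newcastle team’s actual model): instead of matching pictorial detail between the two eyes’ images, match the places where the image has just changed.

```python
# Toy sketch of change-based stereo matching (illustrative only; all names
# and parameters here are hypothetical, not the Newcastle team's code).
import numpy as np

def change_map(prev_frame, cur_frame, thresh=0.1):
    """Binary map of pixels whose intensity changed between two frames."""
    return (np.abs(cur_frame - prev_frame) > thresh).astype(float)

def disparity_from_change(left_prev, left_cur, right_prev, right_cur, max_shift=20):
    """Horizontal shift that best aligns the two eyes' change maps."""
    left_change = change_map(left_prev, left_cur)
    right_change = change_map(right_prev, right_cur)
    best_shift, best_score = 0, -np.inf
    for shift in range(-max_shift, max_shift + 1):
        shifted = np.roll(right_change, shift, axis=1)   # wrap-around edges ignored for simplicity
        score = np.sum(left_change * shifted)            # overlap of "something moved here" regions
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift   # larger disparity -> target is closer

# Because only the change maps are compared, the two eyes' images can differ
# completely in their details (as in the dot-pattern tests) and still be matched.
```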

The journal Current Biology published their work online last week. Lead author Vivek Nityananda, a behavioral ecologist, described the praying mantis’ stereo vision as “a completely new form of 3D vision.”

Future robots could benefit from these findings: instead of 3D vision based on complex human stereo vision, researchers might be able to take some tips from praying mantis stereo vision, which team member Ghaith Tarawneh said probably doesn’t require a lot of computer processing since insect brains are so small.

https://inhabitat.com/praying-mantises-wearing-tiny-glasses-help-researchers-discover-new-type-of-3d-vision/


Simply moving the eyes triggers the eardrums to move too, says a new study by Duke University neuroscientists.

The researchers found that keeping the head still but shifting the eyes to one side or the other sparks vibrations in the eardrums, even in the absence of any sounds.

Surprisingly, these eardrum vibrations start slightly before the eyes move, indicating that movements of the ears and the eyes are controlled by the same motor commands deep within the brain.

“It’s like the brain is saying, ‘I’m going to move the eyes, I better tell the eardrums, too,’” said Jennifer Groh, a professor in the departments of neurobiology and psychology and neuroscience at Duke.

The findings, which were replicated in both humans and rhesus monkeys, provide new insight into how the brain coordinates what we see and what we hear. They may also lead to a new understanding of hearing disorders, such as difficulty following a conversation in a crowded room.

The paper appeared Jan. 23 in Proceedings of the National Academy of Sciences.

It’s no secret that the eyes and ears work together to make sense of the sights and sounds around us. Most people find it easier to understand somebody if they are looking at them and watching their lips move. And in a famous illusion called the McGurk Effect, videos of lip cues dubbed with mismatched audio cause people to hear the wrong sound.

But researchers are still puzzling over where and how the brain combines these two very different types of sensory information.

“Our brains would like to match up what we see and what we hear according to where these stimuli are coming from, but the visual system and the auditory system figure out where stimuli are located in two completely different ways,” Groh said. “The eyes are giving you a camera-like snapshot of the visual scene, whereas for sounds, you have to calculate where they are coming from based on differences in timing and loudness across the two ears.”
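To give a rough sense of the timing cue Groh describes, using textbook values rather than figures from the study: for a source well off to one side, the extra travel distance to the far ear is roughly the width of the head, so

```latex
% Far-field approximation of the interaural time difference (ITD),
% with ear-to-ear distance d ~ 0.2 m, speed of sound c ~ 343 m/s, azimuth theta.
\mathrm{ITD} \approx \frac{d \sin\theta}{c},
\qquad
\mathrm{ITD}_{\max} \approx \frac{0.2\ \mathrm{m}}{343\ \mathrm{m\,s^{-1}}} \approx 0.6\ \mathrm{ms}.
```

The brain localises sounds from sub-millisecond differences of this kind, a very different computation from reading position off a retinal image.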

Because the eyes are usually darting about within the head, the visual and auditory worlds are constantly in flux with respect to one another, Groh added.

In an experiment designed by Kurtis Gruters, a former doctoral student in Groh’s lab and co-first author on the paper, 16 participants were asked to sit in a dark room and follow shifting LED lights with their eyes. Each participant also wore small microphones in their ear canals that were sensitive enough to pick up the slight vibrations created when the eardrum sways back and forth.

Though eardrums vibrate primarily in response to outside sounds, the brain can also control their movements, using the small muscles attached to the bones of the middle ear and the hair cells of the cochlea. These mechanisms help modulate the volume of sounds that ultimately reach the inner ear and brain, and they produce small sounds known as otoacoustic emissions.

Gruters found that when the eyes moved, both eardrums moved in sync with one another, one side bulging inward at the same time the other side bulged outward. They continued to vibrate back and forth together until shortly after the eyes stopped moving. Eye movements in opposite directions produced opposite patterns of vibrations.

Larger eye movements also triggered bigger vibrations than smaller eye movements, the team found.

“The fact that these eardrum movements are encoding spatial information about eye movements means that they may be useful for helping our brains merge visual and auditory space,” said David Murphy, a doctoral student in Groh’s lab and co-first author on the paper. “It could also signify a marker of a healthy interaction between the auditory and visual systems.”

The team, which included Christopher Shera at the University of Southern California and David W. Smith of the University of Florida, is still investigating how these eardrum vibrations impact what we hear, and what role they may play in hearing disorders. In future experiments, they will look at whether up and down eye movements also cause unique signatures in eardrum vibrations.

“The eardrum movements literally contain information about what the eyes are doing,” Groh said. “This demonstrates that these two sensory pathways are coupled, and they are coupled at the earliest points.”

Cole Jensen, an undergraduate neuroscience major at Duke, also coauthored the new study.

CITATION: “The Eardrums Move When the Eyes Move: A Multisensory Effect on the Mechanics of Hearing,” K. G. Gruters, D. L. K. Murphy, Cole D. Jensen, D. W. Smith, C. A. Shera and J. M. Groh. Proceedings of the National Academy of Sciences, Jan. 23, 2018. DOI: 10.1073/pnas.1717948115

When people are awake, their pupils regularly change in size. Those changes are meaningful, reflecting shifting attention or vigilance, for example. Now, researchers reporting in Current Biology on January 18 have found in studies of mice that pupil size also fluctuates during sleep. They also show that pupil size is a reliable indicator of sleep states.

“We found that pupil size rhythmically fluctuates during sleep,” says Daniel Huber of the University of Geneva in Switzerland. “Intriguingly, these pupil fluctuations follow the sleep-related brain activity so closely that they can indicate with high accuracy the exact stage of sleep—the smaller the pupil, the deeper the sleep.”

Studying pupil size during sleep had always been a challenge for an obvious reason: people and animals generally sleep with their eyes closed. Huber says that he and his colleagues were inspired to study pupil size in sleep after discovering that their laboratory mice sometimes sleep with their eyes open. They knew that pupil size varies strongly during wakefulness. What, they wondered, happened during sleep?

To investigate this question, they developed a novel optical pupil-tracking system for mice. The device includes an infrared light positioned close to the head of the animal. That invisible light travels through the skull and brain to illuminate the back of the eye. When the eyes are imaged with an infrared camera, the pupils appear as bright circles. Thanks to this new method, it was suddenly possible to track changes in pupil size accurately, particularly when the animals snoozed naturally with their eyelids open.
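The core image-processing step behind such a system is simple to sketch (a minimal, hypothetical version, not the Geneva group’s published pipeline): in a back-illuminated infrared frame the pupil is the brightest region, so it can be segmented by thresholding and its size measured frame by frame.

```python
# Minimal sketch of bright-pupil tracking in an infrared video frame.
# Hypothetical code (OpenCV >= 4 API), assuming an 8-bit grayscale image in
# which the back-illuminated pupil is the brightest, roughly circular region.
import cv2
import numpy as np

def pupil_diameter(frame_gray: np.ndarray, thresh: int = 200) -> float:
    """Return the estimated pupil diameter in pixels, or 0.0 if nothing is found."""
    _, bright = cv2.threshold(frame_gray, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(bright, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    pupil = max(contours, key=cv2.contourArea)        # largest bright blob
    _, radius = cv2.minEnclosingCircle(pupil)
    return 2.0 * radius

# Applied to every frame, this yields a pupil-size time series that can be
# lined up with simultaneously recorded sleep-related brain activity.
```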

Their images show that mouse pupils rhythmically fluctuate during sleep and that those fluctuations are not at all random; they correlate with changes in sleep states.

Further experiments showed that changes in pupil size are not just a passive phenomenon, either. They are actively controlled by the parasympathetic branch of the autonomic nervous system. The evidence suggests that, in mice at least, the pupils narrow in deep sleep to protect the animals from being woken by a sudden flash of light.

“The common saying that ‘the eyes are the window to the soul’ might even hold true behind closed eyelids during sleep,” Özge Yüzgeç, the student conducting the study, says. “The pupil continues to play an important role during sleep by blocking sensory input and thereby protecting the brain in periods of deep sleep, when memories should be consolidated.”

Huber says they would like to find out whether the findings hold in humans and whether their new method can be adapted in the sleep clinic. “Inferring brain activity by non-invasive pupil tracking might be an interesting alternative or complement to electrode recordings,” he says.

Reference:

Yüzgeç, Ö., Prsa, M., Zimmermann, R., & Huber, D. (2018). Pupil Size Coupling to Cortical States Protects the Stability of Deep Sleep via Parasympathetic Modulation. Current Biology. doi:10.1016/j.cub.2017.12.049

https://www.technologynetworks.com/neuroscience/news/pupil-size-couples-to-cortical-states-to-protect-deep-sleep-stability-296519?utm_campaign=NEWSLETTER_TN_Neuroscience_2017&utm_source=hs_email&utm_medium=email&utm_content=60184122&_hsenc=p2ANqtz-_uyMIjTK1pmq-79zMcyJIvQNsa8i7gH9l8Tn-_75Taz2opCD4t1otYN6OBmeI-iAKoenGO8wKWNZ7VV6E_JcYum4fHlA&_hsmi=60184122

Last year, doctors of optometry detected more than 320,000 cases of diabetes. Imagine if they could make the same impact when it comes to exposing early signs of Alzheimer’s disease.

November is National Alzheimer’s Disease Awareness Month. An estimated 5.4 million Americans are affected by Alzheimer’s disease, according to the Centers for Disease Control and Prevention (CDC). Projections put the number at 13.8 million by 2050.

Maryke Nijhuis Neiberg, O.D., associate professor in the School of Optometry at the Massachusetts College of Pharmacy and Health Sciences in Worcester, Massachusetts, considers this an unrealized patient education opportunity for doctors of optometry.

“The earlier diagnoses give doctors and patients a better chance at managing the progressive brain disease and preserving the patient’s quality of life,” Dr. Neiberg says. “There has been some increase in Alzheimer’s awareness over the years, particularly in the eye community, but not enough yet.

“Alzheimer’s is a significant future public health issue,” she adds. “It is still a terminal disease.”

Early intervention

Much of the research on Alzheimer’s disease seeks to slow the disease’s progression. For instance, a study published Nov. 6 in Biological Psychiatry by researchers at the University of Iowa and the University of Texas Southwestern Medical Center in Dallas reports a potential new treatment that, in rats, slowed the depression and cognitive decline associated with Alzheimer’s disease without affecting amyloid plaque deposits or reactive glia.

Among the early signs of Alzheimer’s, the researchers say, are anxiety, depression and irritability, which appear long before the devastating effects of memory loss.

“Thus, P7C3 compounds may form the basis for a new class of neuroprotective drugs for mitigating the symptoms in patients with Alzheimer’s disease by preserving neuronal cell survival, irrespective of other pathological events,” researchers say. “P7C3 compounds represent a novel route to treating depression, and new-onset depression in elderly patients may herald the development of Alzheimer’s disease with later cognitive impairments to follow.”

Another study in JAMA Ophthalmology in September by researchers at Stanford University and Veterans Affairs Palo Alto Health Care System linked visual impairment and cognition in older adults and also stressed the “potential importance” of vision screening in identifying these patients’ eye disease and cognitive deficits. The AOA strongly recommends comprehensive eye examinations and stresses the limitations of screenings.

Optometry’s role

According to the CDC:

The rate of Alzheimer’s jumped 50 percent between 1999 and 2014.

Americans fear losing their mental capacity more than losing their physical abilities.

More than $230 billion is estimated to be spent in 2017 on health care, long-term care and hospice care, plus unpaid care for relatives with Alzheimer’s and other dementias.

More large-scale research on Alzheimer’s needs to be done, but progress is being made. Dr. Neiberg pointed to research linking optical coherence tomography (OCT) of the macula to Alzheimer’s and Parkinson’s diseases.

“With the advent of OCT, we now know that the retinal ganglion cell layer thins and that the optic nerve cup-to-disc ratio increases in size, not unlike glaucoma,” Dr. Neiberg says. “Alzheimer’s produces visual field defects that are easily confused with glaucoma. What we need is large-scale research to determine how much of the normal tension glaucoma we diagnose and treat is ultimately related to Alzheimer’s disease.”

She adds, “The early perceptual changes that occur in early Alzheimer’s are startling and measurable. One of the earliest signs is a decline in the Benton Visual Retention Test, a test of visual memory. This test requires the duplication of shapes on paper with a pencil, and is scored.

“Research has shown that this test is able to predict high risk for Alzheimer’s 15 years before diagnosis,” she says. “It’s a simple test many developmental and pediatric optometrists already have on their shelves. If we combine that test and the ocular findings we see, we have a very strong indication that something is indeed amiss. Armed with this information, the patient can then consult with their primary care physician, initiate lifestyle modification and request a referral if necessary.”

There is no cure for Alzheimer’s disease. But doctors of optometry can engage patients in conversation about Alzheimer’s disease and how they can manage their own risk factors, including:

Smoking
Mid-life obesity
Sedentary lifestyle
High-cholesterol diet
Vascular disease (e.g., diabetes and hypertension)

“Lifestyle modification and early access to medication, which can delay the progression of dementia, might be enough to keep the disease at bay for longer,” Dr. Neiberg says. “We should include the Alzheimer’s disease connection when we educate our patients about lifestyle diseases.”

https://finchannel.com/society/health-beauty/69483-doctors-of-optometry-can-spot-early-signs-of-alzheimer-s-disease

Pupil dilation in reaction to negative emotional faces predicts risk for depression relapse, according to new research from Binghamton University, State University of New York.

Researchers at Binghamton University, led by PhD student Anastacia Kudinova, aimed to examine whether physiological reactivity to emotional stimuli, assessed via pupil dilation, served as a biological marker of risk for depression recurrence among individuals known to be at higher risk because of a previous history of depression. Participants were 57 women with a history of major depressive disorder (MDD). The researchers recorded the change in pupil dilation in response to angry, happy, sad and neutral faces. The team found that the women’s pupillary reactivity to negative stimuli (sad or angry faces), but not to positive stimuli, prospectively predicted MDD recurrence.

“The study focuses on trying to identify certain markers of depression risk using measures that are readily accessible, reliable and less expensive,” said Kudinova. “It is something we can put in any doctor’s office that gives us a quick and easy objective measure of risk.”

Additionally, the researchers found that both high and low reactivity to angry faces predicted risk for MDD recurrence. These findings suggest that disrupted physiological response to negative stimuli indexed via pupillary dilation could serve as a physiological marker of MDD risk, thus presenting clinicians with a convenient and inexpensive method to predict which of the at-risk women are more likely to experience depression recurrence.

“It’s a bit complicated because different patterns of findings were found for pupil reactivity to angry versus sad faces. Specifically, really high or really low pupil dilation to angry faces was associated with increased risk whereas only low dilation to sad faces was associated with risk (high dilation to sad faces was actually protective),” said Brandon Gibb, professor of psychology at Binghamton University and director of the Mood Disorders Institute and Center for Affective Science.
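One way to picture the analysis (a hypothetical sketch with synthetic numbers, not the Binghamton data or code) is a logistic model with both a linear and a quadratic pupil-reactivity term, which is one standard way to capture the U-shaped risk pattern described for angry faces.

```python
# Hypothetical sketch: a logistic model in which both unusually high and
# unusually low pupil reactivity raise predicted recurrence risk (a U-shape).
# Synthetic data only; this is not the study's dataset or analysis code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 57                                    # same sample size as the study; values made up
reactivity = rng.normal(0.0, 1.0, n)      # z-scored pupil dilation to angry faces
true_risk = 1.0 / (1.0 + np.exp(-(reactivity**2 - 1.0)))  # extremes -> higher risk
recurred = rng.binomial(1, true_risk)     # simulated recurrence outcomes

X = np.column_stack([reactivity, reactivity**2])          # linear + quadratic terms
model = LogisticRegression().fit(X, recurred)

probe = np.array([[-2.0, 4.0], [0.0, 0.0], [2.0, 4.0]])   # low, average, high reactivity
print(model.predict_proba(probe)[:, 1])   # predicted recurrence probabilities
```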

Other contributors to this research include Katie Burkhouse and Mary Woody, both PhD students; Max Owens, assistant professor of psychology at the University of South Florida, St. Petersburg; and Greg Siegle, associate professor of psychiatry at the University of Pittsburgh School of Medicine.
The paper, “Pupillary reactivity to negative stimuli prospectively predicts recurrence of major depressive disorder in women,” was published in Psychophysiology.

https://www.binghamton.edu/mpr/news-releases/news-release.html?id=2448

By Mo Costandi

It’s sometimes said that the eyes are windows into the soul, revealing deep emotions that we might otherwise want to hide. The eyes not only reflect what is happening in the brain but may also influence how we remember things and make decisions.

Our eyes are constantly moving, and while some of those movements are under conscious control, many of them occur subconsciously. When we read, for instance, we make a series of very quick eye movements called saccades that fixate rapidly on one word after another. When we enter a room, we make larger sweeping saccades as we gaze around. Then there are the small, involuntary eye movements we make as we walk, to compensate for the movement of our head and stabilise our view of the world. And, of course, our eyes dart around during the ‘rapid eye movement’ (REM) phase of sleep.

What is now becoming clear is that some of our eye movements may actually reveal our thought process.

Research published last year shows that pupil dilation is linked to the degree of uncertainty during decision-making: if somebody is less sure about their decision, they feel heightened arousal, which causes the pupils to dilate. This change in the eye may also reveal what a decision-maker is about to say: one group of researchers, for example, found that watching for dilation made it possible to predict when a cautious person used to saying ‘no’ was about to make the tricky decision to say ‘yes’.

Watching the eyes can even help predict what number a person has in mind. Tobias Loetscher and his colleagues at the University of Zurich recruited 12 volunteers and tracked their eye movements while they reeled off a list of 40 numbers.

They found that the direction and size of the participants’ eye movements accurately predicted whether the number they were about to say was bigger or smaller than the previous one – and by how much. Each volunteer’s gaze shifted up and to the right just before they said a bigger number, and down and to the left before a smaller one. The bigger the shift from one side to the other, the bigger the difference between the numbers.

This suggests that we somehow link abstract number representations in the brain with movement in space. But the study does not tell us which comes first: whether thinking of a particular number causes changes in eye position, or whether the eye position influences our mental activity. In 2013, researchers in Sweden published evidence that it’s the latter that may be at work: eye movements may actually facilitate memory retrieval.

They recruited 24 students and asked each one to carefully examine a series of objects displayed to them in one corner of a computer screen. The participants were then told to listen to a series of statements about some of the objects they had seen, such as “The car was facing to the left”, and to indicate as quickly as possible whether each was true or false. Some participants were allowed to let their eyes roam about freely; others were asked to fix their gaze on a cross at the centre of the screen, or on the corner where the object had appeared, for example.

The researchers found that those who were allowed to move their eyes spontaneously during recall performed significantly better than those who fixed on the cross. Interestingly, though, participants who were told to fix their gaze in the corner of the screen in which objects had appeared earlier performed better than those told to fix their gaze in another corner. This suggests that the more closely the participants’ eye movements during information encoding corresponded with those that occurred during retrieval of the information, the better they were at remembering the objects. Perhaps that’s because eye movements help us to recall the spatial relationships between objects in the environment at the time of encoding.

These eye movements can occur unconsciously. “When people are looking at scenes they have encountered before, their eyes are frequently drawn to information they have already seen, even when they have no conscious memory of it,” says Roger Johansson, a psychologist at Lund University who led the study.

Watching eye movements can also be used to nudge people’s decisions. One recent study showed – maybe worryingly – that eye-tracking can be exploited to influence the moral decisions we take.

Researchers asked participants complex moral questions such as “Can murder ever be justified?” and then displayed, on a computer screen, alternative answers (“sometimes justifiable” or “never justifiable”). By tracking the participants’ eye movements, and removing the two answer options immediately after a participant had spent a certain amount of time gazing at one of the two options, the researchers found that they could nudge the participants to provide that particular option as their answer.
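The timing logic behind this manipulation is easy to sketch (a schematic under assumed parameters, not the researchers’ code): track which answer the gaze is resting on and interrupt the moment the accumulated dwell time on one option crosses a threshold.

```python
# Schematic sketch of a gaze-contingent interruption: ask for the answer as soon
# as cumulative dwell time on one option crosses a threshold. The 0.75 s threshold,
# 50 Hz sampling and gaze-label interface are illustrative assumptions only.
from typing import Iterable, Optional

def interrupt_when_dwelled(gaze_samples: Iterable[str],
                           sample_dt: float = 0.02,
                           dwell_threshold: float = 0.75) -> Optional[str]:
    """gaze_samples yields 'A', 'B' or 'away' per sample; returns the option being
    viewed when the dwell threshold is reached, i.e. the moment to remove both options."""
    dwell = {"A": 0.0, "B": 0.0}
    for sample in gaze_samples:
        if sample in dwell:
            dwell[sample] += sample_dt
            if dwell[sample] >= dwell_threshold:
                return sample            # interrupt and ask for the decision now
    return None                          # trial ended without reaching the threshold

# Example: a gaze stream in which option 'B' accumulates enough dwell time.
stream = ["A"] * 20 + ["away"] * 5 + ["B"] * 40
print(interrupt_when_dwelled(stream))    # -> 'B'
```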

“We didn’t give them any more information,” says neuroscientist Daniel Richardson of University College London, senior author of the study. “We simply waited for their own decision-making processes to unfold and interrupted them at exactly the right point. We made them change their minds just by controlling when they made the decision.”

Richardson adds that successful salespeople may have some insight into this, and use it to be more persuasive with clients. “We think of persuasive people as good talkers, but maybe they’re also observing the decision-making process,” he says. “Maybe good salespeople can spot the exact moment you’re wavering towards a certain choice, and then offer you a discount or change their pitch.”

The ubiquity of eye-tracking apps for smartphones and other hand-held devices raises the possibility of altering people’s decision-making processes remotely. “If you’re shopping online, they might bias your decision by offering free shipping at the moment you shift your gaze to a particular product,” he says.

Thus, eye movements can both reflect and influence higher mental functions such as memory and decision-making, and betray our thoughts, beliefs, and desires. This knowledge may give us ways of improving our mental functions – but it also leaves us vulnerable to subtle manipulation by other people.

“The eyes are like a window into our thought processes, and we just don’t appreciate how much information might be leaking out of them,” says Richardson. “They could potentially reveal things that a person might want to suppress, such as implicit racial bias.”

“I can see eye-tracking apps being used for, say, supportive technologies that figure out what phone function you need and then help out,” he adds, “but if they’re left on all the time they could be used to track all sorts of other things. This would provide much richer information, and raises the possibility of unwittingly sharing our thoughts with others.”

http://www.bbc.com/future/story/20150521-how-the-eyes-betray-your-thoughts

By Helen Thomson

“When the tide came in, these kids started swimming. But not like I had seen before. They were more underwater than above water, they had their eyes wide open – they were like little dolphins.”

Deep in the island archipelagos of the Andaman Sea and along the west coast of Thailand live small tribes called the Moken people, also known as sea nomads. Their children spend much of their day in the sea, diving for food. They are uniquely adapted to this job – because they can see underwater. And it turns out that, with a little practice, their unique vision might be accessible to any young person.

In 1999, Anna Gislen, at the University of Lund in Sweden, was investigating different aspects of vision when a colleague suggested that she might be interested in studying the unique characteristics of the Moken tribe. “I’d been sitting in a dark lab for three months, so I thought, ‘yeah, why not go to Asia instead’,” says Gislen.

Gislen and her six-year-old daughter travelled to Thailand and integrated themselves into the Moken communities, who mostly lived in houses set upon poles. When the tide came in, the Moken children splashed around in the water, diving down to pick up food lying metres below – deeper than Gislen or her daughter could see. “They had their eyes wide open, fishing for clams, shells and sea cucumbers, with no problem at all,” she says.

Gislen set up an experiment to test just how good the children’s underwater vision really was. The kids were excited about joining in, says Gislen: “They thought it was just a fun game.”

The kids had to dive underwater and place their heads onto a panel. From there they could see a card displaying either vertical or horizontal lines. Once they had stared at the card, they came back to the surface to report which direction the lines travelled. Each time they dived down, the lines would get thinner, making the task harder. It turned out that the Moken children were able to see twice as well as European children who performed the same experiment at a later date.

What was going on? To see clearly on land, you need to refract the light that enters the eye so that it focuses on the retina. The retina sits at the back of the eye and contains specialised cells, which convert the light signals into electrical signals that the brain interprets as images.

Light is refracted when it enters the human eye because the watery cornea is optically denser than the air outside it – it has a higher refractive index. An internal lens refracts the light even further.

When the eye is immersed in water, whose refractive index is close to that of the cornea, most of the cornea’s refractive power is lost, which is why the image becomes severely blurred.
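The size of that loss can be put in rough numbers with the single-surface refraction formula and typical textbook values (illustrative figures, not measurements from Gislen’s study):

```latex
% Power of a single refracting surface: P = (n2 - n1) / R.
% Typical anterior cornea: n_cornea ~ 1.376, radius of curvature R ~ 7.7 mm.
P_{\text{air}} \approx \frac{1.376 - 1.000}{0.0077\ \mathrm{m}} \approx 49\ \mathrm{D},
\qquad
P_{\text{water}} \approx \frac{1.376 - 1.333}{0.0077\ \mathrm{m}} \approx 6\ \mathrm{D}.
```

On these numbers, some forty dioptres of front-surface focusing power vanish the moment the cornea is immersed – which is the deficit the pupil and lens would have to work around.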

Gislen figured that in order for the Moken children to see clearly underwater, they must either have picked up some adaptation that fundamentally changed the way their eyes worked, or have learned to use their eyes differently under water.

She thought the first theory was unlikely, because a fundamental change to the eye would probably mean the kids wouldn’t be able to see well above water. A simple eye test bore this out: the Moken children could see just as well above water as European children of a similar age.

It had to be some kind of manipulation of the eye itself, thought Gislen. There are two ways in which you can theoretically improve your vision underwater. You can change the shape of the lens – which is called accommodation – or you can make the pupil smaller, thereby increasing the depth of field.
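The pupil route works because the retinal blur produced by any focusing error grows in proportion to pupil size; a standard small-angle approximation (illustrative, not a calculation from the study) is

```latex
% Angular diameter of the retinal blur circle for a defocus of Delta D dioptres
% seen through a pupil of diameter A (in metres):
\beta \approx A \cdot \Delta D \quad \text{(radians)}.
```

Halve the pupil diameter and you halve the blur from a given focusing error, so a strongly constricted pupil makes a badly defocused underwater image look noticeably sharper.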

Their pupil size was easy to measure – and revealed that they can constrict their pupils to the maximum known limit of human performance. But this alone couldn’t fully explain the degree to which their sight improved. This led Gislen to believe that accommodation of the lens was also involved.

“We had to make a mathematical calculation to work out how much the lens was accommodating in order for them to see as far as they could,” says Gislen. This showed that the children had to be able to accommodate to a far greater degree than you would expect to see underwater.

“Normally when you go underwater, everything is so blurry that the eye doesn’t even try to accommodate, it’s not a normal reflex,” says Gislen. “But the Moken children are able to do both – they can make their pupils smaller and change their lens shape. Seals and dolphins have a similar adaptation.”

Gislen was able to test a few Moken adults in the same way. They showed no unusual underwater vision or accommodation – perhaps explaining why the adults in the tribe caught most of their food by spear fishing above the surface. “When we age, our lenses become less flexible, so it makes sense that the adults lose the ability to accommodate underwater,” says Gislen.

Gislen wondered whether the Moken children had a genetic anomaly to thank for their ability to see underwater or whether it was just down to practice. To find out, she asked a group of European children on holiday in Thailand, and a group of children in Sweden to take part in training sessions, in which they dived underwater and tried to work out the direction of lines on a card. After 11 sessions across one month, both groups had attained the same underwater acuity as the Moken children.

“It was different for each child, but at some point their vision would just suddenly improve,” says Gislen. “I asked them whether they were doing anything different and they said, ‘No, I can just see better now’.”

She did notice, however, that the European kids would experience red eyes, irritated by the salt in the water, whereas the Moken children appeared to have no such problem. “So perhaps there is some adaptation there that allows them to dive down 30 times without any irritation,” she says.

Gislen recently returned to Thailand to visit the Moken tribes, but things had changed dramatically. In 2004, a tsunami created by a giant earthquake within the Indian Ocean destroyed much of the Moken’s homeland. Since then, the Thai government has worked hard to move them onto the land, building homes that are further inland and employing members of the tribe to work in the National Park. “It’s difficult,” says Gislen. “You want to help keep people safe and give them the best parts of modern culture, but in doing so they lose their own culture.”

In unpublished work, Gislen tested the same kids that were in her original experiment. The Moken children, now in their late teens, were still able to see clearly underwater. She wasn’t able to test many adults as they were too shy, but she is certain that they would have lost the ability to see underwater as they got older. “The adult eye just isn’t capable of that amount of accommodation,” she says.

Unfortunately, the children in Gislen’s experiments may be the last of the tribe to possess the ability to see so clearly underwater. “They just don’t spend as much time in the sea anymore,” she says, “so I doubt that any of the children that grow up these days in the tribe have this extraordinary vision.”

http://www.bbc.com/future/story/20160229-the-sea-nomad-children-who-see-like-dolphins