New research shows that being forgetful is a sign of unusual intelligence

By Timothy Roberts

Being able to recall memories, whether short-term or long-term, is something we all need in life. It comes in handy when we are studying at school or trying to remember where we left our keys. We also rely on memory at work, and remembering somebody’s name is certainly a good thing.

Although many of us may consider ourselves to have a good memory, we are all going to forget things from time to time. When it happens, we might feel as if we are slipping, but there may be more behind it than we realize.

Imagine this scenario: you go to the grocery store to pick up three items and suddenly forget why you came. Even worse, you may walk from one room to another and forget why you got up in the first place!

If you often struggle with these types of lapses, you will be happy to learn that there is probably nothing wrong with you. In fact, a study published in the journal Neuron has some rather good news. It suggests that forgetting is part of a brain process that might actually leave you smarter by the end of the day.

Researchers at the University of Toronto discovered that having a perfect memory doesn’t necessarily reflect your level of intelligence.

You might even be surprised to learn that when you forget details on occasion, it can make you smarter.

Most people assume that remembering more means you are smarter.

According to the study, however, when you forget a detail on occasion, it’s perfectly normal. It has to do with remembering the big picture compared to remembering little details. Remembering the big picture is better for the brain and for our safety.

Our brains are perhaps more of a computer than many of us think. The hippocampus, which is the part of the brain where memories are stored, tends to filter out the unnecessary details.

In other words, it helps us to “optimize intelligent decision making by holding onto what’s important and letting go of what’s not.”

Think about it this way: is it easier to remember somebody’s face or their name? And which is more important?

In a social setting, it is typically better to remember both, but if we were part of the animal kingdom, remembering somebody as a threat could mean the difference between life and death. Remembering their name would be inconsequential.

The brain doesn’t automatically decide what we should remember and what we shouldn’t. It holds new memories but it sometimes overwrites old memories.

When the brain becomes cluttered with memories, they tend to conflict with each other and that can make it difficult to make important decisions.

That is why the brain tends to hold on to those big-picture memories, though even these are becoming less important with the advent of technology.

As an example, at one time, we would have learned how to spell words but now, we just use Google if we don’t know how to spell them. We also tend to look everything up online, from how to change a showerhead to how to cook meatloaf for dinner.

If you forget everything, you may want to consider having a checkup but if you forget things on occasion, it’s perfectly okay.

The moral of the story is, the next time you forget something, just think of it as your brain doing what it was designed to do.

http://wetpaintlife.com/scientists-say-that-being-forgetful-is-actually-a-sign-you-are-unusually-intelligent/?utm_source=vn&utm_tracking=11&utm_medium=Social

Could a Dose of Sunshine Make You Smarter?

By Ruth Williams

The sun’s ultraviolet (UV) radiation is a major cause of skin cancer, but it offers some health benefits too, such as boosting production of essential vitamin D and improving mood. A recent report in Cell adds enhanced learning and memory to UV’s unexpected benefits.

Researchers have discovered that, in mice, exposure to UV light activates a molecular pathway that increases production of the brain chemical glutamate, heightening the animals’ ability to learn and remember.

“The subject is of strong interest, because it provides additional support for the recently proposed theory of ultraviolet light’s regulation of the brain and central neuroendocrine system,” dermatologist Andrzej Slominski of the University of Alabama who was not involved in the research writes in an email to The Scientist.

“It’s an interesting and timely paper investigating the skin-brain connection,” notes skin scientist Martin Steinhoff of University College Dublin’s Center for Biomedical Engineering who also did not participate in the research. “The authors make an interesting observation linking moderate UV exposure to . . . [production of] the molecule urocanic acid. They hypothesize that this molecule enters the brain, activates glutaminergic neurons through glutamate release, and that memory and learning are increased.”

While the work is “fascinating, very meticulous, and extremely detailed,” says dermatologist David Fisher of Massachusetts General Hospital and Harvard Medical School, “it does not imply that UV is actually good for you. . . . Across the board, for humanity, UV really is dangerous.”

Wei Xiong of the University of Science and Technology of China who led the research did not set out to investigate the effects of UV light on the brain or the skin-brain connection. He stumbled upon his initial finding “almost accidentally,” he explains in an email to The Scientist. Xiong and his colleagues were using a mass spectrometry technique they had recently developed for analyzing the molecular contents of single neurons, when their results revealed the unexpected presence of urocanic acid—a little-known molecule produced in the skin in response to UV light.

“It was a surprise because we checked through all the literature and found no reports of the existence of this small molecule in the central nervous system,” writes Xiong.

With little information to go on, Xiong and his colleagues decided to see whether UV light could also boost levels of urocanic acid in the brain. They exposed shaved mice to a low dose of UVB—responsible for sunburn in humans—for 2 hours, then performed mass spectrometry on the animals’ individual brain cells. Sure enough, levels of urocanic acid increased in neurons of the animals exposed to the light, but not in those of control animals.

Urocanic acid can absorb UV rays and, as a result, may be able to protect skin against the sun’s harmful effects. But in the liver and other peripheral tissues, the acid is also known to be an intermediate molecule generated in the metabolic pathway that converts histidine to glutamate. Given glutamate’s role in the brain as an excitatory neurotransmitter, Xiong and his colleagues were interested to test whether the observed UV-dependent increase in urocanic acid in neurons might be coupled with increased glutamate production. It was.

Next, the team showed that UV light enhanced electrical transmission between glutaminergic neurons in brain slices taken from animals exposed to UV, but not in those from control animals. This UV-induced effect was prevented when the researchers inhibited activity of the enzyme urocanase, which converts urocanic acid to glutamate, indicating that the acid was indeed the mediator of the UV-induced boost in glutaminergic activity.

Lastly, the team showed that mice exposed to UV performed better in motor learning and recognition memory tasks than their unexposed counterparts. And, as before, treating the animals with a urocanase inhibitor prevented the UV-induced improvements in learning and memory. Administering urocanic acid directly to animals not exposed to ultraviolet light also spurred similar learning and memory improvements to those achieved with UV exposure.

Whether the results obtained in mice, which are nocturnal and rarely see the sun, will hold true in humans is yet to be determined. But, Fisher says, if the results do hold, the finding that urocanic acid alone can enhance learning and memory might suggest “a way to utilize this information to benefit people without exposing them to the damaging effects of UV.”

H. Zhu et al., “Moderate UV exposure enhances learning and memory by promoting a novel glutamate biosynthetic pathway in the brain,” Cell, doi: 10.1016/j.cell.2018.04.014, 2018.

https://www.the-scientist.com/?articles.view/articleNo/54603/title/Could-a-Dose-of-Sunshine-Make-You-Smarter-/

New study suggests that living in dim light can affect our brains

By Andy Henion

Spending too much time in dimly lit rooms and offices may actually change the brain’s structure and hurt one’s ability to remember and learn, indicates groundbreaking research by Michigan State University neuroscientists.

The researchers studied the brains of Nile grass rats (which, like humans, are diurnal and sleep at night) after exposing them to dim and bright light for four weeks. The rodents exposed to dim light lost about 30 percent of capacity in the hippocampus, a critical brain region for learning and memory, and performed poorly on a spatial task they had trained on previously.

The rats exposed to bright light, on the other hand, showed significant improvement on the spatial task. Further, when the rodents that had been exposed to dim light were then exposed to bright light for four weeks (after a month-long break), their brain capacity – and performance on the task – recovered fully.

The study, funded by the National Institutes of Health, is the first to show that changes in environmental light, in a range normally experienced by humans, lead to structural changes in the brain. Americans, on average, spend about 90 percent of their time indoors, according to the Environmental Protection Agency.

“When we exposed the rats to dim light, mimicking the cloudy days of Midwestern winters or typical indoor lighting, the animals showed impairments in spatial learning,” said Antonio “Tony” Nunez, psychology professor and co-investigator on the study. “This is similar to when people can’t find their way back to their cars in a busy parking lot after spending a few hours in a shopping mall or movie theater.”

Nunez collaborated with Lily Yan, associate professor of psychology and principal investigator on the project, and Joel Soler, a doctoral graduate student in psychology. Soler is also lead author of a paper on the findings published in the journal Hippocampus.

Soler said sustained exposure to dim light led to significant reductions in a substance called brain-derived neurotrophic factor – a peptide that helps maintain healthy connections and neurons in the hippocampus – and in dendritic spines, or the connections that allow neurons to “talk” to one another.

“Since there are fewer connections being made, this results in diminished learning and memory performance that is dependent upon the hippocampus,” Soler said. “In other words, dim lights are producing dimwits.”

Interestingly, light does not directly affect the hippocampus, meaning it acts first on other sites within the brain after passing through the eyes. Yan said the research team is investigating one potential site in the rodents’ brains – a group of neurons in the hypothalamus that produce a peptide called orexin that’s known to influence a variety of brain functions. One of their major research questions: If orexin is given to the rats that are exposed to dim light, will their brains recover without being re-exposed to bright light?

The project could have implications for the elderly and people with glaucoma, retinal degeneration or cognitive impairments.

“For people with eye disease who don’t receive much light, can we directly manipulate this group of neurons in the brain, bypassing the eye, and provide them with the same benefits of bright light exposure?” Yan said. “Another possibility is improving the cognitive function in the aging population and those with neurological disorders. Can we help them recover from the impairment or prevent further decline?”

http://msutoday.msu.edu/news/2018/does-dim-light-make-us-dumber/

This is How You Raise Successful Teens, and It Starts When They’re Toddlers With a 3-Part Adaptive Cascade

by Drake Baer, Senior writer at Thrive Global covering the brain and social sciences.

Teachers, parents and policymakers are finally starting to realize that academic success depends on more than just “booksmarts,” the kind of fluid intelligence captured by IQ tests and the like. Recognition of the importance of “soft” or “non-cognitive” skills like grit and emotional intelligence is growing rapidly. But there’s a deeper question here: where do these soft skills come from? According to a new paper in Psychological Science, it’s your mom.

The research team, led by Lilian Dindo, a clinical psychologist at the Baylor College of Medicine, crossed disciplines and decades to discover what they describe as an “adaptive cascade” that happens in three parts, drawing a line from the relational experiences we have as infants to the academic achievements we have later on. “That having a supportive responsive caregiving environment can actually provide these inner resources that will foster something like effortful control, and that this in turn can actually promote better functioning in school is the new thing here,” she tells Thrive Global.

The first part of that cascade is “secure attachment.” Tots—in this study, one cohort of 9-month-olds and another of two- to three-year-olds—get strongly influenced by their primary caregivers, implicitly learning how relationships work (often called attachment in the psychology field).

In this study, the mothers rated their children’s security of attachment using a widely used assessment tool. “If a child is distressed and shows distress to a parent and the parent responds to the distress in sensitive and loving and reassuring ways the child then feels secure in their knowledge that they can freely express this negative emotion,” Dindo explained. “Learning in that way is very different than learning that if I express negative emotion then I will be rejected or minimized or ignored or ridiculed. And so the child will learn not to express the negative emotions, to inhibit that negative emotion, or to actually act up even more to try to get that response. Either way they’re learning that expressing this negative emotion will not be responded to in a sensitive or loving way.”

Think of it this way: if you ate at a restaurant and it made you sick, you’d be unlikely to go back; if you expressed hurt and your mom rejected it, you’d minimize that pain next time. Even very early in life, kids are already observing cause and effect.

Step two in the cascade is effortful control, or the ability to delay gratification and inhibit a response to something when it’s in your best interest to do so—it’s the toddler-aged forerunner of things like grit and conscientiousness. In this study, effortful control in toddlers was examined experimentally—for example, in a “snack delay” task where tykes are presented with a cup of Goldfish crackers and instructed to wait to eat them until the experimenter rings a bell—and through parental ratings of how well the kids controlled themselves at home.

Then comes the third part of the cascade: academic achievement. More than a decade after the first experiments, Dindo tracked down the mother-child duos. About two-thirds of each cohort participated in the follow-up, where moms sent in their now 11- to 15-year-old kids’ scores on a couple of different academic standardized tests. The researchers crunched the data from all of the experiments and found quite the developmental chain: secure attachment was associated with effortful control in toddlers, and in turn, effortful control at age 3 predicted better test scores in early adolescence.
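
In statistical terms, that chain is a mediation model: attachment predicts effortful control, which in turn predicts achievement. The sketch below is purely illustrative (simulated data, invented variable names, and a simple regression-based estimate rather than the authors’ actual analysis), but it shows how such an indirect effect can be quantified:

```python
# Hypothetical sketch of a simple regression-based mediation analysis.
# Data are simulated for illustration; this is NOT the study's actual method.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
attachment = rng.normal(size=n)                               # attachment security (toddlerhood)
effortful_control = 0.5 * attachment + rng.normal(size=n)     # mediator
achievement = 0.4 * effortful_control + 0.1 * attachment + rng.normal(size=n)

# Path a: attachment -> effortful control
a = sm.OLS(effortful_control, sm.add_constant(attachment)).fit().params[1]

# Path b and direct effect c': achievement ~ effortful control + attachment
X = sm.add_constant(np.column_stack([effortful_control, attachment]))
fit = sm.OLS(achievement, X).fit()
b, c_prime = fit.params[1], fit.params[2]

print(f"indirect (mediated) effect a*b = {a * b:.2f}, direct effect c' = {c_prime:.2f}")
```

The product a*b is the indirect, or mediated, path; the abstract reproduced below reports that effortful control mediated the attachment-achievement link in just this sense.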

While this study doesn’t explain the mechanics of that three-part cascade, Dindo thinks it has to do with how we learn to regard our own inner emotional lives from the way our moms (or primary caregivers) regard us. If mom is soothing and dependable, you learn to consistently do the same for yourself—you learn that you’re going to be okay even if you feel anxious in the moment, like when tackling homework or a test. To Dindo, this shows how coming from a psychologically or emotionally deprived environment can have long-term consequences: if you don’t get the loving attentiveness you need when you’re little, it’s going to be harder to succeed as you grow up.

In very hopeful news though, other studies out this year—like here (https://www.ncbi.nlm.nih.gov/pubmed/28401843) and here (https://www.ncbi.nlm.nih.gov/pubmed/28401847) —show that when parents get attachment interventions, or are coached to be more attentive to their toddlers, the kids’ effortful control scores go up, which should, in turn, lead to greater achievement down the line. Because as this line of research is starting to show, just like plants need sunlight to grow into their fullest forms, humans need skillful love to reach their full potential.

https://www.thriveglobal.com/stories/15459-this-is-how-you-raise-successful-teens

https://www.ncbi.nlm.nih.gov/pubmed/29023183

Psychol Sci. 2017 Oct 1:956797617721271. doi: 10.1177/0956797617721271. [Epub ahead of print]

Attachment and Effortful Control in Toddlerhood Predict Academic Achievement Over a Decade Later.

Dindo L, Brock RL, Aksan N, Gamez W, Kochanska G, Clark LA.

Abstract

A child’s attachment to his or her caregiver is central to the child’s development. However, current understanding of subtle, indirect, and complex long-term influences of attachment on various areas of functioning remains incomplete. Research has shown that (a) parent-child attachment influences the development of effortful control and that (b) effortful control influences academic success. The entire developmental cascade among these three constructs over many years, however, has rarely been examined. This article reports a multimethod, decade-long study that examined the influence of mother-child attachment and effortful control in toddlerhood on school achievement in early adolescence. Both attachment security and effortful control uniquely predicted academic achievement a decade later. Effortful control mediated the association between early attachment and school achievement during adolescence. This work suggests that attachment security triggers an adaptive cascade by promoting effortful control, a vital set of skills necessary for future academic success.

KEYWORDS: academic performance; attachment; effortful control; longitudinal; temperament

PMID: 29023183 DOI: 10.1177/0956797617721271

How your face betrays your personality, health and intelligence

By David Robson

You might expect a great philosopher to look past our surface into the depths of the soul – but Ancient Greek thinkers were surprisingly concerned with appearance. Aristotle and his followers even compiled a volume of the ways that your looks could reflect your spirit.

“Soft hair indicates cowardice and coarse hair courage,” they wrote. Impudence, the treatise says, was evident in “bright, wide-open eyes with heavy blood-shot lids”; a broad nose, meanwhile, was a sign of laziness, like in cattle.

Sensuous, fleshy lips fared little better. The philosophers saw them as a sign of folly, “like an ass”, while those with especially thin mouths were thought to be proud, like lions.

Today, we are taught not to judge a book by its cover. But while it is wise not to set too much store by appearances, psychologists are finding that your face offers a window onto your deepest secrets. Even if you keep a stony poker face, your features can reveal details about your personality, your health, and your intelligence.

“The idea is that our biology, like genes and hormone levels, influences our growth, and the same mechanisms will also shape our character,” explains Carmen Lefevre at Northumbria University.

Consider the face’s bone structure – whether it is relatively short and wide or long and thin. Lefevre has found that people with higher levels of testosterone tend to be wider-faced with bigger cheekbones, and they are also more likely to have more assertive, and sometimes aggressive, personalities.

The link between face shape and dominance is surprisingly widespread, from capuchin monkeys – the wider the face, the more likely they are to hold a higher rank in the group’s hierarchy – to professional football players. Examining the 2010 World Cup, Keith Welker at the University of Colorado Boulder recently showed that the ratio of the width and height of the footballers’ faces predicted both the number of fouls among midfielders, and the number of goals scored by the forwards.

(To calculate this yourself, compare the distance from ear to ear with the distance between the top of your eyes and your upper lip. The average ratio of width to height is around 2; Abraham Lincoln’s was 1.93.)
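
For readers who want to try the arithmetic on a photo, here is a minimal sketch; the landmark coordinates are invented purely for illustration, and width and height are measured as described above (ear to ear, and top of the eyes to the upper lip):

```python
# Rough facial width-to-height ratio from landmark coordinates.
# All (x, y) pixel positions below are hypothetical examples.

def width_to_height_ratio(left_ear, right_ear, eye_top, upper_lip):
    """Horizontal ear-to-ear distance divided by vertical eye-top-to-upper-lip distance."""
    width = abs(right_ear[0] - left_ear[0])
    height = abs(upper_lip[1] - eye_top[1])
    return width / height

# Example with made-up landmark positions:
ratio = width_to_height_ratio(
    left_ear=(50, 200), right_ear=(243, 200), eye_top=(146, 150), upper_lip=(146, 250)
)
print(f"width-to-height ratio: {ratio:.2f}")  # 1.93, around the value reported for Lincoln
```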

It may even clue you in to a politician’s motives. Using volunteers to rate former US presidents on different psychological attributes, Lefevre found that face shape seemed to reflect their perceived ambition and drive. John F Kennedy had a thicker-set face than 19th Century Chester Arthur, for instance. Such analyses of historical figures are perhaps best taken with a pinch of salt, however, and it has to be said that other traits, such as cooperation and intelligence, should be equally important for success.

As you might expect, your health and medical history are also written in your countenance – and the detail it offers is surprising. The amount of fat on your face, for instance, provides a stronger indication of your fitness than more standard measures, such as your body mass index. Those with thinner faces are less likely to suffer infections, and when they do, the illness is less severe; they also tend to have lower rates of depression and anxiety, probably because mental health is often closely related to the body’s fitness in general.

How could the plumpness of your cheeks say so much about you? Benedict Jones at the University of Glasgow thinks a new understanding of fat’s role in the body may help explain it. “How healthy you are isn’t so much about how much fat you have, but where you have that fat,” he says. Pear-shaped people, with more weight around the hips and bottom but slimmer torsos, tend to be healthier than “apples” with a spare tyre around the midriff, since the adipose tissue around the chest is thought to release inflammatory molecules that can damage the core organs. Perhaps the fullness of your face reflects the fatty deposits in the more harmful areas, Jones says. Or it could be that facial fat is itself dangerous for some reason.

Besides these more overt cues, very subtle differences in skin colour can also reveal your health secrets. Jones and Lefevre emphasise this has nothing to do with the tones associated with ethnicity, but barely-noticeable tints that may reflect differences in lifestyle. You appear to be in more robust health, for instance, if your skin has a slightly yellowish, golden tone. The pigments in question are called carotenoids, which, as the name suggests, can be found in orange and red fruit and veg. Carotenoids help build a healthy immune system, says Lefevre. “But when we’ve eaten enough, they layer in the skin and dye it yellow. We exhibit them, because we haven’t used them to battle illness.” The glow of health, in turn, contributes significantly to your physical attractiveness – more so, in fact, than the more overt tones that might accompany a trip to the tanning salon.

A blush of pink, meanwhile, should signal the good circulation that comes with an active lifestyle – and it might also be a sign of a woman’s fertility. Jones has found that women tend to adopt a slightly redder flush at the peak of the menstrual cycle, perhaps because estradiol, a sex hormone, leads the blood vessels in the cheek to dilate slightly. It may be one of many tiny shifts in appearance and behaviour that together make a woman slightly more attractive when she is most likely to conceive.

As Jones points out, these secrets were hiding in plain sight – yet we were slow to uncover them. At the very least, this knowledge helps restore the reputation of “physiognomy”, which has suffered a somewhat chequered reputation since Aristotle’s musings. Tudor king Henry VIII was so sceptical of the idea that he even banned quack “professors” from profiting from their readings, and its status took a second bashing in the early 20th Century, when it was associated with phrenology – the mistaken idea that lumps and bumps on your head can predict your behaviour.

But now the discipline is gaining credibility, we may find that there are many more surprises hiding in your selfies. Intriguingly, we seem to be able to predict intelligence from someone’s face with modest accuracy – though it’s not yet clear what specific cues make someone look smart. (Needless to say, it is not as simple as whether or not they wear glasses.) Others are examining the “gaydar”. We often can guess someone’s sexual orientation within a split-second, even when there are no stereotypical clues, but it’s still a mystery as to what we’re actually reading. Further research might explain exactly how we make these snap judgements.

It will also be interesting to see how the link between personality, lifestyle and appearance changes across the lifetime. One study managed to examine records of personality and appearance, following subjects from the 1930s to the 1990s. The scientists found that although baby-faced men tended to be less dominant in their youth, they grew to be more assertive as the years wore on – perhaps because they learnt to compensate for the expectations brought about by their puppyish appearance.

More intriguingly, the authors also found evidence of a “Dorian Gray effect” – where the ageing face began to reflect certain aspects of the personality that hadn’t been obvious when the people were younger. Women who had more attractive, sociable, personalities from adolescence to their 30s slowly started to climb in physical attractiveness, so that in their 50s they were considered better-looking than those who had been less personable but naturally prettier. One possibility is that they simply knew how to make the best of their appearance, and that their inner confidence was reflected in subtle differences in expression.

After all, there is so much more to our appearance than the bone structure and skin tone, as one particularly clever study recently demonstrated. The scientists asked volunteers to wear their favourite clothes, and then took a photo of their face. Even though the clothes themselves were not visible in the mugshots, impartial judges considered them to be considerably more attractive than other pictures of the participants. The finding is particularly striking, considering that they were asked to keep neutral expressions: somehow, the boosted self-esteem shone through anyway.

Our faces aren’t just the product of our biology. We can’t change our genes or our hormones – but by cultivating our personality and sense of self-worth, our faces may begin to mirror something far more important.

http://www.bbc.com/future/story/20150312-what-the-face-betrays-about-you

Octopus DNA reveals secrets to intelligence

The elusive octopus genome has finally been untangled, which should allow scientists to discover answers to long-mysterious questions about the animal’s alienlike physiology: How does it camouflage itself so expertly? How does it control—and regenerate—those eight flexible arms and thousands of suckers? And, most vexing: How did a relative of the snail get to be so incredibly smart—able to learn quickly, solve puzzles and even use tools?

The findings, in Nature, reveal a vast, unexplored landscape full of novel genes, unlikely rearrangements—and some evolutionary solutions that look remarkably similar to those found in humans.

With the largest-known genome in the invertebrate world—similar in size to that of a house cat (2.7 billion base pairs) and with more genes (33,000) than humans (20,000 to 25,000)—the octopus sequence has long been known to be large and confusing. Even without a genetic map, these animals and their cephalopod cousins (squids, cuttlefishes and nautiluses) have been common subjects for neurobiology and pharmacology research. But a sequence for this group of mollusks has been “sorely needed,” says Annie Lindgren, a cephalopod researcher at Portland State University who was not involved in the new research. “Think about trying to assemble a puzzle, picture side down,” she says of octopus research to date. “A genome gives us a picture to work with.”

Among the biggest surprises contained within the genome—eliciting exclamation point–ridden e-mails from cephalopod researchers—is that octopuses possess a large group of familiar genes that are involved in developing a complex neural network and have been found to be enriched in other animals, such as mammals, with substantial processing power. Known as protocadherin genes, they “were previously thought to be expanded only in vertebrates,” says Clifton Ragsdale, an associate professor of neurobiology at the University of Chicago and a co-author of the new paper. Such genes join the list of independently evolved features we share with octopuses—including camera-type eyes (with a lens, iris and retina), closed circulatory systems and large brains.

Having followed such a vastly different evolutionary path to intelligence, however, the octopus nervous system is an especially rich subject for study. “For neurobiologists, it’s intriguing to understand how a completely distinct group has developed big, complex brains,” says Joshua Rosenthal of the University of Puerto Rico’s Institute of Neurobiology. “Now with this paper, we can better understand the molecular underpinnings.”

Part of octopuses’ sophisticated wiring system—which extends beyond the brain and is largely distributed throughout the body—controls their blink-of-an-eye camouflage. Researchers have been unsure how octopuses orchestrate their chromatophores, the pigment-filled sacs that expand and contract in milliseconds to alter their overall color and patterning. But with the sequenced genome in hand, scientists can now learn more about how this flashy system works—an enticing insight for neuroscientists and engineers alike.

Also contained in the octopus genome (represented by the California two-spot octopus, Octopus bimaculoides) are numerous previously unknown genes—including novel ones that help the octopus “taste” with its suckers. Researchers can also now peer deeper into the past of this rarely fossilized animal’s evolutionary history—even beyond their divergence with squid some 270 million years ago. In all of that time octopuses have become adept at tweaking their own genetic codes (known as RNA editing, which occurs in humans and other animals but at an extreme rate in octopuses), helping them keep nerves firing on cue at extreme temperatures. The new genetic analysis also found genes that can move around on the genome (known as transposons), which might play a role in boosting learning and memory.

One thing not found in the octopus genome, however, is evidence that its code had undergone wholesale duplication (as the genome of vertebrates had, which allowed the extra genes to acquire new functions). This was a surprise to researchers who had long marveled at the octopus’s complexity—and repeatedly stumbled over large amounts of repeated genetic code in earlier research.

The size of the octopus genome, combined with the large number of repeating sequences and, as Ragsdale describes, a “bizarre lack of interest from many genomicists,” made the task a challenging one. He was among the dozens of researchers who banded together in early 2012 to form the Cephalopod Sequencing Consortium, “to address the pressing need for genome sequencing of cephalopod mollusks,” as they noted in a white paper published later that year in Standards in Genomic Sciences.

The full octopus genome promises to make a splash in fields stretching from neurobiology to evolution to engineering. “This is such an exciting paper and a really significant step forward,” says Lindgren, who studies relationships among octopuses, which have evolved to inhabit all of the world’s oceans—from warm tidal shallows to the freezing Antarctic depths. For her and other cephalopod scientists, “having a whole genome is like suddenly getting a key to the biggest library in the world that previously you could only look into by peeking through partially blocked windows.”

http://www.scientificamerican.com/article/octopus-genome-reveals-secrets-to-complex-intelligence/

Human intelligence is withering as computers do more, but there’s a solution.


Computers are taking over the kinds of knowledge work long considered the preserve of well-educated, well-trained professionals.

By Nicholas Carr

Artificial intelligence has arrived. Today’s computers are discerning and sharp. They can sense the environment, untangle knotty problems, make subtle judgments and learn from experience. They don’t think the way we think—they’re still as mindless as toothpicks—but they can replicate many of our most prized intellectual talents. Dazzled by our brilliant new machines, we’ve been rushing to hand them all sorts of sophisticated jobs that we used to do ourselves.

But our growing reliance on computer automation may be exacting a high price. Worrisome evidence suggests that our own intelligence is withering as we become more dependent on the artificial variety. Rather than lifting us up, smart software seems to be dumbing us down.

It has been a slow process. The first wave of automation rolled through U.S. industry after World War II, when manufacturers began installing electronically controlled equipment in their plants. The new machines made factories more efficient and companies more profitable. They were also heralded as emancipators. By relieving factory hands of routine chores, they would do more than boost productivity. They would elevate laborers, giving them more invigorating jobs and more valuable talents. The new technology would be ennobling.

Then, in the 1950s, a Harvard Business School professor named James Bright went into the field to study automation’s actual effects on a variety of industries, from heavy manufacturing to oil refining to bread baking. Factory conditions, he discovered, were anything but uplifting. More often than not, the new machines were leaving workers with drabber, less demanding jobs. An automated milling machine, for example, didn’t transform the metalworker into a more creative artisan; it turned him into a pusher of buttons.

Bright concluded that the overriding effect of automation was (in the jargon of labor economists) to “de-skill” workers rather than to “up-skill” them. “The lesson should be increasingly clear,” he wrote in 1966. “Highly complex equipment” did not require “skilled operators. The ‘skill’ can be built into the machine.”

We are learning that lesson again today on a much broader scale. As software has become capable of analysis and decision-making, automation has leapt out of the factory and into the white-collar world. Computers are taking over the kinds of knowledge work long considered the preserve of well-educated, well-trained professionals: Pilots rely on computers to fly planes; doctors consult them in diagnosing ailments; architects use them to design buildings. Automation’s new wave is hitting just about everyone.

Computers aren’t taking away all the jobs done by talented people. But computers are changing the way the work gets done. And the evidence is mounting that the same de-skilling effect that ate into the talents of factory workers last century is starting to gnaw away at professional skills, even highly specialized ones. Yesterday’s machine operators are today’s computer operators.

Just look skyward. Since their invention a century ago, autopilots have helped to make air travel safer and more efficient. That happy trend continued with the introduction of computerized “fly-by-wire” jets in the 1970s. But now, aviation experts worry that we’ve gone too far. We have shifted so many cockpit tasks from humans to computers that pilots are losing their edge—and beginning to exhibit what the British aviation researcher Matthew Ebbatson calls “skill fade.”

In 2007, while working on his doctoral thesis at Cranfield University’s School of Engineering, Mr. Ebbatson conducted an experiment with a group of airline pilots. He had them perform a difficult maneuver in a flight simulator—bringing a Boeing jet with a crippled engine in for a landing in rough weather—and measured subtle indicators of their skill, such as the precision with which they maintained the plane’s airspeed.

When he compared the simulator readings with the aviators’ actual flight records, he found a close connection between a pilot’s adroitness at the controls and the amount of time the pilot had recently spent flying planes manually. “Flying skills decay quite rapidly towards the fringes of ‘tolerable’ performance without relatively frequent practice,” Mr. Ebbatson concluded. But computers now handle most flight operations between takeoff and touchdown—so “frequent practice” is exactly what pilots are not getting.

Even a slight decay in manual flying ability can risk tragedy. A rusty pilot is more likely to make a mistake in an emergency. Automation-related pilot errors have been implicated in several recent air disasters, including the 2009 crashes of Continental Flight 3407 in Buffalo and Air France Flight 447 in the Atlantic Ocean, and the botched landing of Asiana Flight 214 in San Francisco in 2013.

Late last year, a report from a Federal Aviation Administration task force on cockpit technology documented a growing link between crashes and an overreliance on automation. Pilots have become “accustomed to watching things happen, and reacting, instead of being proactive,” the panel warned. The FAA is now urging airlines to get pilots to spend more time flying by hand.

As software improves, the people using it become less likely to sharpen their own know-how. Applications that offer lots of prompts and tips are often to blame; simpler, less solicitous programs push people harder to think, act and learn.

Ten years ago, information scientists at Utrecht University in the Netherlands had a group of people carry out complicated analytical and planning tasks using either rudimentary software that provided no assistance or sophisticated software that offered a great deal of aid. The researchers found that the people using the simple software developed better strategies, made fewer mistakes and developed a deeper aptitude for the work. The people using the more advanced software, meanwhile, would often “aimlessly click around” when confronted with a tricky problem. The supposedly helpful software actually short-circuited their thinking and learning.

The philosopher Hubert Dreyfus of the University of California, Berkeley, wrote in 2002 that human expertise develops through “experience in a variety of situations, all seen from the same perspective but requiring different tactical decisions.” In other words, our skills get sharper only through practice, when we use them regularly to overcome different sorts of difficult challenges.

The goal of modern software, by contrast, is to ease our way through such challenges. Arduous, painstaking work is exactly what programmers are most eager to automate—after all, that is where the immediate efficiency gains tend to lie. In other words, a fundamental tension ripples between the interests of the people doing the automation and the interests of the people doing the work.

Nevertheless, automation’s scope continues to widen. With the rise of electronic health records, physicians increasingly rely on software templates to guide them through patient exams. The programs incorporate valuable checklists and alerts, but they also make medicine more routinized and formulaic—and distance doctors from their patients.

In a study conducted in 2007-08 in upstate New York, SUNY Albany professor Timothy Hoff interviewed more than 75 primary-care physicians who had adopted computerized systems. The doctors felt that the software was impoverishing their understanding of patients, diminishing their “ability to make informed decisions around diagnosis and treatment.”

Harvard Medical School professor Beth Lown, in a 2012 journal article written with her student Dayron Rodriquez, warned that when doctors become “screen-driven,” following a computer’s prompts rather than “the patient’s narrative thread,” their thinking can become constricted. In the worst cases, they may miss important diagnostic signals.

The risk isn’t just theoretical. In a recent paper published in the journal Diagnosis, three medical researchers—including Hardeep Singh, director of the health policy, quality and informatics program at the Veterans Administration Medical Center in Houston—examined the misdiagnosis of Thomas Eric Duncan, the first person to die of Ebola in the U.S., at Texas Health Presbyterian Hospital Dallas. They argue that the digital templates used by the hospital’s clinicians to record patient information probably helped to induce a kind of tunnel vision. “These highly constrained tools,” the researchers write, “are optimized for data capture but at the expense of sacrificing their utility for appropriate triage and diagnosis, leading users to miss the forest for the trees.” Medical software, they write, is no “replacement for basic history-taking, examination skills, and critical thinking.”

Even creative trades are increasingly suffering from automation’s de-skilling effects. Computer-aided design has helped architects to construct buildings with unusual shapes and materials, but when computers are brought into the design process too early, they can deaden the aesthetic sensitivity and conceptual insight that come from sketching and model-building.

Working by hand, psychological studies have found, unlocks designers’ originality, expands their working memory and strengthens their tactile sense. A sketchpad is an “intelligence amplifier,” says Nigel Cross, a design professor at the Open University in the U.K.

When software takes over, manual skills wane. In his book “The Thinking Hand,” the Finnish architect Juhani Pallasmaa argues that overreliance on computers makes it harder for designers to appreciate the subtlest, most human qualities of their buildings. “The false precision and apparent finiteness of the computer image” narrow a designer’s perspective, he writes, which can mean technically stunning but emotionally sterile work. As University of Miami architecture professor Jacob Brillhart wrote in a 2011 paper, modern computer systems can translate sets of dimensions into precise 3-D renderings with incredible speed, but they also breed “more banal, lazy, and uneventful designs that are void of intellect, imagination and emotion.”

We do not have to resign ourselves to this situation, however. Automation needn’t remove challenges from our work and diminish our skills. Those losses stem from what ergonomists and other scholars call “technology-centered automation,” a design philosophy that has come to dominate the thinking of programmers and engineers.

When system designers begin a project, they first consider the capabilities of computers, with an eye toward delegating as much of the work as possible to the software. The human operator is assigned whatever is left over, which usually consists of relatively passive chores such as entering data, following templates and monitoring displays.

This philosophy traps people in a vicious cycle of de-skilling. By isolating them from hard work, it dulls their skills and increases the odds that they will make mistakes. When those mistakes happen, designers respond by seeking to further restrict people’s responsibilities—spurring a new round of de-skilling.

Because the prevailing technique “emphasizes the needs of technology over those of humans,” it forces people “into a supporting role, one for which we are most unsuited,” writes the cognitive scientist and design researcher Donald Norman of the University of California, San Diego.

There is an alternative.

In “human-centered automation,” the talents of people take precedence. Systems are designed to keep the human operator in what engineers call “the decision loop”—the continuing process of action, feedback and judgment-making. That keeps workers attentive and engaged and promotes the kind of challenging practice that strengthens skills.

In this model, software plays an essential but secondary role. It takes over routine functions that a human operator has already mastered, issues alerts when unexpected situations arise, provides fresh information that expands the operator’s perspective and counters the biases that often distort human thinking. The technology becomes the expert’s partner, not the expert’s replacement.

Pushing automation in a more humane direction doesn’t require any technical breakthroughs. It requires a shift in priorities and a renewed focus on human strengths and weaknesses.

Airlines, for example, could program cockpit computers to shift control back and forth between computer and pilot during a flight. By keeping the aviator alert and active, that small change could make flying even safer.

In accounting, medicine and other professions, software could be far less intrusive, giving people room to exercise their own judgment before serving up algorithmically derived suggestions.

When it comes to the computerization of knowledge work, writes John Lee of the University of Iowa, “a less-automated approach, which places the automation in the role of critiquing the operator, has met with much more success” than the typical practice of supplanting human judgment with machine calculations. The best decision-support systems provide professionals with “alternative interpretations, hypotheses, or choices.”

Human-centered automation doesn’t constrain progress. Rather, it guides progress onto a more humanistic path, providing an antidote to the all-too-common, misanthropic view that venerates computers and denigrates people.

One of the most exciting examples of the human-focused approach is known as adaptive automation. It employs cutting-edge sensors and interpretive algorithms to monitor people’s physical and mental states, then uses that information to shift tasks and responsibilities between human and computer. When the system senses that an operator is struggling with a difficult procedure, it allocates more tasks to the computer to free the operator of distractions. But when it senses that the operator’s interest is waning, it ratchets up the person’s workload to capture their attention and build their skills.
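
To make that task-shifting logic concrete, here is a toy sketch of such a controller. Everything in it is hypothetical: the workload signal, thresholds, and task names are invented, and real adaptive-automation systems rely on physiological sensors and far richer models than a single number.

```python
# Toy sketch of adaptive automation: shift tasks between human and computer
# based on an estimated operator workload. Thresholds and the workload value
# are hypothetical and chosen purely for illustration.

def allocate_tasks(tasks, operator_workload, low=0.3, high=0.8):
    """Return (human_tasks, computer_tasks) for a workload estimate in [0, 1]."""
    if operator_workload > high:    # operator overloaded: offload more to the computer
        human_share = 0.25
    elif operator_workload < low:   # operator disengaged: hand back work to keep skills sharp
        human_share = 0.75
    else:                           # normal range: split the work evenly
        human_share = 0.5
    cut = max(1, round(len(tasks) * human_share))
    return tasks[:cut], tasks[cut:]

# Example with made-up tasks and a simulated low-workload reading:
human, computer = allocate_tasks(
    ["monitor", "plan route", "adjust trim", "log data"], operator_workload=0.2
)
print("human:", human)
print("computer:", computer)
```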

We are amazed by our computers, and we should be. But we shouldn’t let our enthusiasm lead us to underestimate our own talents. Even the smartest software lacks the common sense, ingenuity and verve of the skilled professional. In cockpits, offices or examination rooms, human experts remain indispensable. Their insight, ingenuity and intuition, honed through hard work and seasoned real-world judgment, can’t be replicated by algorithms or robots.

If we let our own skills fade by relying too much on automation, we are going to render ourselves less capable, less resilient and more subservient to our machines. We will create a world more fit for robots than for us.

Mr. Carr is the author of “The Shallows: What the Internet Is Doing to Our Brains” and most recently, of “The Glass Cage: Automation and Us.”

Thanks to R. Williams for bringing this to the attention of the It’s Interesting community.

http://online.wsj.com/articles/automation-makes-us-dumb-1416589342