New research suggests that other universes may be pulling ours


Is our universe merely one of billions? Evidence for the existence of a ‘multiverse’ has been claimed for the first time, based on a cosmic map built from background radiation data gathered by the Planck telescope. Cosmologists studying the Planck data released this past June say they have found the first ‘hard evidence’ that other universes exist: the map shows anomalies that, they conclude, can only have been caused by the gravitational pull of other universes.

“Such ideas may sound wacky now, just like the Big Bang theory did three generations ago,” says George Efstathiou, professor of astrophysics at Cambridge University. “But then we got evidence, and now it has changed the whole way we think about the universe.”

Scientists had predicted that the background radiation should be evenly distributed across the sky, but the map shows a stronger concentration in the southern half of the sky and a ‘cold spot’ that cannot be explained by our current understanding of physics. In 2005, Laura Mersini-Houghton, a theoretical physicist at the University of North Carolina at Chapel Hill, and Richard Holman, a professor at Carnegie Mellon University, predicted that such anomalies in the radiation existed and were caused by the pull of other universes. Mersini-Houghton will be in Britain soon promoting this theory, and, we expect, the hard evidence, at the Hay Festival on May 31 and at Oxford on June 11.

Dr Mersini-Houghton believes her hypothesis has been confirmed by the Planck data, which has been used to create a map of light from when the universe was just 380,000 years old. “These anomalies were caused by other universes pulling on our universe as it formed during the Big Bang,” she says. “They are the first hard evidence for the existence of other universes that we have seen.”

Columbia University mathematician Peter Woit writes in his blog, Not Even Wrong, that in recent years there have been many claims made for “evidence” of a multiverse, supposedly found in the CMB data. “Such claims often came with the remark that the Planck CMB data would convincingly decide the matter. When the Planck data was released two months ago, I looked through the press coverage and through the Planck papers for any sign of news about what the new data said about these multiverse evidence claims. There was very little there; possibly the Planck scientists found these claims to be so outlandish that it wasn’t worth the time to look into what the new data had to say about them.

“One exception,” Woit adds, “was this paper, where Planck looked for evidence of ‘dark flow’. They found nothing, and a New Scientist article summarized the situation: ‘The Planck team’s paper appears to rule out the claims of Kashlinsky and collaborators,’ says David Spergel of Princeton University, who was not involved in the work. If there is no dark flow, there is no need for exotic explanations for it, such as other universes, says Planck team member Elena Pierpaoli at the University of Southern California, Los Angeles. ‘You don’t have to think of alternatives.’”

“Dark Flow” sounds like a new SciFi Channel series. It’s not! The dark flow is controversial because the distribution of matter in the observed universe cannot account for it. Its existence suggests that some structure beyond the visible universe — outside our “horizon” — is pulling on matter in our vicinity.

Back in the Middle Ages, maps showed terrifying images of sea dragons at the boundaries of the known world. Today, scientists have observed strange new motion at the very limits of the known universe – kind of where you’d expect to find new things, but they still didn’t expect this. A huge swath of galaxy clusters seems to be heading toward a cosmic hotspot, and nobody knows why.

Cosmologists regard the microwave background — a flash of light emitted 380,000 years after the universe formed — as the ultimate cosmic reference frame. Relative to it, all large-scale motion should show no preferred direction. A 2010 study tracked the mysterious cosmic ‘dark flow’ to twice the distance originally reported. The study was led by Alexander Kashlinsky at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

“This is not something we set out to find, but we cannot make it go away,” Kashlinsky said. “Now we see that it persists to much greater distances – as far as 2.5 billion light-years away,” he added.

Dark flow describes a possible non-random component of the peculiar velocity of galaxy clusters. The actual measured velocity is the sum of the velocity predicted by Hubble’s Law plus a small and unexplained (or dark) velocity flowing in a common direction. According to standard cosmological models, the motion of galaxy clusters with respect to the cosmic microwave background should be randomly distributed in all directions. However, analyzing the three-year WMAP data using the kinematic Sunyaev-Zel’dovich effect, the authors of the study found evidence of a “surprisingly coherent” 600–1000 km/s flow of clusters toward a 20-degree patch of sky between the constellations of Centaurus and Vela.
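The velocity decomposition described above can be sketched in a few lines. The Hubble constant and the example cluster below are illustrative assumptions for this sketch, not figures from the study:

```python
# Peculiar ("dark") velocity = measured recession velocity minus the
# velocity predicted by Hubble's Law. All numbers below are illustrative.

H0 = 70.0  # Hubble constant in km/s per Mpc (assumed round value)

def hubble_velocity(distance_mpc):
    """Recession velocity predicted by Hubble's Law, in km/s."""
    return H0 * distance_mpc

def peculiar_velocity(measured_kms, distance_mpc):
    """Leftover velocity after subtracting the Hubble flow, in km/s."""
    return measured_kms - hubble_velocity(distance_mpc)

# A hypothetical cluster at 100 Mpc measured receding at 7,700 km/s:
v_pec = peculiar_velocity(7700.0, 100.0)
print(v_pec)  # 700.0 -- within the 600-1000 km/s range reported for the flow
```

The study's claim is that this leftover component points in a common direction for many clusters, rather than averaging out to zero as standard models expect.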

The clusters appear to be moving along a line extending from our solar system toward Centaurus/Hydra, but the direction of this motion is less certain. Evidence indicates that the clusters are headed outward along this path, away from Earth, but the team cannot yet rule out the opposite flow.

“We detect motion along this axis, but right now our data cannot state as strongly as we’d like whether the clusters are coming or going,” Kashlinsky said.

The unexplained motion has hundreds of millions of stars dashing toward a certain part of the sky at more than 800 kilometers per second. That speed is not remarkable in cosmic terms, but the preferred direction certainly is: most cosmological models have things moving equally in all directions at the extreme edges of the universe. Nothing that could make objects head for one specific spot on such a massive scale had been imagined before. Keeping to the proven astrophysical strategy of calling anything they don’t understand “dark”, the scientists have termed the odd motion a “dark flow”.

A black hole can’t explain the observations – objects would accelerate into the hole, while the NASA scientists see constant motion over a vast expanse of a billion light-years. You have no idea how big that is. This is giant on a scale where it’s not just that we can’t see what’s doing it; it’s that the entire makeup of the universe as we understand it can’t be right if this is happening.

The hot X-ray-emitting gas within a galaxy cluster scatters photons from the cosmic microwave background (CMB). Because galaxy clusters don’t precisely follow the expansion of space, the wavelengths of scattered photons change in a way that reflects each cluster’s individual motion.

This results in a minute shift of the microwave background’s temperature in the cluster’s direction. The change, which astronomers call the kinematic Sunyaev-Zel’dovich (KSZ) effect, is so small that it has never been observed in a single galaxy cluster.
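A rough sense of why the KSZ signal is so hard to see in a single cluster: to first order the temperature shift is ΔT = −T_CMB · τ · (v/c), where τ is the cluster's optical depth to CMB photons. The optical depth and velocity below are assumed, illustrative values, not numbers from the study:

```python
# First-order kinematic Sunyaev-Zel'dovich (KSZ) temperature shift:
#   dT = -T_cmb * tau * (v_r / c)
# tau (optical depth through the cluster's hot gas) and v_r (line-of-sight
# peculiar velocity) below are illustrative assumptions.

T_CMB = 2.725      # CMB temperature, kelvin
C = 299_792.458    # speed of light, km/s

def ksz_shift_uK(tau, v_r_kms):
    """KSZ temperature shift in microkelvin (negative for a cluster
    moving away from us)."""
    return -T_CMB * tau * (v_r_kms / C) * 1e6

# A cluster receding at 600 km/s with an assumed tau of 0.005:
print(f"{ksz_shift_uK(0.005, 600.0):.1f} microK")  # about -27 microK
```

A shift of tens of microkelvin against a 2.725 K background is buried in noise for any one cluster, which is why the method described below averages over hundreds of them.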

But in 2000, Kashlinsky, working with Fernando Atrio-Barandela at the University of Salamanca, Spain, demonstrated that it was possible to tease the subtle signal out of the measurement noise by studying large numbers of clusters.

In 2008, armed with a catalog of 700 clusters assembled by Harald Ebeling at the University of Hawaii and Dale Kocevski, now at the University of California, Santa Cruz, the researchers applied the technique to the three-year WMAP data release. That’s when the mystery motion first came to light.

The new study builds on the previous one by using the five-year results from WMAP and by doubling the number of galaxy clusters.

“It takes, on average, about an hour of telescope time to measure the distance to each cluster we work with, not to mention the years required to find these systems in the first place,” Ebeling said. “This is a project requiring considerable follow-through.”

According to Atrio-Barandela, who has focused on understanding the possible errors in the team’s analysis, the new study provides much stronger evidence that the dark flow is real. For example, the brightest clusters at X-ray wavelengths hold the greatest amount of hot gas to distort CMB photons. “When processed, these same clusters also display the strongest KSZ signature — unlikely if the dark flow were merely a statistical fluke,” he said.

In addition, the team, which now also includes Alastair Edge at the University of Durham, England, sorted the cluster catalog into four “slices” representing different distance ranges. They then examined the preferred flow direction for the clusters within each slice. While the magnitude and exact direction of the flow vary somewhat from slice to slice, the overall trends show remarkable agreement.

The researchers are currently working to expand their cluster catalog in order to track the dark flow to about twice the current distance. Improved modeling of hot gas within the galaxy clusters will help refine the speed, axis, and direction of motion.

Future plans call for testing the findings against newer data released from the WMAP project and the European Space Agency’s Planck mission, which is also currently mapping the microwave background.

Which is fantastic! Such discoveries force a whole new set of ideas onto the table which, even if they turn out to be wrong, are the greatest ways to advance science and our understanding of everything. One explanation that’s already been offered is that our universe underwent a period of hyper-inflation early in its existence, and everything we think of as the vast and infinite universe is actually a small corner under the sofa of the real expanse of reality. Which would be an amazing, if humbling, discovery.

The image at the top of the page shows the most distant object we have ever observed with high confidence, according to Wei Zheng, the lead astronomer of the team at Johns Hopkins University that noticed the galaxy in multiple images from both the Hubble and Spitzer space telescopes. Its light took 13.2 billion years to reach Earth, so we are seeing the galaxy as it was when it was very young.

http://www.dailygalaxy.com/my_weblog/2013/10/is-our-universe-one-of-billions-new-planck-data-has-anomalies-caused-by-unknown-gravitational-pull-t.html

New research shows that sleep functions to allow the brain to eliminate toxins that accumulate while we are awake


While the brain sleeps, it clears out harmful toxins, a process that may reduce the risk of Alzheimer’s, researchers say.

During sleep, the flow of cerebrospinal fluid in the brain increases dramatically, washing away harmful waste proteins that build up between brain cells during waking hours, a study of mice found.

“It’s like a dishwasher,” says Dr. Maiken Nedergaard, a professor of neurosurgery at the University of Rochester and an author of the study in Science.

The results appear to offer the best explanation yet of why animals and people need sleep. If this proves to be true in humans as well, it could help explain a mysterious association between sleep disorders and brain diseases, including Alzheimer’s.

Nedergaard and a team of scientists discovered the cleaning process while studying the brains of sleeping mice. The scientists noticed that during sleep, the system that circulates cerebrospinal fluid through the brain and nervous system was “pumping fluid into the brain and removing fluid from the brain in a very rapid pace,” Nedergaard says.

The team discovered that this increased flow was possible in part because when mice went to sleep, their brain cells actually shrank, making it easier for fluid to circulate. When an animal woke up, the brain cells enlarged again and the flow between cells slowed to a trickle. “It’s almost like opening and closing a faucet,” Nedergaard says. “It’s that dramatic.”

Nedergaard’s team, which is funded by the National Institute of Neurological Disorders and Stroke, had previously shown that this fluid was carrying away waste products that build up in the spaces between brain cells.

The process is important because what’s getting washed away during sleep are waste proteins that are toxic to brain cells, Nedergaard says. This could explain why we don’t think clearly after a sleepless night and why a prolonged lack of sleep can actually kill an animal or a person, she says.

So why doesn’t the brain do this sort of housekeeping all the time? Nedergaard thinks it’s because cleaning takes a lot of energy. “It’s probably not possible for the brain to both clean itself and at the same time [be] aware of the surroundings and talk and move and so on,” she says.

The brain-cleaning process has been observed in rats and baboons, but not yet in humans, Nedergaard says. Even so, it could offer a new way of understanding human brain diseases including Alzheimer’s. That’s because one of the waste products removed from the brain during sleep is beta amyloid, the substance that forms sticky plaques associated with the disease.

That’s probably not a coincidence, Nedergaard says. “Isn’t it interesting that Alzheimer’s and all other diseases associated with dementia, they are linked to sleep disorders,” she says.

Researchers who study Alzheimer’s say Nedergaard’s research could help explain a number of recent findings related to sleep. One of these involves how sleep affects levels of beta amyloid, says Randall Bateman, a professor of neurology at Washington University in St. Louis who wasn’t involved in the study.

“Beta amyloid concentrations continue to increase while a person is awake,” Bateman says. “And then after people go to sleep that concentration of beta amyloid decreases. This report provides a beautiful mechanism by which this may be happening.”

The report also offers a tantalizing hint of a new approach to Alzheimer’s prevention, Bateman says. “It does raise the possibility that one might be able to actually control sleep in a way to improve the clearance of beta amyloid and help prevent amyloidosis that we think can lead to Alzheimer’s disease.”

http://www.npr.org/blogs/health/2013/10/17/236211811/brains-sweep-themselves-clean-of-toxins-during-sleep

http://m.sciencemag.org/content/342/6156/373.abstract

Thanks to Kebmodee for bringing this to the It’s Interesting community.

Ballet dancers reduce their dizziness by shrinking part of their brains


A team from Imperial College London said dancers appear to suppress signals from the inner ear to the brain.

Dancers traditionally use a technique called “spotting”, which minimises head movement.

The researchers say their findings may help patients who experience chronic dizziness.

Dizziness is the feeling of movement when, in reality, you are still.

For most it is an occasional, temporary sensation. But around one person in four experiences chronic dizziness at some point in their life.

When someone turns or spins around rapidly, fluid moves through the vestibular organs of the inner ear, where tiny hairs sense the motion.

Once they stop, the fluid continues to move, which can make a person feel like they are still spinning.

Ballet dancers train hard to be able to spin, or pirouette, rapidly and repeatedly.

They use a technique called spotting, focusing on a spot – as they spin, their head should be the last bit to move and the first to come back.

In the study, published in the journal Cerebral Cortex, the team recruited 29 female ballet dancers and 20 female rowers of similar age and fitness levels.

Each participant was spun around in a rotating chair. After the chair stopped, each was asked to turn a handle in time with how quickly they felt they were still spinning.

Eye reflexes triggered by input from the vestibular organs were also measured.

Magnetic resonance imaging (MRI) scans were also taken to look at participants’ brain structures.

Dancers’ perception of spinning lasted a shorter time than rowers’ – and the more experienced the dancers, the greater the effect.

The scans showed differences between the dancers and the rowers in two parts of the brain: the cerebellum, which is where sensory input from the vestibular organs is processed, and the cerebral cortex, which perceives dizziness.

The team also found that perception of spinning closely matched the eye reflexes triggered by vestibular signals in the rowers, but in dancers there was no such link.

Dr Barry Seemungal, of the department of medicine at Imperial College London, who led the research, said: “It’s not useful for a ballet dancer to feel dizzy or off balance. Their brains adapt over years of training to suppress that input.

“Consequently, the signal going to the brain areas responsible for perception of dizziness in the cerebral cortex is reduced, making dancers resistant to feeling dizzy.”

He added: “If we can target that same brain area or monitor it in patients with chronic dizziness, we can begin to understand how to treat them better.”

Deborah Bull, a former principal dancer with the Royal Ballet, who is now the executive director of the Cultural Institute at King’s College, London, told BBC Radio 4’s Today programme: “What’s really interesting is what ballet dancers have done is refine and make precise the instruction to the brain so that actually the brain has shrunk. We don’t need all those extra neurons.”

http://www.bbc.co.uk/news/health-24283709

New Solar Plant in Arizona Powers 70,000 Homes Day Or Night


Outside Phoenix, Ariz., on Wednesday, a power company turned on one of the largest solar power plants of its kind in the world. But unlike other solar farms, this plant keeps supplying power to 70,000 Arizona households long after sunset.

The Solana plant uses 3,200 mirrors, tilted so they focus the sun’s rays to heat a specially designed oil. The hot oil boils water, which drives turbines and generates electricity. Or the oil can heat giant tanks of salt, which soak up the energy. When the sun goes down, or when households need more power, the hot salt tanks reheat the oil, which again boils water to drive the turbines.

Whereas conventional solar panels provide power only when the sun is up, these giant salt batteries deliver renewable energy on demand. They can store six hours’ worth of energy, enough to meet the demands of Arizona customers, according to months of test data.
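A back-of-envelope sketch of what "six hours' worth of energy" for 70,000 homes implies for the salt tanks. The average household demand below is an assumed figure; the article gives only the home count and the storage duration:

```python
# Rough storage arithmetic. The per-home demand is an assumption for
# illustration, not a number from the article.

HOMES = 70_000
AVG_HOME_DEMAND_KW = 1.5   # assumed average household draw, kW
STORAGE_HOURS = 6

demand_mw = HOMES * AVG_HOME_DEMAND_KW / 1000   # total demand served, MW
stored_mwh = demand_mw * STORAGE_HOURS          # energy the tanks must hold

print(f"Demand: {demand_mw:.0f} MW, storage: {stored_mwh:.0f} MWh")
```

Under that assumption the tanks would need to hold on the order of hundreds of megawatt-hours of heat, which is why molten salt, cheap and dense as a thermal store, is the medium of choice.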

“That’s the sort of thing you can do with a conventional gas plant that no one had envisioned doing with renewables,” says Patrick Dinkel, vice president of resource management for Arizona Public Service, which is Arizona’s largest utility company.

The company has already bought the power from this plant for the next 30 years, to help meet the state’s goal of generating 15 percent of its energy from renewable sources by 2025. The plant does mean higher energy bills for APS customers — an extra $1.28 per month for the first five years, $1.09 per month for the next five, and then 94 cents per month after that, according to the company. Dinkel says the state won’t see many more of these plants soon because they would cost too much.
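The quoted surcharge schedule is easy to tally. A minimal sketch using only the monthly figures stated by the company:

```python
# Cumulative per-customer surcharge from the quoted APS schedule:
# $1.28/month for years 1-5, $1.09/month for years 6-10,
# then $0.94/month thereafter.

def surcharge_total(years):
    """Total surcharge in dollars after the given number of years."""
    total = 0.0
    for year in range(1, years + 1):
        if year <= 5:
            monthly = 1.28
        elif year <= 10:
            monthly = 1.09
        else:
            monthly = 0.94
        total += monthly * 12
    return total

print(f"After 10 years: ${surcharge_total(10):.2f}")  # $142.20
print(f"After 30 years: ${surcharge_total(30):.2f}")  # $367.80
```

So over the full 30-year purchase agreement, the plant costs a typical customer under $400 in total surcharges.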

“Right now natural gas wins that race (for cheap power),” Dinkel says. “The challenge is no one knows what those economics look like in five years.”

The U.S. Department of Energy lent Abengoa Solar, the Spanish company that built that plant as well as Europe’s first solar thermal power plant, $1.4 billion, out of the $2 billion price tag. It’s the same program that financed Solyndra, a solar panel firm that went bankrupt in 2011. But this is a different kind of investment, says Armando Zuluaga, general manager of Abengoa Solar. He points out the company already has a public utility buying their output for the next 30 years, so the government will get its money back with interest.

“There’s no market risk here,” Zuluaga says. “It’s just about getting the plant built.”

This won’t be the last we hear of Abengoa Solar and this technology. The company is building a similar, though smaller plant in the Mojave desert in California, which will come online next year, as well as plants in South Africa.

http://www.npr.org/blogs/thetwo-way/2013/10/11/232348077/in-ariz-a-solar-plant-that-powers-70-000-homes-day-or-night

Thanks to Ray Gaudette for bringing this to the attention of the It’s Interesting community.

How Exercise Beefs Up the Brain


New research explains how abstract benefits of exercise—from reversing depression to fighting cognitive decline—might arise from a group of key molecules.

While our muscles pump iron, our cells pump out something else: molecules that help maintain a healthy brain. But scientists have struggled to account for the well-known mental benefits of exercise, from counteracting depression and aging to fighting Alzheimer’s and Parkinson’s disease. Now, a research team may have finally found a molecular link between a workout and a healthy brain.

Much exercise research focuses on the parts of our body that do the heavy lifting. Muscle cells ramp up production of a protein called FNDC5 during a workout. A fragment of this protein, known as irisin, gets lopped off and released into the bloodstream, where it drives the formation of brown fat cells, thought to protect against diseases such as diabetes and obesity. (White fat cells are traditionally the villains.)

While studying the effects of FNDC5 in muscles, cellular biologist Bruce Spiegelman of Harvard Medical School in Boston happened upon some startling results: Mice that did not produce a so-called co-activator of FNDC5 production, known as PGC-1α, were hyperactive and had tiny holes in certain parts of their brains. Other studies showed that FNDC5 and PGC-1α are present in the brain, not just the muscles, and that both might play a role in the development of neurons.

Spiegelman and his colleagues suspected that FNDC5 (and the irisin created from it) was responsible for exercise-induced benefits to the brain—in particular, increased levels of a crucial protein called brain-derived neurotrophic factor (BDNF), which is essential for maintaining healthy neurons and creating new ones. These functions are crucial to staving off neurological diseases, including Alzheimer’s and Parkinson’s. And the link between exercise and BDNF is widely accepted. “The phenomenon has been established over the course of, easily, the last decade,” says neuroscientist Barbara Hempstead of Weill Cornell Medical College in New York City, who was not involved in the new work. “It’s just, we didn’t understand the mechanism.”

To sort out that mechanism, Spiegelman and his colleagues performed a series of experiments in living mice and cultured mouse brain cells. First, they put mice on a 30-day endurance training regimen. They didn’t have to coerce their subjects, because running is part of a mouse’s natural foraging behavior. “It’s harder to get them to lift weights,” Spiegelman notes. The mice with access to a running wheel ran the equivalent of a 5K every night.

Aside from physical differences between wheel-trained mice and sedentary ones—“they just look a little bit more like a couch potato,” says co-author Christiane Wrann, also of Harvard Medical School, of the latter’s plumper figures—the groups also showed neurological differences. The runners had more FNDC5 in their hippocampus, an area of the brain responsible for learning and memory.

Using mouse brain cells developing in a dish, the group next showed that increasing the levels of the co-activator PGC-1α boosts FNDC5 production, which in turn drives BDNF genes to produce more of the vital neuron-forming BDNF protein. They report these results online today in Cell Metabolism. Spiegelman says it was surprising to find that the molecular process in neurons mirrors what happens in muscles as we exercise. “What was weird is the same pathway is induced in the brain,” he says, “and as you know, with exercise, the brain does not move.”

So how is the brain getting the signal to make BDNF? Some have theorized that neural activity during exercise (as we coordinate our body movements, for example) accounts for changes in the brain. But it’s also possible that factors outside the brain, like those proteins secreted from muscle cells, are the driving force. To test whether irisin created elsewhere in the body can still drive BDNF production in the brain, the group injected a virus into the mouse’s bloodstream that causes the liver to produce and secrete elevated levels of irisin. They saw the same effect as in exercise: increased BDNF levels in the hippocampus. This suggests that irisin could be capable of passing the blood-brain barrier, or that it regulates some other (unknown) molecule that crosses into the brain, Spiegelman says.

Hempstead calls the findings “very exciting,” and believes this research finally begins to explain how exercise relates to BDNF and other so-called neurotrophins that keep the brain healthy. “I think it answers the question that most of us have posed in our own heads for many years.”

The effect of liver-produced irisin on the brain is a “pretty cool and somewhat surprising finding,” says Pontus Boström, a diabetes researcher at the Karolinska Institute in Sweden. But Boström, who was among the first scientists to identify irisin in muscle tissue, says the work doesn’t answer a fundamental question: How much of exercise’s BDNF-promoting effects come from irisin reaching the brain from muscle cells via the bloodstream, and how much are from irisin created in the brain?

Though the authors point out that other important regulator proteins likely play a role in driving BDNF and other brain-nourishing factors, they are focusing on the benefits of irisin and hope to develop an injectable form of FNDC5 as a potential treatment for neurological diseases and to improve brain health with aging.

http://news.sciencemag.org/biology/2013/10/how-exercise-beefs-brain

Thanks to Dr. Rajadhyaksha for bringing this to the attention of the It’s Interesting community.

Graduate student frozen out of research in Antarctica because of U.S. government shutdown


Time on his hands. Sebastian Vivancos (inset) is part of the newly arrived team whose planned research activities at the U.S. Palmer Station in Antarctica are being thwarted by the government shutdown.

After 5 years as a lieutenant in the U.S. Coast Guard, Jamie Collins knows what it’s like to be at sea. But nothing in his military service prepared him for his current 30,000-km scientific round trip to nowhere, courtesy of the failure of the U.S. Congress to approve a budget. His predicament is one of the stranger—and sadder—tales of how the government-wide shutdown is affecting researchers.

Collins, a third-year graduate student in chemical oceanography, arrived Wednesday at the National Science Foundation’s (NSF’s) Palmer Station in Antarctica. He was eager to begin working on a long-running ecological research project funded by NSF and to start collecting data for his dissertation in a graduate program run jointly by the Massachusetts Institute of Technology and the Woods Hole Oceanographic Institution. But the rough seas he encountered during his 4-day crossing of the notorious Drake Passage in the south Atlantic—the final leg of a journey that began in Boston—paled in comparison to the storm he encountered once he stepped off the Laurence M. Gould, a U.S. icebreaking research vessel that ferries scientists and supplies between Punta Arenas, Chile, and the west Antarctic Peninsula.

On Tuesday, NSF had announced that its contractor for Antarctic logistical support, Lockheed Martin, would begin putting the three U.S. stations on “caretaker” status unless Congress passed an appropriations bill to continue funding the government by 14 October. Although legislators will eventually adopt such a bill, nobody expects them to act in the next few days. Without an appropriation, NSF has no money to operate the stations.

For Collins, that announcement meant his plans for an intensive 5-month research regime had suddenly melted away. “The station manager told us not to unpack our stuff and to stay on the ship,” he says in a phone call to ScienceInsider from the ship. “She said we were to wait here for a week while they prepare to shut down the station. Then we’d sail back to Chile, and go home.”

Collins was stunned. “I had spent all summer preparing for this trip,” he says. He had filled three pallets with supplies for his experiments on how algae in the region detect and react to the presence of ultraviolet radiation, part of a larger effort to understand the role that bacteria play in sequestering carbon in the Southern Ocean. “Without the data from those experiments, I may have to reevaluate what to do for my Ph.D.,” he adds.

Collins was also part of the first wave of students arriving at Palmer this season to work on a research project, begun in 1990, that explores how the extent of annual sea ice affects the polar biota. The project is one of 26 so-called LTER (Long Term Ecological Research) sites around the world that NSF supports. He was scheduled to divide his time at Palmer between his own research and monitoring penguin colonies on several offshore islands as part of the LTER project. And he had signed up for a 6-week research cruise aboard the Gould that supplements the land-based LTER observations with oceanographic data collected up and down the peninsula.

Despite the jarring news, the 31-year-old Collins says that he is more worried about what it may mean to some of his younger colleagues with less worldly experience. “I spent 5 years in the military and I’m used to dealing with bureaucracy,” he explains. “And nothing that happens here is going to deter me from pursuing my goal of a career in science. But for some of the undergraduates on the trip, this is their first taste of what Congress thinks about the value of scientific research. And it’s sending them a pretty horrific message.”

http://news.sciencemag.org/people-events/2013/10/tales-shutdown-grad-student-frozen-out-research-antarctica

Thanks to Dr. Rajadhyaksha for bringing this to the attention of the It’s Interesting community.

World record solar cell with 44.7% efficiency


German Fraunhofer Institute for Solar Energy Systems, Soitec, CEA-Leti and the Helmholtz Center Berlin announced today that they have achieved a new world record for the conversion of sunlight into electricity using a new solar cell structure with four subcells. Surpassing the competition after just over three years of research, and entering the roadmap at world-class level, the cell reached a new record efficiency of 44.7%, measured at a concentration of 297 suns. This means that 44.7% of the solar spectrum’s energy, from the ultraviolet through to the infrared, is converted into electrical energy. This is a major step towards further reducing the cost of solar electricity and continues to pave the way to the 50% efficiency roadmap.
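To put "44.7% at 297 suns" in power terms: assuming the standard one-sun reference irradiance of 1000 W/m² (a testing convention, not a figure from the announcement), the output per unit cell area works out as follows:

```python
# Power density implied by the record figures. The one-sun irradiance
# is the standard reference value, assumed here for illustration.

ONE_SUN_W_M2 = 1000.0   # standard reference irradiance, W/m^2
CONCENTRATION = 297     # suns, as measured for the record
EFFICIENCY = 0.447      # the record conversion efficiency

incident = ONE_SUN_W_M2 * CONCENTRATION   # light hitting each m^2 of cell
electrical = incident * EFFICIENCY        # electrical output per m^2 of cell

print(f"Incident: {incident / 1000:.0f} kW/m^2, "
      f"electrical: {electrical / 1000:.1f} kW/m^2")
```

Concentrating the light lets a tiny, expensive multi-junction cell process hundreds of times its own area's worth of sunlight, which is the economic point of CPV.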

Back in May 2013, the German-French team of Fraunhofer ISE, Soitec, CEA-Leti and the Helmholtz Center Berlin had already announced a solar cell with 43.6% efficiency. Building on this result, further intensive research work and optimization steps led to the present efficiency of 44.7%.

These solar cells are used in concentrator photovoltaics (CPV), a technology that achieves more than twice the efficiency of conventional PV power plants in sun-rich locations. So-called III-V multi-junction solar cells, which originally came from space technology, have proven the best route to the highest efficiencies for converting sunlight to electricity on the ground. In such a multi-junction solar cell, several cells made of different III-V semiconductor materials are stacked on top of each other, with each subcell absorbing a different wavelength range of the solar spectrum.

“We are incredibly proud of our team which has been working now for three years on this four-junction solar cell,” says Frank Dimroth, Department Head and Project Leader in charge of this development work at Fraunhofer ISE. “This four-junction solar cell contains our collected expertise in this area over many years. Besides improved materials and optimization of the structure, a new procedure called wafer bonding plays a central role. With this technology, we are able to connect two semiconductor crystals, which otherwise cannot be grown on top of each other with high crystal quality. In this way we can produce the optimal semiconductor combination to create the highest efficiency solar cells.”

“This world record increasing our efficiency level by more than 1 point in less than 4 months demonstrates the extreme potential of our four-junction solar cell design which relies on Soitec bonding techniques and expertise,” says André-Jacques Auberton-Hervé, Soitec’s Chairman and CEO. “It confirms the acceleration of the roadmap towards higher efficiencies which represents a key contributor to competitiveness of our own CPV systems. We are very proud of this achievement, a demonstration of a very successful collaboration.”

“This new record value reinforces the credibility of the direct semiconductor bonding approaches that are developed in the frame of our collaboration with Soitec and Fraunhofer ISE. We are very proud of this new result, confirming the broad path that exists in solar technologies for advanced III-V semiconductor processing,” said Leti CEO Laurent Malier.

Concentrator modules are produced by Soitec (started in 2005 under the name Concentrix Solar, a spin-off of Fraunhofer ISE). This particularly efficient technology is employed in solar power plants located in sun-rich regions with a high percentage of direct radiation. Presently Soitec has CPV installations in 18 different countries and regions, including Italy, France, South Africa and California.

http://phys.org/news/2013-09-world-solar-cell-efficiency.html

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

Scientists create never-before-seen form of matter

matter

Harvard and MIT scientists are challenging the conventional wisdom about light, and they didn’t need to go to a galaxy far, far away to do it.

Working with colleagues at the Harvard-MIT Center for Ultracold Atoms, a group led by Harvard Professor of Physics Mikhail Lukin and MIT Professor of Physics Vladan Vuletic has managed to coax photons into binding together to form molecules – a state of matter that, until recently, had been purely theoretical. The work is described in a September 25 paper in Nature.

The discovery, Lukin said, runs contrary to decades of accepted wisdom about the nature of light. Photons have long been described as massless particles which don’t interact with each other – shine two laser beams at each other, he said, and they simply pass through one another.

“Photonic molecules,” however, behave less like traditional lasers and more like something you might find in science fiction – the light saber.

“Most of the properties of light we know about originate from the fact that photons are massless, and that they do not interact with each other,” Lukin said. “What we have done is create a special type of medium in which photons interact with each other so strongly that they begin to act as though they have mass, and they bind together to form molecules. This type of photonic bound state has been discussed theoretically for quite a while, but until now it hadn’t been observed.

“It’s not an in-apt analogy to compare this to light sabers,” Lukin added. “When these photons interact with each other, they’re pushing against and deflecting each other. The physics of what’s happening in these molecules is similar to what we see in the movies.”

To get the normally massless photons to bind to each other, Lukin and colleagues, including Harvard postdoctoral fellow Ofer Firstenberg, former Harvard doctoral student Alexey Gorshkov and MIT graduate students Thibault Peyronel and Qi-Yu Liang, couldn’t rely on something like the Force – they instead turned to a set of more extreme conditions.

Researchers began by pumping rubidium atoms into a vacuum chamber, then used lasers to cool the cloud of atoms to just a few degrees above absolute zero. Using extremely weak laser pulses, they then fired single photons into the cloud of atoms.

As a photon enters the cloud of cold atoms, Lukin said, its energy excites atoms along its path, causing the photon to slow dramatically. As the photon moves through the cloud, that energy is handed off from atom to atom, and eventually exits the cloud with the photon.

“When the photon exits the medium, its identity is preserved,” Lukin said. “It’s the same effect we see with refraction of light in a water glass. The light enters the water, it hands off part of its energy to the medium, and inside it exists as light and matter coupled together, but when it exits, it’s still light. The process that takes place is the same; it’s just a bit more extreme – the light is slowed considerably, and a lot more energy is given away than during refraction.”

When Lukin and colleagues fired two photons into the cloud, they were surprised to see them exit together, as a single molecule.

The reason they form the never-before-seen molecules?

The answer, Lukin said, is an effect called a Rydberg blockade, which dictates that when an atom is excited, nearby atoms cannot be excited to the same degree. In practice, the effect means that as two photons enter the atomic cloud, the first excites an atom, but must move forward before the second photon can excite nearby atoms.

The result, he said, is that the two photons push and pull each other through the cloud as their energy is handed off from one atom to the next.

“It’s a photonic interaction that’s mediated by the atomic interaction,” Lukin said. “That makes these two photons behave like a molecule, and when they exit the medium they’re much more likely to do so together than as single photons.”

While the effect is unusual, it does have some practical applications as well.

“We do this for fun, and because we’re pushing the frontiers of science,” Lukin said. “But it feeds into the bigger picture of what we’re doing because photons remain the best possible means to carry quantum information. The handicap, though, has been that photons don’t interact with each other.”

To build a quantum computer, he explained, researchers need to build a system that can preserve quantum information, and process it using quantum logic operations. The challenge, however, is that quantum logic requires interactions between individual quanta so that quantum systems can be switched to perform information processing.

“What we demonstrate with this process allows us to do that,” Lukin said. “Before we make a useful, practical quantum switch or photonic logic gate we have to improve the performance, so it’s still at the proof-of-concept level, but this is an important step. The physical principles we’ve established here are important.”

The system could even be useful in classical computing, Lukin said, considering the power-dissipation challenges chip-makers now face. A number of companies – including IBM – have worked to develop systems that rely on optical routers that convert light signals into electrical signals, but those systems face their own hurdles.

Lukin also suggested that the system might one day even be used to create complex three-dimensional structures – such as crystals – wholly out of light.

“What it will be useful for we don’t know yet, but it’s a new state of matter, so we are hopeful that new applications may emerge as we continue to investigate these photonic molecules’ properties,” he said.

http://phys.org/news/2013-09-scientists-never-before-seen.html

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

First mechanical gear discovered in a living creature

mechanical gear 2

mechanical gear 1

With two diminutive legs locked into a leap-ready position, the tiny jumper bends its body taut like an archer drawing a bow. At the top of its legs, a minuscule pair of gears engage—their strange, shark-fin teeth interlocking cleanly like a zipper. And then, faster than you can blink, think, or see with the naked eye, the entire thing is gone. In 2 milliseconds it has bulleted skyward, accelerating at nearly 400 g’s—a rate more than 20 times what a human body can withstand. At top speed the jumper breaks 8 mph—quite a feat considering its body is less than one-tenth of an inch long.
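The quoted figures hang together once you note that the ~400 g number is a peak, not an average: a quick kinematic check (a sketch, using the article's 8 mph top speed and 2 ms launch time) gives the average acceleration, which is necessarily lower than the peak since the push is not constant:

```python
# Back-of-envelope check on the quoted jump figures: 8 mph top speed
# reached over a ~2 ms launch. Conversion factor 0.44704 m/s per mph
# is exact; g taken as 9.81 m/s^2.

MPH_TO_MS = 0.44704
G = 9.81  # m/s^2

top_speed = 8 * MPH_TO_MS            # ~3.58 m/s
launch_time = 2e-3                   # 2 milliseconds
avg_accel_g = top_speed / launch_time / G

print(f"average acceleration ~ {avg_accel_g:.0f} g")
# ~180 g averaged over the launch; the article's "nearly 400 g"
# figure is the peak, consistent with a non-constant push.
```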

This miniature marvel is an adolescent issus, a kind of planthopper insect and one of the fastest accelerators in the animal kingdom. As a duo of researchers in the U.K. reported recently in the journal Science, the issus is also the first living creature ever discovered to sport a functioning gear. “Jumping is one of the most rapid and powerful things an animal can do,” says Malcolm Burrows, a zoologist at the University of Cambridge and the lead author of the paper, “and that leads to all sorts of crazy specializations.”

The researchers believe that the issus—which lives chiefly on European climbing ivy—evolved its acrobatic prowess because it needs to flee dangerous situations. Although they’re not exactly sure if the rapid jump evolved to escape hungry birds, parasitizing wasps, or the careless mouths of large grazing animals, “there’s been enormous evolutionary pressure to become faster and faster, and jump further and further away,” Burrows says. But gaining this high acceleration has put incredible demands on the reaction time of the insect’s body parts, and that’s where the gears—which “you can imagine being at the top of the thigh bone in a human,” Burrows says—come in.

“As the legs unfurl to power the jump,” Burrows says, “both have to move at exactly the same time. If they didn’t, the animal would start to spiral out of control.” Larger animals, whether kangaroos or NBA players, rely on their nervous system to keep their legs in sync when pushing off to jump—using a constant loop of adjustment and feedback. But the issus’s legs outpace its nervous system. By the time the insect has sent a signal from its legs to its brain and back again, roughly 5 or 6 milliseconds, the launch has long since happened. Instead, the gears, which engage before the jump, let the issus lock its legs together—synchronizing their movements to a precision of 1/300,000 of a second.
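Putting the article's three timescales side by side makes the argument concrete: the neural feedback loop is slower than the entire jump, while the mechanical gears couple the legs far more tightly than either. A minimal sketch of that comparison:

```python
# Timescales from the article, in seconds: why neural feedback cannot
# synchronize the legs, but mechanical gears can.

neural_loop = 5.5e-3      # legs-to-brain-and-back signal, ~5-6 ms
launch      = 2e-3        # the whole jump lasts ~2 ms
gear_sync   = 1 / 300_000 # gear-enforced leg synchrony, ~3.3 microseconds

# The jump is over before any nervous-system correction could arrive...
assert neural_loop > launch

# ...while the gears hold the legs in step at a timescale 600x finer
# than the launch window itself.
print(f"gear sync is {launch / gear_sync:.0f}x finer than the launch window")
```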

The gears themselves are an oddity. With gear teeth shaped like cresting waves, they look nothing like what you’d find in your car or in a fancy watch. (The style that you’re most likely familiar with is called an involute gear, and it was designed by the Swiss mathematician Leonhard Euler in the 18th century.) There could be two reasons for this. Through a mathematical oddity, there is a limitless number of ways to design intermeshing gears. So, either nature evolved one solution at random, or, as Gregory Sutton, coauthor of the paper and insect researcher at the University of Bristol, suspects, the shape of the issus’s gear is particularly apt for the job it does. It’s built for “high precision and speed in one direction,” he says. “It’s a prototype for a new type of gear.”

Another odd thing about this discovery is that although there are many jumping insects like the issus—including ones that are even faster and better jumpers—the issus is apparently the only one with natural gears. Most other bugs synchronize the quick jolt of their leaping legs through friction, using bumpy or grippy surfaces to press the tops of their legs together, says Duke University biomechanics expert Steve Vogel, who was not involved in this study. Like gears, this ensures the legs move at the same rate, but without requiring a complicated interlocking mechanism. “There are a lot of friction pads around, and they accomplish pretty much the same thing,” he says. “So I wonder what extra capacity these gears confer. They’re rather specialized, and there are lots of other jumpers that don’t have them, so there must be some kind of advantage.”

Even stranger is that the issus doesn’t keep these gears throughout its life cycle. As the adolescent insect grows, it molts half a dozen times, upgrading its exoskeleton (gears included) for larger and larger versions. But after its final molt into adulthood—poof, the gears are gone. The adult syncs its legs by friction like all the other planthoppers. “I’m gobsmacked,” says Sutton. “We have a hypothesis as to why this is the case, but we can’t tell you for sure.”

Their idea: If one of the gear teeth were to slip and break in an adult (the researchers observed this in adolescent bugs), its jumping ability would be hindered forever. With no more molts, it would have no chance to grow more gears. And with every bound, “the whole system might slip, accelerating damage to the rest of the gear teeth,” Sutton says. “Just like if your car has a gear train missing a tooth. Every time you get to that missing tooth, the gear train jerks.”

Read more: http://www.popularmechanics.com/science/environment/the-first-gear-discovered-in-nature-15916433?click=pm_latest

Thanks to Jody Troupe for bringing this to the attention of the It’s Interesting community.

New study in the journal Science shows that poverty reduces brainpower needed for navigating other areas of life

poverty

Research based at Princeton University found that poverty and all its related concerns require so much mental energy that the poor have less remaining brainpower to devote to other areas of life. Experiments showed that the impact of financial concerns on the cognitive function of low-income individuals was similar to a 13-point dip in IQ, or the loss of an entire night’s sleep. To gauge the influence of poverty in natural contexts, the researchers tested 464 sugarcane farmers in India who rely on the annual harvest for at least 60 percent of their income. Each farmer performed better on common fluid-intelligence and cognition tests post-harvest compared to pre-harvest.

Poverty and all its related concerns require so much mental energy that the poor have less remaining brainpower to devote to other areas of life, according to research based at Princeton University. As a result, people of limited means are more likely to make mistakes and bad decisions that may be amplified by — and perpetuate — their financial woes.

Published in the journal Science, the study presents a unique perspective regarding the causes of persistent poverty. The researchers suggest that being poor may keep a person from concentrating on the very avenues that would lead them out of poverty. A person’s cognitive function is diminished by the constant and all-consuming effort of coping with the immediate effects of having little money, such as scrounging to pay bills and cut costs. Thus, a person is left with fewer “mental resources” to focus on complicated, indirectly related matters such as education, job training and even managing their time.

In a series of experiments, the researchers found that pressing financial concerns had an immediate impact on the ability of low-income individuals to perform on common cognitive and logic tests. On average, a person preoccupied with money problems exhibited a drop in cognitive function similar to a 13-point dip in IQ, or the loss of an entire night’s sleep.

But when their concerns were benign, low-income individuals performed competently, at a similar level to people who were well off, said corresponding author Jiaying Zhao, who conducted the study as a doctoral student in the lab of co-author Eldar Shafir, Princeton’s William Stewart Tod Professor of Psychology and Public Affairs. Zhao and Shafir worked with Anandi Mani, an associate professor of economics at the University of Warwick in Britain, and Sendhil Mullainathan, a Harvard University economics professor.

“These pressures create a salient concern in the mind and draw mental resources to the problem itself. That means we are unable to focus on other things in life that need our attention,” said Zhao, who is now an assistant professor of psychology at the University of British Columbia.

“Previous views of poverty have blamed poverty on personal failings, or an environment that is not conducive to success,” she said. “We’re arguing that the lack of financial resources itself can lead to impaired cognitive function. The very condition of not having enough can actually be a cause of poverty.”

The mental tax that poverty can put on the brain is distinct from stress, Shafir explained. Stress is a person’s response to various outside pressures that — according to studies of arousal and performance — can actually enhance a person’s functioning, he said. In the Science study, Shafir and his colleagues instead describe an immediate rather than chronic preoccupation with limited resources that can be a detriment to unrelated yet still important tasks.

“Stress itself doesn’t predict that people can’t perform well — they may do better up to a point,” Shafir said. “A person in poverty might be at the high part of the performance curve when it comes to a specific task and, in fact, we show that they do well on the problem at hand. But they don’t have leftover bandwidth to devote to other tasks. The poor are often highly effective at focusing on and dealing with pressing problems. It’s the other tasks where they perform poorly.”

The fallout of neglecting other areas of life may loom larger for a person just scraping by, Shafir said. Late fees tacked on to a forgotten rent payment, a job lost because of poor time-management — these make a tight money situation worse. And as people get poorer, they tend to make difficult and often costly decisions that further perpetuate their hardship, Shafir said. He and Mullainathan were co-authors on a 2012 Science paper reporting that poor people are more likely to engage in behaviors that reinforce the conditions of poverty, such as excessive borrowing.

“They can make the same mistakes, but the outcomes of errors are more dear,” Shafir said. “So, if you live in poverty, you’re more error prone and errors cost you more dearly — it’s hard to find a way out.”

The first set of experiments took place in a New Jersey mall between 2010 and 2011 with roughly 400 subjects chosen at random. Their median annual income was around $70,000 and the lowest income was around $20,000. The researchers created scenarios wherein subjects had to ponder how they would solve financial problems, for example, whether they would handle a sudden car repair by paying in full, borrowing money or putting the repairs off. Participants were assigned either an “easy” or “hard” scenario in which the cost was low or high — such as $150 or $1,500 for the car repair. While participants pondered these scenarios, they performed common fluid-intelligence and cognition tests.

Subjects were divided into a “poor” group and a “rich” group based on their income. The study showed that when the scenarios were easy — the financial problems not too severe — the poor and rich performed equally well on the cognitive tests. But when they thought about the hard scenarios, people at the lower end of the income scale performed significantly worse on both cognitive tests, while the rich participants were unfazed.
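The mall experiment is a 2×2 design (income group × scenario difficulty), and the key result is an interaction: the hard scenarios depress test scores only for the poorer group. A minimal sketch of that comparison, using entirely hypothetical mean scores chosen only to mirror the reported pattern (the paper's actual data are not reproduced here):

```python
# Illustrative 2x2 interaction check. All numbers are hypothetical;
# they only mimic the qualitative pattern the study reports: the
# "hard" financial scenario lowers scores for the poor group alone.

mean_scores = {
    ("rich", "easy"): 0.70, ("rich", "hard"): 0.69,
    ("poor", "easy"): 0.70, ("poor", "hard"): 0.55,
}

def difficulty_effect(group):
    """Drop in mean score when moving from the easy to the hard scenario."""
    return mean_scores[(group, "easy")] - mean_scores[(group, "hard")]

# The interaction: how much larger the difficulty cost is for the poor.
interaction = difficulty_effect("poor") - difficulty_effect("rich")
print(f"poor-group drop: {difficulty_effect('poor'):.2f}, "
      f"rich-group drop: {difficulty_effect('rich'):.2f}, "
      f"interaction: {interaction:.2f}")
```

A large positive interaction term, not a main effect of income alone, is what supports the study's claim that financial preoccupation itself taxes cognition.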

To better gauge the influence of poverty in natural contexts, between 2010 and 2011 the researchers also tested 464 sugarcane farmers in India who rely on the annual harvest for at least 60 percent of their income. Because sugarcane harvests occur once a year, these are farmers who find themselves rich after harvest and poor before it. Each farmer was given the same tests before and after the harvest, and performed better on both tests post-harvest compared to pre-harvest.

The cognitive effect of poverty the researchers found relates to the more general influence of “scarcity” on cognition, which is the larger focus of Shafir’s research group. Scarcity in this case relates to any deficit — be it in money, time, social ties or even calories — that people experience in trying to meet their needs. Scarcity consumes “mental bandwidth” that would otherwise go to other concerns in life, Zhao said.

“These findings fit in with our story of how scarcity captures attention. It consumes your mental bandwidth,” Zhao said. “Just asking a poor person to think about hypothetical financial problems reduces mental bandwidth. This is an acute, immediate impact, and has implications for scarcity of resources of any kind.”

“We documented similar effects among people who are not otherwise poor, but on whom we imposed scarce resources,” Shafir added. “It’s not about being a poor person — it’s about living in poverty.”

Many types of scarcity are temporary and often discretionary, said Shafir, who is co-author with Mullainathan of the book, “Scarcity: Why Having Too Little Means So Much,” to be published in September. For instance, a person pressed for time can reschedule appointments, cancel something or even decide to take on less.

“When you’re poor you can’t say, ‘I’ve had enough, I’m not going to be poor anymore.’ Or, ‘Forget it, I just won’t give my kids dinner, or pay rent this month.’ Poverty imposes a much stronger load that’s not optional and in very many cases is long lasting,” Shafir said. “It’s not a choice you’re making — you’re just reduced to few options. This is not something you see with many other types of scarcity.”

The researchers suggest that services for the poor should accommodate the dominance that poverty has on a person’s time and thinking. Such steps would include simpler aid forms and more guidance in receiving assistance, or training and educational programs structured to be more forgiving of unexpected absences, so that a person who has stumbled can more easily try again.

“You want to design a context that is more scarcity proof,” said Shafir, noting that better-off people have access to regular support in their daily lives, be it a computer reminder, a personal assistant, a housecleaner or a babysitter.

“There’s very little you can do with time to get more money, but a lot you can do with money to get more time,” Shafir said. “The poor, who our research suggests are bound to make more mistakes and pay more dearly for errors, inhabit contexts often not designed to help.”

The paper, “Poverty impedes cognitive function,” was published Aug. 30 by Science. The work was supported by the National Science Foundation (award number SES-0933497), the International Finance Corporation and the IFMR Trust in India.

http://www.princeton.edu/main/news/archive/S37/75/69M50/index.xml?section=topstories