Posts Tagged ‘physics’

by Matt Williams

People have dreamed of traveling to another star system since long before the first rockets and astronauts were sent to space. But despite all the progress made since the beginning of the Space Age, interstellar travel remains just that – a dream. While theoretical concepts have been proposed, the issues of cost, travel time and fuel remain highly problematic.

A lot of hopes currently hinge on the use of directed energy and lightsails to push tiny spacecraft to relativistic speeds. But what if there was a way to make larger spacecraft fast enough to conduct interstellar voyages? According to Prof. David Kipping, the leader of Columbia University’s Cool Worlds lab, future spacecraft could rely on a halo drive, which uses the gravitational force of a black hole to reach incredible speeds.

Prof. Kipping described this concept in a recent study that appeared online (the preprint is also available on the Cool Worlds website). In it, Kipping addressed one of the greatest challenges posed by space exploration, which is the sheer amount of time and energy it would take to send a spacecraft on a mission to explore beyond our solar system.

Kipping told Universe Today via email: “Interstellar travel is one of the most challenging technical feats we can conceive of. Whilst we can envisage drifting between the stars over millions of years – which is legitimately interstellar travel – to achieve journeys on timescales of centuries or less requires relativistic propulsion.”

As Kipping put it, relativistic propulsion (or accelerating to a fraction of the speed of light) is very expensive in terms of energy. Existing spacecraft simply don’t have the fuel capacity to get up to those kinds of speeds, and short of detonating nukes to generate thrust à la Project Orion, or building a fusion ramjet à la Project Daedalus, there are not a lot of options available.

In recent years, attention has shifted toward the idea of using lightsails and nanocraft to conduct interstellar missions. A well-known example is Breakthrough Starshot, an initiative that aims to send a smartphone-sized spacecraft to Alpha Centauri within our lifetime. Using a powerful laser array, the lightsail would be accelerated to speeds of up to 20 percent of the speed of light – thus making the trip in 20 years.

“But even here, you are talking about several terajoules of energy for the most minimalist (a gram-mass) spacecraft conceivable,” said Kipping. “That’s the cumulative energy output of nuclear power stations running for weeks on end… so this is why it’s hard.”
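Kipping's "several terajoules" figure can be sanity-checked with a back-of-envelope special-relativity calculation. The sketch below (assuming the gram-mass craft and 20-percent-of-light-speed target quoted above) computes the relativistic kinetic energy such a craft must be given:

```python
import math

def relativistic_ke(mass_kg, beta):
    """Relativistic kinetic energy (gamma - 1) * m * c^2 for speed beta = v/c."""
    c = 299_792_458.0  # speed of light, m/s
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return (gamma - 1.0) * mass_kg * c**2

# A gram-mass lightsail craft at 20% of light speed (the Starshot target):
ke = relativistic_ke(1e-3, 0.2)
print(f"{ke:.2e} J")  # ~1.9e12 J, i.e. a couple of terajoules
```

The result, roughly 1.9 terajoules before any inefficiency in the laser array is counted, agrees with the order of magnitude Kipping cites.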

To this, Kipping suggests a modified version of the “Dyson Slingshot,” an idea proposed by venerated theoretical physicist Freeman Dyson, the theorist behind the Dyson Sphere. In the 1963 book Interstellar Communications (Chapter 12: “Gravitational Machines”), Dyson described how spacecraft could slingshot around compact binary stars in order to receive a significant boost in velocity.

As Dyson described it, a ship would be dispatched to a compact binary system, where it would perform a gravity-assist maneuver: the spacecraft picks up speed from the binary’s intense gravity, adding the equivalent of twice the binary’s rotational velocity to its own, and is then flung out of the system.

While the prospect of harnessing this kind of energy for the sake of propulsion was highly theoretical in Dyson’s time (and still is), Dyson offered two reasons why “gravitational machines” were worth exploring:

“First, if our species continues to expand its population and its technology at an exponential rate, there may come a time in the remote future where engineering on an astronomical scale may be both feasible and necessary. Second, if we are searching for signs of technologically advanced life already existing elsewhere in the universe, it is useful to consider what kind of observable phenomena a really advanced technology might be capable of producing.”

In short, gravitational machines are worth studying in case they become possible someday, and because this study could allow us to spot possible extraterrestrial intelligences (ETIs) by detecting the technosignatures such machines would create. Expanding upon this, Kipping considers how black holes, especially those found in binary pairs, could constitute even more powerful gravitational slingshots.

This proposal is based in part on the recent success of the Laser Interferometer Gravitational-Wave Observatory (LIGO), which has detected multiple gravitational wave signals since its first detection was announced in 2016. According to recent estimates based on these detections, there could be as many as 100 million black holes in the Milky Way galaxy alone.

Where binaries occur, they possess an incredible amount of rotational energy, which is the result of their spin and the way they rapidly orbit one another. In addition, as Kipping notes, black holes can also act as gravitational mirrors – photons directed at the edge of the event horizon will bend around and come straight back at the source. As Kipping put it:

“So the binary black hole is really a couple of giant mirrors circling around one another at potentially high velocity. The halo drive exploits this by bouncing photons off the ‘mirror’ as the mirror approaches you; the photons bounce back, pushing you along, but also steal some of the energy from the black hole binary itself (think about how a ping pong ball thrown against a moving wall would come back faster). Using this setup, one can harvest the binary black hole energy for propulsion.”
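The "moving mirror" gain Kipping describes can be quantified with the standard relativistic Doppler result for light reflected head-on off an approaching mirror: the energy grows by a factor of (1 + β)/(1 − β), where β = v/c is the mirror's speed. A minimal sketch (the specific speeds below are illustrative, not from the paper):

```python
def reflection_boost(beta):
    """Energy gain factor for light reflected head-on off a mirror
    approaching at speed beta = v/c (standard relativistic Doppler result)."""
    if not 0 <= beta < 1:
        raise ValueError("beta must be in [0, 1)")
    return (1 + beta) / (1 - beta)

# Illustrative orbital speeds for one black hole in a tight binary:
for beta in (0.1, 0.3, 0.5):
    print(f"v = {beta:.1f}c -> photon energy x {reflection_boost(beta):.2f}")
```

The factor exceeds 1 for any approaching mirror, and because the extra photon energy comes from the mirror's motion, each reflection drains a little of the binary's orbital energy – exactly the ping-pong-ball-against-a-moving-wall picture in the quote.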

This method of propulsion offers several obvious advantages. For starters, it offers users the potential to travel at relativistic speeds without the need for fuel, which currently accounts for the majority of a launch vehicle’s mass. And there are many, many black holes that exist throughout the Milky Way, which could act as a network for relativistic space travel.

What’s more, scientists have already witnessed the power of gravitational slingshots thanks to the discovery of hyper-velocity stars. According to research from the Harvard-Smithsonian Center for Astrophysics (CfA), these stars are a result of galactic mergers and interaction with massive black holes, which kick them out of their galaxies at one-tenth to one-third the speed of light – around 30,000 to 100,000 km/s (18,600 to 62,000 miles per second).

But of course, the concept comes with innumerable challenges and more than a few disadvantages. In addition to building spacecraft that can endure being flung around the event horizon of a black hole, a tremendous amount of precision is required – otherwise, the ship and crew (if it has one) could be pulled apart in the maw of the black hole. Additionally, there’s simply the matter of reaching one:

“[T]he thing has a huge disadvantage for us in that we have to first get to one of these black holes. I tend to think of it like an interstellar highway system – you have to pay a one-time toll to get on the highway, but once you’re on, you can ride across the galaxy as much as you like without expending any more fuel.”

The challenge of how humanity might go about reaching the nearest suitable black hole will be the subject of Kipping’s next paper, he indicated. And while an idea like this is about as remote to us as building a Dyson Sphere or using black holes to power starships, it does offer some pretty exciting possibilities for the future.

In short, the concept of a black hole gravity machine presents humanity with a plausible path to becoming an interstellar species. In the meantime, the study of the concept will provide SETI researchers with another possible technosignature to look for. So until the day comes when we might attempt this ourselves, we will be able to see if any other species have already made it work.

Read more at: https://phys.org/news/2019-03-black-holes-conquer-space-halo.html#jCp


by Jonathan O’Callaghan

You might be forgiven for thinking our understanding of classical physics had reached its peak in the four centuries since Isaac Newton devised his eponymous laws of motion. But surprising new research shows there are still secrets waiting to be found, hidden in plain sight—or, at least in this case, within earshot.

In a paper published in Physical Review Letters, a group of scientists has theorized that sound waves possess mass, meaning sounds would be directly affected by gravity. They suggest phonons, particlelike collective excitations responsible for transporting sound waves across a medium, might exhibit a tiny amount of mass in a gravitational field. “You would expect classical physics results like this one to have been known for a long time by now,” says Angelo Esposito from Columbia University, the lead author on the paper. “It’s something we stumbled upon almost by chance.”

Esposito and his colleagues built on a previous paper published last year, in which Alberto Nicolis of Columbia and Riccardo Penco from Carnegie Mellon University first suggested phonons could have mass in a superfluid. The latest study, however, shows this effect should hold true for other materials, too, including regular liquids and solids, and even air itself.

And although the amount of mass carried by the phonons is expected to be tiny—comparable with a hydrogen atom, about 10⁻²⁴ grams—it may actually be measurable. Except, if you were to measure it, you would find something deeply counterintuitive: The mass of the phonons would be negative, meaning they would fall “up.” Over time their trajectory would gradually move away from a gravitational source such as Earth. “If their gravitational mass was positive, they would fall downward,” Penco says. “Because their gravitational mass is negative, phonons fall upwards.” And the amount they would “fall” is equally small, varying depending on the medium the phonon is traveling through. In water, where sound moves at 1.5 kilometers per second, the negative mass of the phonon would cause it to drift at about 1 degree per second. But this corresponds to a change of 1 degree over 15 kilometers, which would be exceedingly difficult to measure.

Difficult it might be, but such a measurement should still be possible. Esposito notes that to distinguish the phonons’ mass, one could look for them in a medium where the speed of sound is very slow. That might be possible in superfluid helium, where the speed of sound can drop to hundreds of meters per second or less, and the passage of a single phonon might displace a mass of material equivalent to that of an atom.

Alternatively, instead of seeking minuscule effects magnified by exotic substances, researchers might look for more obvious signs of mass-carrying phonons by closely studying extremely intense sound waves. Earthquakes offer one possibility, Esposito says. According to his calculations, a magnitude 9 temblor would release enough energy so that the resulting change in the gravitational acceleration of the earthquake’s sound wave might be measurable using atomic clocks. (Although current techniques are not sensitive enough to detect the gravitational field of a seismic wave, future advancements in technology might make this possible.)

Sound waves having mass are unlikely to have a major impact on day-to-day life, but the possibility something so fundamental has gone unnoticed for so long is intriguing. “Until this paper, it was thought that sound waves do not transport mass,” says Ira Rothstein from Carnegie Mellon University, who was not involved in this research. “So in that sense it’s a really remarkable result. Because anytime you find any new result in classical physics, given that it’s been around since Newton, you would have thought it would be completely understood. If you look carefully enough, you can find fresh [ideas] even in fields which have been covered for centuries.”

As for why this has never been spotted before, Esposito is uncertain. “Maybe because we are high-energy physicists, gravity is more our language,” he says. “It’s not some theoretical mumbo jumbo kind of thing. In principle people could have discovered it years ago.”

https://www.scientificamerican.com/article/sound-by-the-pound-surprising-discovery-hints-sonic-waves-carry-mass/

Back in 1961, the Nobel Prize–winning physicist Eugene Wigner outlined a thought experiment that demonstrated one of the lesser-known paradoxes of quantum mechanics. The experiment shows how the strange nature of the universe allows two observers—say, Wigner and Wigner’s friend—to experience different realities.

Since then, physicists have used the “Wigner’s Friend” thought experiment to explore the nature of measurement and to argue over whether objective facts can exist. That’s important because scientists carry out experiments to establish objective facts. But if they experience different realities, the argument goes, how can they agree on what these facts might be?

That’s provided some entertaining fodder for after-dinner conversation, but Wigner’s thought experiment has never been more than that—just a thought experiment.

Last year, however, physicists noticed that recent advances in quantum technologies have made it possible to reproduce the Wigner’s Friend test in a real experiment. In other words, it ought to be possible to create different realities and compare them in the lab to find out whether they can be reconciled.

And today, Massimiliano Proietti at Heriot-Watt University in Edinburgh and a few colleagues say they have performed this experiment for the first time: they have created different realities and compared them. Their conclusion is that Wigner was correct—these realities can be made irreconcilable so that it is impossible to agree on objective facts about an experiment.

Wigner’s original thought experiment is straightforward in principle. It begins with a single polarized photon that, when measured, can have either a horizontal polarization or a vertical polarization. But before the measurement, according to the laws of quantum mechanics, the photon exists in both polarization states at the same time—a so-called superposition.

Wigner imagined a friend in a different lab measuring the state of this photon and storing the result, while Wigner observed from afar. Wigner has no information about his friend’s measurement and so is forced to assume that the photon and the measurement of it are in a superposition of all possible outcomes of the experiment.
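In the usual textbook notation (a standard rendering, not quoted from the paper itself), the state Wigner must assign to the whole lab is an entangled superposition of the photon and the friend's measurement record:

```latex
% From Wigner's vantage point, the friend's lab (photon + measurement record)
% remains in superposition rather than settling on a definite outcome:
\left|\Psi\right\rangle_{\text{lab}}
  = \frac{1}{\sqrt{2}}
    \Bigl( \left|H\right\rangle_{\text{photon}} \left|\text{``H''}\right\rangle_{\text{record}}
         + \left|V\right\rangle_{\text{photon}} \left|\text{``V''}\right\rangle_{\text{record}} \Bigr)
```

Neither term alone describes the lab; for Wigner, no single outcome has occurred.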

Wigner can even perform an experiment to determine whether this superposition exists or not. This is a kind of interference experiment showing that the photon and the measurement are indeed in a superposition.

From Wigner’s point of view, this is a “fact”—the superposition exists. And this fact suggests that a measurement cannot have taken place.

But this is in stark contrast to the point of view of the friend, who has indeed measured the photon’s polarization and recorded it. The friend can even call Wigner and say the measurement has been done (provided the outcome is not revealed).

So the two realities are at odds with each other. “This calls into question the objective status of the facts established by the two observers,” say Proietti and co.

That’s the theory, but last year Caslav Brukner, at the University of Vienna in Austria, came up with a way to re-create the Wigner’s Friend experiment in the lab by means of techniques involving the entanglement of many particles at the same time.

The breakthrough that Proietti and co have made is to carry this out. “In a state-of-the-art 6-photon experiment, we realize this extended Wigner’s friend scenario,” they say.

They use these six entangled photons to create two alternate realities—one representing Wigner and one representing Wigner’s friend. Wigner’s friend measures the polarization of a photon and stores the result. Wigner then performs an interference measurement to determine if the measurement and the photon are in a superposition.

The experiment produces an unambiguous result. It turns out that both realities can coexist even though they produce irreconcilable outcomes, just as Wigner predicted.

That raises some fascinating questions that are forcing physicists to reconsider the nature of reality.

The idea that observers can ultimately reconcile their measurements of some kind of fundamental reality is based on several assumptions. The first is that universal facts actually exist and that observers can agree on them.

But there are other assumptions too. One is that observers have the freedom to make whatever observations they want. And another is that the choices one observer makes do not influence the choices other observers make—an assumption that physicists call locality.

If there is an objective reality that everyone can agree on, then these assumptions all hold.

But Proietti and co’s result suggests that objective reality does not exist. In other words, the experiment suggests that one or more of the assumptions—the idea that there is a reality we can agree on, the idea that we have freedom of choice, or the idea of locality—must be wrong.

Of course, there is another way out for those hanging on to the conventional view of reality. This is that there is some other loophole that the experimenters have overlooked. Indeed, physicists have tried to close loopholes in similar experiments for years, although they concede that it may never be possible to close them all.

Nevertheless, the work has important implications for the work of scientists. “The scientific method relies on facts, established through repeated measurements and agreed upon universally, independently of who observed them,” say Proietti and co. And yet in the same paper, they undermine this idea, perhaps fatally.

The next step is to go further: to construct experiments creating increasingly bizarre alternate realities that cannot be reconciled. Where this will take us is anybody’s guess. But Wigner, and his friend, would surely not be surprised.

Ref: arxiv.org/abs/1902.05080 : Experimental Rejection of Observer-Independence in the Quantum World

https://www.technologyreview.com/s/613092/a-quantum-experiment-suggests-theres-no-such-thing-as-objective-reality/

A new uncertainty principle holds that quantum objects can be at two temperatures at once, which is similar to the famous Schrödinger’s cat thought experiment, in which a cat in a box with a radioactive element can be both alive and dead.

By Meredith Fore

The famous thought experiment known as Schrödinger’s cat implies that a cat in a box can be both dead and alive at the same time — a bizarre phenomenon that is a consequence of quantum mechanics.

Now, physicists at the University of Exeter in England have found that a similar state of limbo may exist for temperatures: at the quantum level, objects can be at two temperatures at the same time. This weird quantum effect is captured by the first completely new quantum uncertainty relation to be formulated in decades.

Heisenberg’s other principle
In 1927, German physicist Werner Heisenberg postulated that the more precisely you measure a quantum particle’s position, the less precisely you can know its momentum, and vice versa — a rule that would become the now-famous Heisenberg uncertainty principle.

The new quantum uncertainty, which states that the more precisely you know temperature, the less you can say about energy, and vice versa, has big implications for nanoscience, which studies incredibly tiny objects smaller than a nanometer. This principle will change how scientists measure the temperature of extremely small things such as quantum dots, small semiconductors or single cells, the researchers said in the new study, which was published in June in the journal Nature Communications.

In the 1930s, Heisenberg and Danish physicist Niels Bohr established an uncertainty relation between energy and temperature on the nonquantum scale. The idea was that, if you wanted to know the exact temperature of an object, the best and most precise scientific way to do that would be to immerse it in a “reservoir” — say, a tub of water, or a fridge full of cold air — with a known temperature, and allow the object to slowly become that temperature. This is called thermal equilibrium.

However, that thermal equilibrium is maintained by the object and the reservoir constantly exchanging energy. The energy in your object therefore goes up and down by infinitesimal amounts, making it impossible to define precisely. On the flip side, if you wanted to know the precise energy in your object, you would have to isolate it so that it could not come into contact with, and exchange energy with, anything. But if you isolated it, you would not be able to precisely measure its temperature using a reservoir. This limitation makes the temperature uncertain.
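In its usual textbook form (a standard statement of the classical result; the article itself does not quote the formula), this trade-off between energy and temperature precision is written with the Boltzmann constant $k_B$ as:

```latex
% Bohr-Heisenberg-style complementarity between energy and (inverse) temperature:
\Delta U \,\Delta\!\left(\tfrac{1}{T}\right) \;\geq\; k_B
\quad\Longleftrightarrow\quad
\Delta U \,\Delta T \;\geq\; k_B T^{2}
```

Pinning down the energy $U$ exactly forces the temperature estimate to blur, and vice versa; the Exeter work generalizes this relation to the quantum regime.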

Things get weirder when you go to the quantum scale.

A new uncertainty relation
Even if a typical thermometer has an energy that goes up and down slightly, that energy can still be known to within a small range. This is not true at all on the quantum level, the new research showed, and it’s all due to Schrödinger’s cat. That thought experiment proposed a theoretical cat in a box with a poison that could be activated by the decay of a radioactive particle. According to the laws of quantum mechanics, the particle could have decayed and not decayed at the same time, meaning that until the box was opened, the cat would be both dead and alive at the same time — a phenomenon known as superposition.

The researchers used math and theory to predict exactly how such superposition affects the measurement of the temperature of quantum objects.

“In the quantum case, a quantum thermometer … will be in a superposition of energy states simultaneously,” Harry Miller, one of the physicists at the University of Exeter who developed the new principle, told Live Science. “What we find is that because the thermometer no longer has a well-defined energy and is actually in a combination of different states at once, that this actually contributes to the uncertainty in the temperature that we can measure.”

In our world, a thermometer may tell us an object is between 31 and 32 degrees Fahrenheit (minus 0.5 and zero degrees Celsius). In the quantum world, a thermometer may tell us an object is both those temperatures at the same time. The new uncertainty principle accounts for that quantum weirdness.

Interactions between objects at the quantum scale can create superpositions, and those interactions also carry energy of their own. The old uncertainty relation ignored these effects because they don’t matter for nonquantum objects. But they matter a lot when you’re trying to measure the temperature of a quantum dot, and the new uncertainty relation provides a theoretical framework that takes these interactions into account.

The new paper could help anyone who’s designing an experiment to measure temperature changes in objects below the nanometer scale, Miller said. “Our result is going to tell them exactly how to accurately design their probes and tell them how to account for the additional quantum uncertainty that you get.”

https://www.livescience.com/63595-schrodinger-uncertainty-relation-temperature.html


by MIKE MCRAE

For around a century it’s been thought that particles don’t have defined properties until we nail them down with a measurement.

That kind of quantum madness opens up a whole world of counter-intuitive paradoxes. Take this one, for example – it’s possible for a single particle to experience two sequences of events at the same time, making it impossible to know which came first.

Physicists from the University of Queensland designed a race course for light that forced a single particle to traverse two pathways at once, making it impossible to say in which order it completed a pair of operations.

In boring old everyday life you could roll a single ball down a ramp and have it ring bell A and then ring bell B. Or, if you’d prefer, you could roll it down another ramp and have it ring B before A.

If you want to get fancy you could even set up a rig so one bell causes the other bell to ring.

None of this is mind blowing, since we’re used to events in the Universe having a set order, where one thing precedes another in such a way that we presume an order of causation.

But nothing is so simple when we accept that reality is a blur of possibility prior to it being measured.

To demonstrate this, the physicists created a physical equivalent of something called a quantum switch, where multiple operations occur while a particle is in a superposition of all its possible locations.

Keeping it simple, the team set up a pathway that split apart and converged again in an interferometer, with access to each fork dependent on the polarisation of the light entering it.

Light waves travelling down each fork in the pathway would then merge and interfere, creating a distinctive pattern that depends on their properties.

In this particular case, the two light waves were actually the same photon taking both paths at the same time.

Before being measured, a photon can be either vertically or horizontally polarised. Or, more precisely, it’s polarised both vertically and horizontally at the same time, until a measurement confirms one over the other.

Since this undefined photon’s polarisation is both vertical and horizontal, it enters both pathways, with the vertically polarised version of the photon barrelling down one channel and the horizontally polarised version heading down the second.

Following the two paths, the team had the quantum equivalent of those bells we mentioned earlier – in the form of lenses that subtly changed the shape of the photon.

The horizontal polarisation would hit ‘bell’ A before striking B, while the vertical polarisation would strike ‘bell’ B, and then A.

An analysis of the interference pattern of the reunited photon revealed signs of this mess of possible sequences.

On one hand, it’s easy to imagine two separate light particles – one horizontally polarised, the other vertically polarised – passing each lens in separate orders.

That’s not what happened, though. This was a single photon with two possible histories, neither of which was set in reality until measured.

While the events A and B were independent in this quantum switch, they could be linked to affect one another. A could cause B, or B could cause A … all depending on which history you wanted after the event.
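The logic of the quantum switch can be sketched in a small toy model (my own illustration, not the Queensland optical setup): a control qubit in the state |+⟩ puts the order of two operations A and B into superposition, and interfering the control in the ± basis leaves the target acted on by (BA ± AB)/2. The "−" output port can only fire when A and B do not commute, which is the signature of the two orders genuinely coexisting:

```python
def matmul(M, N):
    """2x2 complex matrix product."""
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(M, v):
    """Apply a 2x2 matrix to a 2-component state vector."""
    return [sum(M[i][k] * v[k] for k in range(2)) for i in range(2)]

def switch_probabilities(A, B, psi):
    """Probabilities of the +/- interference outcomes of a quantum switch:
    a control qubit in |+> superposes the orders A-then-B and B-then-A,
    so the target ends up acted on by (B.A +/- A.B)/2."""
    ba = apply(matmul(B, A), psi)
    ab = apply(matmul(A, B), psi)
    p_plus = sum(abs((x + y) / 2) ** 2 for x, y in zip(ba, ab))
    p_minus = sum(abs((x - y) / 2) ** 2 for x, y in zip(ba, ab))
    return p_plus, p_minus

X = [[0, 1], [1, 0]]    # 'bell' A: flips the state
Z = [[1, 0], [0, -1]]   # 'bell' B: flips a phase
psi = [1, 0]            # input state

print(switch_probabilities(X, X, psi))  # -> (1.0, 0.0): commuting ops, order is irrelevant
print(switch_probabilities(X, Z, psi))  # -> (0.0, 1.0): non-commuting ops, the orders interfere
```

A classical ball ringing two bells always has one definite order; here the "−" outcome only exists because both orders were traversed at once.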

Putting aside daydreams of travelling back in time to undo that big mistake (what were you thinking?!), this does have one possible practical application in the emerging field of quantum communications.

Transmitting photons down a noisy channel could be disastrous for their quantum information, quickly making a mess of their precious superposition. Sending them down channels fitted with a quantum switch, however, could in principle give the quantum information an opportunity to get through.

A paper the team published on the preprint server arXiv.org back in July shows how a quantum switch applied to two noisy channels can allow a superposition to survive.

Whatever weird clockwork is going on in reality’s basement, we won’t pretend to understand it. But the very fact physicists are able to craft it into new technology is truly mindblowing in itself.

This research was published in Physical Review Letters.

https://www.sciencealert.com/quantum-switch-causation-superposition-applied-technology

The detailed, all-sky picture of the infant universe created from nine years of WMAP data. The image reveals 13.77-billion-year-old temperature fluctuations (shown as color differences) that correspond to the seeds that grew to become the galaxies. The signal from our galaxy was subtracted using the multi-frequency data. This image shows a temperature range of ±200 microkelvin. CREDIT: NASA/WMAP SCIENCE TEAM

by Jesse Shanahan

In a study published earlier this month, a team of theoretical physicists is claiming to have discovered the remnants of previous universes hidden within the leftover radiation from the Big Bang. Our universe is a vast collection of observable matter, like gas, dust, stars, etc., in addition to the ever-elusive dark matter and dark energy. In some sense, this universe is all we know, and even then, we can only directly study about 5% of it, leaving 95% a mystery that scientists are actively working to solve. However, this group of physicists is arguing that our universe isn’t alone; it’s just one in a long line of universes that are born, grow, and die. Among these scientists is mathematical physicist Roger Penrose, who worked closely with Stephen Hawking and currently is the Emeritus Rouse Ball Professor of Mathematics at Oxford University. Penrose and his collaborators follow a cosmological theory called conformal cyclic cosmology (CCC) in which universes, much like human beings, come into existence, expand, and then perish.

As a universe ages, it expands, and the constituent parts grow farther and farther apart from each other. Consequently, the interactions between galaxies that drive star formation and evolution become rarer. Eventually, the stars die out, and the remaining gas and dust is captured by black holes. In one of his most famous theories, Stephen Hawking proposed that this isn’t the end; black holes might have a way to slowly lose mass and energy by radiating certain particles. So, after many eons, the remaining black holes in the universe would disappear, leaving only disparate particles. Seemingly a wasteland, this end-state eventually mirrors the environment of our universe’s birth, and so, the cycle starts anew.

Artist’s logarithmic-scale conception of the observable universe with the Solar System at the center: inner and outer planets, Kuiper Belt, Oort Cloud, Alpha Centauri, the Perseus Arm, the Milky Way galaxy, the Andromeda galaxy, nearby galaxies, the cosmic web, the cosmic microwave background, and the Big Bang’s invisible plasma on the edge. CREDIT: WIKIPEDIA/PABLO CARLOS BUDASSI

When our universe was very young, before any recognizable components like stars, planets, or galaxies formed, it was filled with a dense, hot soup of plasma. As the universe expanded, it cooled, and eventually, particles could combine to form atoms. Eventually, the interaction and fusion of these atoms resulted in all of the matter that we observe today. However, we can still observe the leftover radiation from that initial, dense period in our universe’s history. This leftover glow, called the Cosmic Microwave Background (CMB), is the oldest electromagnetic radiation, and it fills the entirety of our universe. If the CCC theory were true, then there would be hints of previous universes in our universe’s CMB.

At the end of a universe, when those final black holes dissolve, CCC theory states they should leave behind a signature that would survive the death of that universe and persist into the next. Although not definitive proof of previous universes, detecting that signature would be strong evidence in support of CCC theory. In searching for these “Hawking points”, cosmologists face a difficult obstacle as the CMB is faint and varies randomly. However, Penrose is claiming that a comparison between a model CMB with Hawking points and actual data from our CMB has proven that Hawking points actually exist. If true, this would be the first-ever detection of evidence from another universe.

Unfortunately, as groundbreaking as this discovery seems, the scientific community has largely dismissed it. One of the fundamental characteristics of the CMB is that, although it has patterns, the variations are entirely statistically random. In fact, Penrose’s former collaborator, Stephen Hawking, spotted his own initials in the CMB, while others have found a deer, a parrot, and numerous other recognizable shapes in the noise. Similarly, the Wilkinson Microwave Anisotropy Probe (WMAP) team that mapped the CMB released an interactive image in which you can search for familiar shapes and patterns. An unavoidable result of both these random fluctuations and the sheer size of the CMB is that if scientists look hard enough, they can find whatever pattern they need, like the existence of Hawking points, perhaps. Another criticism of Penrose’s claim is that if CCC theory holds true, our universe should have tens of thousands of Hawking points in the CMB. Regrettably, Penrose could find only about 20.

Still, the possibility of alternate universes, whether long-dead or existing in parallel to our own, is tantalizing. Many other theories also claim to find traces of other universes hiding in the patterns of the CMB. Although it sounds like science fiction, we are left to wonder: is this just the cosmological equivalent of seeing shapes in random clouds, or will scientists one day discover that ours is just one of infinitely many universes?

Jesse Shanahan is an astrophysicist, EMT, and science communicator. For more space and language news, follow her on Twitter here.

https://www.forbes.com/sites/jesseshanahan/2018/08/24/did-scientists-actually-spot-evidence-of-another-universe/

The Standard Model. What a dull name for the most accurate scientific theory known to human beings.

More than a quarter of the Nobel Prizes in physics of the last century are direct inputs to or direct results of the Standard Model. Yet its name suggests that if you can afford a few extra dollars a month you should buy the upgrade. As a theoretical physicist, I’d prefer The Absolutely Amazing Theory of Almost Everything. That’s what the Standard Model really is.

Many recall the excitement among scientists and media over the 2012 discovery of the Higgs boson. But that much-ballyhooed event didn’t come out of the blue – it capped a five-decade undefeated streak for the Standard Model. Every fundamental force but gravity is included in it. Every attempt to overturn it – to demonstrate in the laboratory that it must be substantially reworked – has failed, and there have been many such attempts over the past 50 years.

In short, the Standard Model answers this question: What is everything made of, and how does it hold together?

The smallest building blocks

You know, of course, that the world around us is made of molecules, and molecules are made of atoms. Chemist Dmitri Mendeleev figured that out in the 1860s and organized all atoms – that is, the elements – into the periodic table that you probably studied in middle school. But there are 118 different chemical elements. There’s antimony, arsenic, aluminum, selenium … and 114 more.


But these elements can be broken down further.

Physicists like things simple. We want to boil things down to their essence, a few basic building blocks. Over a hundred chemical elements is not simple. The ancients believed that everything is made of just five elements – earth, water, fire, air and aether. Five is much simpler than 118. It’s also wrong.

By 1932, scientists knew that all those atoms are made of just three particles – neutrons, protons and electrons. The neutrons and protons are bound together tightly into the nucleus. The electrons, thousands of times lighter, whirl around the nucleus at an appreciable fraction of the speed of light. Physicists Planck, Bohr, Schroedinger, Heisenberg and friends had invented a new science – quantum mechanics – to explain this motion.

That would have been a satisfying place to stop. Just three particles. Three is even simpler than five. But held together how? The negatively charged electrons and positively charged protons are bound together by electromagnetism. But the protons are all huddled together in the nucleus and their positive charges should be pushing them powerfully apart. The neutral neutrons can’t help.
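Just how powerfully should those positive charges push apart? A quick back-of-the-envelope estimate (my own illustrative numbers, not from the article) gives a sense of the scale:

```python
# Coulomb repulsion between two protons packed into a nucleus,
# a sketch using standard physical constants.
K = 8.9875e9      # Coulomb constant, N*m^2/C^2
E = 1.602e-19     # proton charge, C
R = 1.0e-15       # typical separation inside a nucleus, ~1 femtometer

force = K * E * E / R**2   # Coulomb's law: F = k*q1*q2 / r^2
print(f"{force:.0f} N")    # on the order of 230 newtons
```

Hundreds of newtons acting on a particle with a mass around 10⁻²⁷ kilograms – whatever holds the nucleus together must be stronger still at that range.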

What binds these protons and neutrons together? “Divine intervention” a man on a Toronto street corner told me; he had a pamphlet, I could read all about it. But this scenario seemed like a lot of trouble even for a divine being – keeping tabs on every single one of the universe’s 10⁸⁰ protons and neutrons and bending them to its will.

Expanding the zoo of particles

Meanwhile, nature cruelly declined to keep its zoo of particles to just three. Really four, because we should count the photon, the particle of light that Einstein described. Four grew to five when Anderson measured electrons with positive charge – positrons – striking the Earth from outer space. At least Dirac had predicted these first anti-matter particles. Five became six when the pion, which Yukawa predicted would hold the nucleus together, was found.

Then came the muon – 200 times heavier than the electron, but otherwise a twin. “Who ordered that?” I.I. Rabi quipped. That sums it up. Number seven. Not only not simple, but redundant.

By the 1960s there were hundreds of “fundamental” particles. In place of the well-organized periodic table, there were just long lists of baryons (heavy particles like protons and neutrons), mesons (like Yukawa’s pions) and leptons (light particles like the electron, and the elusive neutrinos) – with no organization and no guiding principles.

Into this breach sidled the Standard Model. It was not an overnight flash of brilliance. No Archimedes leapt out of a bathtub shouting “eureka.” Instead, there was a series of crucial insights by a few key individuals in the mid-1960s that transformed this quagmire into a simple theory, and then five decades of experimental verification and theoretical elaboration.

Quarks. They come in six varieties we call flavors. Like ice cream, except not as tasty. Instead of vanilla, chocolate and so on, we have up, down, strange, charm, bottom and top. In 1964, Gell-Mann and Zweig taught us the recipes: Mix and match any three quarks to get a baryon. Protons are two ups and a down quark bound together; neutrons are two downs and an up. Choose one quark and one antiquark to get a meson. A pion is an up or a down quark bound to an anti-up or an anti-down. All the material of our daily lives is made of just up and down quarks and anti-quarks and electrons.
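Those recipes can be checked against the quarks’ electric charges – the standard values, in units of the proton charge – with a small sketch for illustration:

```python
from fractions import Fraction

# Electric charges of the light quarks, in units of the proton charge.
CHARGE = {
    "up": Fraction(2, 3), "down": Fraction(-1, 3),
    "anti-up": Fraction(-2, 3), "anti-down": Fraction(1, 3),
}

def total_charge(*quarks):
    """Add up the charges of the listed quarks."""
    return sum(CHARGE[q] for q in quarks)

proton = total_charge("up", "up", "down")       # baryon: three quarks
neutron = total_charge("up", "down", "down")    # baryon: three quarks
pi_plus = total_charge("up", "anti-down")       # meson: quark + antiquark

print(proton, neutron, pi_plus)   # 1 0 1
```

The fractional charges sum to exactly +1 for the proton, 0 for the neutron and +1 for the positive pion – just as the recipes require.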


The Standard Model of elementary particles provides an ingredients list for everything around us.

Simple. Well, simple-ish, because keeping those quarks bound is a feat. They are tied to one another so tightly that you never ever find a quark or anti-quark on its own. The theory of that binding, and the particles called gluons (chuckle) that are responsible, is called quantum chromodynamics. It’s a vital piece of the Standard Model, but mathematically difficult – proving that it has a mass gap is one of the unsolved Millennium Prize Problems of mathematics. We physicists do our best to calculate with it, but we’re still learning how.

The other aspect of the Standard Model is “A Model of Leptons.” That’s the name of the landmark 1967 paper by Steven Weinberg that pulled together quantum mechanics with the vital pieces of knowledge of how particles interact and organized the two into a single theory. It incorporated the familiar electromagnetism, joined it with what physicists called “the weak force” that causes certain radioactive decays, and explained that they were different aspects of the same force. It incorporated the Higgs mechanism for giving mass to fundamental particles.

Since then, the Standard Model has predicted the results of experiment after experiment, including the discovery of several varieties of quarks and of the W and Z bosons – heavy particles that are for weak interactions what the photon is for electromagnetism. The possibility that neutrinos aren’t massless was overlooked in the 1960s, but slipped easily into the Standard Model in the 1990s, a few decades late to the party.

Discovering the Higgs boson in 2012, long predicted by the Standard Model and long sought after, was a thrill but not a surprise. It was yet another crucial victory for the Standard Model over the dark forces that particle physicists have repeatedly warned loomed over the horizon. Concerned that the Standard Model didn’t adequately embody their expectations of simplicity, worried about its mathematical self-consistency, or looking ahead to the eventual necessity to bring the force of gravity into the fold, physicists have made numerous proposals for theories beyond the Standard Model. These bear exciting names like Grand Unified Theories, Supersymmetry, Technicolor, and String Theory.

Sadly, at least for their proponents, beyond-the-Standard-Model theories have not yet successfully predicted any new experimental phenomenon or any experimental discrepancy with the Standard Model.

After five decades, far from requiring an upgrade, the Standard Model is worthy of celebration as the Absolutely Amazing Theory of Almost Everything.

https://theconversation.com/the-standard-model-of-particle-physics-the-absolutely-amazing-theory-of-almost-everything-94700