The Milky Way’s hot spot

The center of our galaxy is a busy place. But it might be one of the best sites to hunt for dark matter.

When you look up at night, the Milky Way appears as a swarm of stars arranged in a misty white band across the sky. 

But from an outside perspective, our galaxy looks more like a disk, with spiral arms of stars reaching out into the universe. At the center of this disk is a small region around which the entire pinwheel of our galaxy rotates, a region packed with exotic astronomical phenomena ranging from dark matter and newborn stars to a supermassive black hole. Astronomers call this region of the Milky Way the galactic center. 

It’s a strange neighborhood, and scientists have reason to believe it’s one of the best places to hunt for dark matter.

The Spitzer Space Telescope provides an infrared view of the galactic center region.

Courtesy of: NASA/JPL-Caltech/ESA/CXC/STScI

Phenomena in our galaxy’s heart

In the ’70s, scientists hypothesized that a supermassive black hole might be lurking in the center of the Milky Way. Black holes are points of space-time where gravity is so strong that not even light can escape.

After decades of trying to indirectly identify the mysterious object in the galactic center by tracing the orbits of stars and gas, astronomers were finally able to calculate its mass in 2008. It weighed more than 4 million times as much as the sun, making it a prime supermassive black hole candidate.

About 10 percent of all new star formation in the galaxy occurs in the galactic center. This is strange because local conditions produce an extreme environment in which it should be difficult for stars to form.

Scientists believe that at least some of the new stars being formed should explode and transform into pulsars, but they aren’t seeing any. Pulsars emit a regular pulsating signal, like a lighthouse. One early explanation for the apparent lack of pulsars in the galactic center was that the magnetic fields there could be bending their radio waves on their way to us, hiding their pulsating signals. But recently scientists measured the strength of the fields and realized the bending was much less than they had anticipated. The mystery of the missing pulsars remains unsolved.

The galactic center also has a notably high concentration of cosmic rays, high-energy charged particles that hurtle through outer space. Scientists still don’t understand where these particles come from or how they reach such intense energies.

The Hubble Space Telescope, though better known for its visible light images, also captured an infrared light picture of the galactic center (the bright patch in the lower right).

Courtesy of: NASA/JPL-Caltech/ESA/CXC/STScI

Hunting for dark matter

We know that the Milky Way is rotating because when we look along it, we see some stars moving towards us and some stars moving away. But the speed at which our galaxy rotates is faster than it should be for the amount of matter we can see. 
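The reasoning above can be put into rough numbers with the circular-orbit relation v² = GM/r: the rotation speed at a given radius reveals the mass enclosed within it. A minimal sketch, using approximate textbook values (about 220 km/s at the sun's distance of roughly 26,000 light-years) that are assumptions here, not figures from the article:

```python
# Back-of-envelope sketch of the rotation argument: for a circular
# orbit, v**2 = G * M / r, so the rotation speed at a given radius
# tells us the mass enclosed within it. The numbers below (~220 km/s
# at ~26,000 light-years) are rough textbook values, not from the text.
G = 6.674e-11              # gravitational constant, m^3 kg^-1 s^-2
LY = 9.461e15              # one light-year in meters
v = 220e3                  # rotation speed at the sun's radius, m/s
r = 26_000 * LY            # sun's distance from the galactic center, m

M_enclosed = v**2 * r / G  # mass inside the sun's orbit, kg
M_sun = 1.989e30           # one solar mass, kg
print(f"enclosed mass: ~{M_enclosed / M_sun:.1e} solar masses")
```

The result, around 10^11 solar masses, exceeds what the visible stars and gas can account for; that shortfall is the dark matter.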

This leads scientists to believe that the center of our galaxy contains matter we cannot see. Despite all of the other activity there, that concentration makes the inner galaxy a prime hunting ground for this “dark matter,” an invisible substance that makes up most of the matter in the universe.

Scientists looking for dark matter take advantage of the fact that it likely interacts with itself. Researchers predict that when dark matter particles run into each other, they annihilate. They believe that this might produce a distinctive spectrum of gamma rays. 

Over the past few years, scientists have detected an excess of gamma rays coming from the Milky Way’s galactic center. Many scientists believe this could be a strong signal of dark matter: both the energy spectrum and the spatial concentration of the gamma rays resemble what they would expect from annihilating dark matter particles.

Other scientists believe that it is pulsars, not dark matter, that create this signal. Because the excess appears clumped, instead of smooth, scientists believe that it could be coming from compact sources like an ancient population of pulsars.

To determine whether this excess is a dark matter signal, scientists are looking for similar signatures elsewhere in the universe, in places like dwarf galaxies. These small galaxies are cleaner places to look for dark matter, with far less going on, but the trade-off is that they do not produce as much gamma radiation.

This Chandra X-ray Observatory image distinguishes between lower energy X-rays (pink) and higher energy X-rays (blue).

Courtesy of: NASA/JPL-Caltech/ESA/CXC/STScI

Peering into the galactic center

The galactic center is hidden from our view by about 25,000 light-years of dust and gas, making it difficult to observe in visible light. To tackle its rich landscape, scientists study it across wavelengths ranging from radio to gamma ray. Among the instruments watching the galactic center, a few observe in very different parts of the spectrum.

The W.M. Keck Observatory, a two-telescope observatory near the summit of a dormant volcano in Hawaii, studies the galactic center in the infrared. The Chandra X-ray Observatory, a space observatory launched in 1999, observes the galactic center in X-rays.

Imaging Atmospheric Cherenkov Telescopes are ground-based detectors that scientists use to study gamma rays via the showers of particles the rays create when they smash into our atmosphere. The High Energy Stereoscopic System, or HESS, is the world’s largest Cherenkov telescope array and is located in Namibia.

The Fermi Gamma-ray Space Telescope is another observatory scientists use to investigate the galactic center. This is a satellite-based telescope that maps the whole sky at gamma-ray wavelengths. Since its launch in 2008, Fermi has been an important tool in probing the contents of the inner galaxy, from dark matter to pulsars to black holes. 

The longer these instruments collect data, the closer we get to figuring out dark matter and untangling the mess of marvels at the heart of our galaxy.

70 “Batman” v “Superman”

In celebration of “Batman v Superman: Dawn of Justice,” Amy and Devin pit Tim Burton’s “Batman” against Richard Donner’s “Superman.” They break down parts of the films, like the attention to detail in “Batman” and Clark and Lois’s captivating chemistry in “Superman.” Tune in, and head over to the Earwolf forums to cast your vote.

Dan Daneshvar: Making The Death Call

To study a dangerous disease, Dan Daneshvar asks families to consider donating their loved one’s brains.

Dan Daneshvar received an S.B. from the Massachusetts Institute of Technology in Brain and Cognitive Sciences, with concentrations in cognitive neuroscience and poetry. He joined the CTE Center at Boston University School of Medicine in January 2009, where he studies the effects of repetitive head impacts in athletes, including chronic traumatic encephalopathy (CTE). He will receive dual M.D./Ph.D. degrees in May 2016 before beginning residency at Stanford. He also founded Team Up Against Concussions, an educational program that has taught more than 25,000 middle and high school students about concussions.

The next big LHC upgrade? Software.

Compatible and sustainable software could revolutionize high-energy physics research.

The World Wide Web may have been invented at CERN, but it was raised and cultivated abroad. Now a group of Large Hadron Collider physicists is looking outside academia to solve one of the biggest challenges in physics: creating a software framework that is sophisticated, sustainable and more compatible with the rest of the world.

“The software we used to build the LHC and perform our analyses is 20 years old,” says Peter Elmer, a physicist at Princeton University. “Technology evolves, so we have to ask, does our software still make sense today? Will it still do what we need 20 or 30 years from now?”

Elmer is part of a new initiative funded by the National Science Foundation called the DIANA/HEP project, or Data Intensive ANAlysis for High Energy Physics. The DIANA project has one main goal: improve high-energy physics software by incorporating best practices and algorithms from other disciplines.

“We want to discourage physics from re-inventing the wheel,” says Kyle Cranmer, a physicist at New York University and co-founder of the DIANA project. “There has been an explosion of high-quality scientific software in recent years. We want to start incorporating the best products into our research so that we can perform better science more efficiently.”

DIANA is the first project explicitly funded to work on sustainable software, but it is not alone in the endeavor to improve the way high-energy physicists perform their analyses. In 2010 physicist Noel Dawe started the rootpy project, a community-driven initiative to improve the interface between ROOT and Python.

“ROOT is the central tool that every physicist in my field uses,” says Dawe, who was a graduate student at Simon Fraser University when he started rootpy and is currently a fellow at the University of Melbourne. “It does quite a bit, but sometimes the best tool for the job is something else. I started rootpy as a side project when I was a graduate student because I wanted to find ways to interface ROOT code with other tools.”

Physicists began developing ROOT in the 1990s in the computing language C++. This software has evolved a lot since then, but has slowly become outdated, cumbersome and difficult to interface with new scientific tools written in languages such as Python or Julia. C++ has also evolved over the course of the last twenty years, but physicists must maintain a level of backward compatibility in order to preserve some of their older code.

“It’s in a bubble,” says Gilles Louppe, a machine learning expert working on the DIANA project. “It’s hard to get in and it’s hard to get out. It’s isolated from the rest of the world.”

Before coming to CERN, Louppe was a core developer of the machine learning platform scikit-learn, an open source library of versatile data mining and data analysis tools. He is now a postdoctoral researcher at New York University and working closely with physicists to improve the interoperability between common LHC software products and the scientific python ecosystem. Improved interoperability will make it easier for physicists to benefit from global advancements in machine learning and data analysis.

“Software and technology are changing so fast,” Cranmer says. “We can reap the rewards of industry and everything the world is coming up with.”

One trend that is spreading rapidly in the data science community is the computational notebook: a hybrid of analysis code, plots and narrative text. Project Jupyter is developing the technology that enables these notebooks. Two developers from the Jupyter team recently visited CERN to work with the ROOT team and further develop the ROOT version, the ROOTbook.

“ROOTbooks represent a confluence of two communities and two technologies,” says Cranmer.

Physics patterns

To perform tasks such as identifying and tagging particles, physicists use machine learning. They essentially train their LHC software to identify certain patterns in the data by feeding it thousands of simulations. According to Elmer, this task is like one big “needle in a haystack” problem.

“Imagine the book Where’s Waldo. But instead of just looking for one Waldo in one picture, there are many different kinds of Waldos and 100,000 pictures every second that need to be analyzed.”
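The training step Elmer describes, learning a rule from labeled simulations and then applying it to new events, can be caricatured in a few lines. This toy uses one feature and a nearest-class-mean rule, purely for illustration; real LHC analyses use far richer features and models such as boosted decision trees and neural networks:

```python
# Toy sketch of "train on simulations, then tag real events."
# Signal and background are simulated as Gaussians on one feature;
# training learns one number per class; tagging picks the closer mean.
import random

random.seed(42)

# "Simulations": signal events cluster around 1.0, background around 0.0.
signal     = [random.gauss(1.0, 0.2) for _ in range(1000)]
background = [random.gauss(0.0, 0.2) for _ in range(1000)]

# "Training": learn each class's mean from the simulated samples.
mu_sig = sum(signal) / len(signal)
mu_bkg = sum(background) / len(background)

def tag(event):
    """Label an event by whichever class mean it lies closer to."""
    return "signal" if abs(event - mu_sig) < abs(event - mu_bkg) else "background"

print(tag(0.9), tag(0.1))  # -> signal background
```

The Waldo analogy maps onto the last line: the trained rule must be cheap enough to run on an enormous stream of events.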

But what if these programs could learn to recognize patterns on their own with only minimal guidance? One small step outside the LHC is a thriving multi-billion dollar industry doing just that.

“When I take a picture with my iPhone, it instantly interprets the thousands of pixels to identify people’s faces,” Elmer says. Companies like Facebook and Google are also incorporating more and more machine learning techniques to identify and catalogue information so that it is instantly accessible anywhere in the world.

Organizations such as Google, Facebook and Russia’s Yandex are releasing more and more tools as open source. Scientists in other disciplines, such as astronomy, are incorporating these tools into the way they do science. Cranmer hopes that high-energy physics will move to a model that makes it easier to take advantage of these new offerings as well.

“New software can expand the reach of what we can do at the LHC,” Cranmer says. “The potential is hard to guess.”

New Posts…For All You RSS Readers

The post New Posts…For All You RSS Readers appeared first on John Battelle's Search Blog.

I’ve been writing a lot at NewCo’s publication, and will continue to do so. But I want to make sure you folks know about that work, so here are links to a couple of  new pieces.

And The Award for the Best Marketing Execution At SXSW Goes To …

I went to SXSW again this year, and IBM really nailed their million-dollar activation.

Because Calling It “Profiting From The Financialization of Death” Won’t Make the Phones Ring

Pretty joints after midnight stuff, but man, I’m reading Rana’s new book and this one made my head spin.

Lastly, if you want to stay current on my work at NewCo, which is increasingly editorial in nature, sign up for the Daily newsletter. We’re also launching a Weekly version, for which I’ll be writing a regular column. Sign up here!

Why are particle accelerators so large?

CERN physicist Edda Gschwendtner explains why we need big machines to study tiny particles.

The Large Hadron Collider at CERN is a whopping 27 kilometers in circumference. Edda Gschwendtner, physicist and project leader for CERN’s plasma wakefield acceleration experiment (AWAKE), explains why scientists use such huge machines.

We can only see so much with the naked eye. To see things that are smaller, we use a microscope, and to see things that are further away, we use a telescope. The more powerful the tool, the more we can see.

Particle accelerators are tools that allow us to probe both the fundamental components of nature and the evolution and origin of all matter in the visible (and maybe even the invisible?) universe. The more powerful the accelerator, the further we can see into the infinitely small and the infinitely large.

You can think about particle accelerators like a racetrack for particles. Racecars don’t start out going 200 miles per hour—they must gradually accelerate over time on either a large circular racetrack or a long, straight road.

In physics, these two types of “tracks” are circular accelerators and linear accelerators.

Particles in circular accelerators gradually gain energy as they race through an accelerating structure at a certain position in the ring. For instance, the protons in the LHC make 11,000 laps every second for 20 minutes before they reach their collision energy. During their journey, magnets guide the particles around the bends in the accelerator and keep them on course.
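Those figures are easy to sanity-check: 11,000 laps per second around a ring of roughly 27 kilometers works out to very nearly the speed of light.

```python
# Sanity check on the numbers in the text: 11,000 laps per second
# around the ~27 km ring is essentially the speed of light.
circumference = 27e3                     # LHC circumference, m (approximate)
laps_per_second = 11_000
speed = circumference * laps_per_second  # proton speed, m/s

c = 299_792_458                          # speed of light, m/s
print(f"protons travel at {speed / c:.1%} of the speed of light")
```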

But just like a car on a curvy mountain road, the particles’ energy is limited by the curves in the accelerators. If the turns are too tight or the magnets are too weak, the particles will eventually fly off course.
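How tight a turn a particle can take is set by the magnets: for a singly charged particle, momentum, field and bending radius are tied together by p [GeV/c] ≈ 0.3 · B [tesla] · r [meters]. Plugging in LHC-scale numbers (a 7 TeV design beam and a bending radius of about 2.8 km, both assumed values here rather than figures from the article) shows why fields of several tesla are needed:

```python
# Toy version of the magnet constraint: for a singly charged particle,
# p [GeV/c] ~= 0.3 * B [tesla] * r [meters]. The 7 TeV beam momentum
# and ~2.8 km bending radius below are assumed LHC-scale values, not
# figures quoted in the article.
p_gev = 7000.0         # beam momentum, GeV/c
r = 2804.0             # dipole bending radius, m

B = p_gev / (0.3 * r)  # required dipole field, tesla
print(f"required field: {B:.1f} T")  # -> required field: 8.3 T
```

Doubling the beam energy at fixed radius doubles the required field, which is why higher energy means either bigger rings or stronger magnets.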

Linear accelerators don’t have this problem, but they face an equally challenging aspect: particles in linear accelerators only have the length of the track where they pass through accelerating structures to reach their desired energy. Once they reach the end, that’s it.

So if we want to look deeper into matter and further back toward the start of the universe, we have to go higher in energy, which means we need more powerful tools.

One option is to build larger accelerators—linear accelerators hundreds of miles long or giant circular accelerators with long, mellow turns.

We can also invest in our technology. We can develop accelerating-structure techniques to rapidly and effectively accelerate particles in linear accelerators over a short distance. We can also design and build incredibly strong magnets, stronger than anything that exists today, that can bend ultra-high-energy particles around the turns in circular accelerators.

Realistically, the future tools we use to look into the infinitely small and infinitely large will involve a combination of technological advancement and large-scale engineering to bring us closer to understanding the unknown.

Have a burning question about particle physics? Let us know via email or Twitter (using the hashtag #AskSymmetry). We might answer you in a future video!

CERN physicist Edda Gschwendtner explains why we need big machines to study tiny particles.

The Large Hadron Collider at CERN is a whopping 27 kilometers in circumference. Edda Gschwendtner, physicist and project leader for CERN’s plasma wakefield acceleration experiment (AWAKE), explains why scientists use such huge machines.

We can only see so much with the naked eye. To see things that are smaller, we use a microscope, and to see things that are further away, we use a telescope. The more powerful the tool, the more we can see.

Particle accelerators are tools that allow us probe both the fundamental components of nature and the evolution and origin of all matter in the visible (and maybe even the invisible?) universe. The more powerful the accelerator, the further we can see into the infinitely small and the infinitely large.

You can think about particle accelerators like a racetrack for particles. Racecars don’t start out going 200 miles per hour—they must gradually accelerate over time on either a large circular racetrack or a long, straight road.

In physics, these two types of “tracks” are circular accelerators and linear accelerators.

Particles in circular accelerators gradually gain energy as they race through an accelerating structure at a certain position in the ring. For instance, the protons in the LHC make 11,000 laps every second for 20 minutes before they reach their collision energy. During their journey, magnets guide the particles around the bends in the accelerator and keep them on course.

But just like a car on a curvy mountain road, the particles’ energy is limited by the curves in the accelerators. If the turns are too tight or the magnets are too weak, the particles will eventually fly off course.

Linear accelerators don’t have this problem, but they face an equally challenging aspect: particles in linear accelerators only have the length of the track where they pass through accelerating structures to reach their desired energy. Once they reach the end, that’s it.

So if we want to look deeper into matter and further back toward the start of the universe, we have to go higher in energy, which means we need more powerful tools.

One option is to build larger accelerators—linear accelerators hundreds of miles long or giant circular accelerators with long, mellow turns.

We can also invest in our technology. We can develop accelerating-structure techniques to rapidly and effectively accelerate particles in linear accelerators over a short distance. We can also design and build incredibly strong magnets—stronger than anything that exists today—that can bend ultra-high-energy particles around the turns in circular accelerators.

Realistically, the future tools we use to look into the infinitely small and infinitely large will involve a combination of technological advancement and large-scale engineering to bring us closer to understanding the unknown.

Have a burning question about particle physics? Let us know via email or Twitter (using the hashtag #AskSymmetry). We might answer you in a future video!

69 The Passion of the Christ vs. The Last Temptation of Christ

Devin and Amy dive into a “versus” episode following last week’s Christ theme. Will Martin Scorsese’s non-traditional take on Christ’s life defeat Jim Caviezel’s magical eyes? Tune in, and head over to the Earwolf forums to cast your vote.

Joe Palca: 175 Riverside Drive

A series of incidents propels Joe Palca to a career in sleep research.

Joe Palca is a science correspondent for NPR. He comes to journalism from a science background, having received a Ph.D. in psychology from the University of California at Santa Cruz where he worked on human sleep physiology. Since joining NPR in 1992, Dr. Palca has covered a range of science topics — everything from biomedical research to astronomy. He is currently focused on the eponymous series, “Joe’s Big Idea.” Stories in the series explore the minds and motivations of scientists and inventors. Palca has also worked as a television science producer, a senior correspondent for Science Magazine, and Washington news editor of Nature. Palca has won numerous awards, several of which came with attractive certificates. With Flora Lichtman, Palca is the co-author of Annoying: The Science of What Bugs Us (Wiley, 2011).

Bump Watch 2016

A bump in the LHC data has physicists electrified…but what does it mean?

In December, the ATLAS and CMS experiments presented a sneak peek of the new data collected during the first few months of the Large Hadron Collider’s enormously energetic second run. Both experiments reported a small excess of photon pairs with a combined mass around 750 GeV. This small excess could be the first hint of a new massive particle that spits out two photons as it decays, or it might be a coincidental fluctuation that will disappear with more information.

Now, physicists are presenting their latest analyses at the Moriond conference in La Thuile, Italy, including a full investigation of this mysterious bump. After carefully checking, cross-checking and rechecking the data, both experiments have come to the same conclusion—the bump is still there.

“We’ve re-calibrated our data and made several improvements to our analyses,” says Livia Soffi, a postdoc at Cornell University. “These are the best, most refined results we have. But we’re still working with the same amount of data we collected in 2015. At this point, only more data could make a significant difference in our ongoing research.”

LHC physicists wield a myriad of powerful tools to investigate the mysteries of the universe: a 17-mile-long particle accelerator; huge and intricate particle detectors; a worldwide network of computing centers. But from all these resources, there’s one tool that can make or break any potential discovery—statistics.

In 2015, LHC scientists recorded data from 20 trillion proton-proton collisions. A few tens of thousands of these collisions produced a clean, high-energy pair of photons. Around 1200 of these photon pairs have a combined energy of 125 GeV (scientists now know that Higgs bosons produced about 100 of them; the other 1100 came from ordinary, well-known processes). Moving toward higher energies, the spectrum fluctuates more and more as fewer and fewer pairs are recorded. At around 750 GeV, scientists observed only a few dozen photon pairs—a handful more than predicted.

But whether this extra handful is evidence of a new particle or just another normal statistical fluctuation is essentially a coin toss.
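The ambiguity comes down to Poisson counting statistics. A minimal sketch, with hypothetical counts in the spirit of "a few dozen, and a handful more than predicted" (not the actual ATLAS or CMS numbers), shows how plausible such an excess is from background alone:

```python
# Toy illustration of why a small excess is ambiguous: suppose the background
# predicts ~30 photon pairs in a bin and 40 are observed (hypothetical numbers).
import math

def poisson_p_value(observed: int, expected: float) -> float:
    """Probability of seeing `observed` or more counts from a Poisson background."""
    # P(N >= observed) = 1 - sum_{k < observed} e^(-mu) * mu^k / k!
    cdf = sum(math.exp(-expected) * expected**k / math.factorial(k)
              for k in range(observed))
    return 1.0 - cdf

p = poisson_p_value(40, 30.0)
print(f"p-value for 40 observed with 30 expected: {p:.3f}")
```

The result is a few percent: unlikely for any single bin, but far from conclusive, especially since experiments inspect many bins at once and some are bound to fluctuate upward (the "look-elsewhere effect").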

“In physics we sometimes see excesses due to statistical fluctuations as we go up to higher and higher energies,” says Massimiliano Bellomo, a postdoc at the University of Massachusetts, Amherst. “We’re currently at the very edge of our sensitivity and cannot confirm or exclude any of the bumps we’re watching until we have much more data.”

On its own, one small bump means nothing. But excitement builds when independent experiments start to see the same bump popping up over and over again.

“I saw this fluctuation while doing my PhD thesis with CMS data from Run 1 and didn’t think anything of it,” Soffi says. “Now CMS and ATLAS have both seen it again in the new data. This could easily be a coincidence. But if it keeps showing up, then we might have something.”

For their new analysis, the CMS experiment incorporated about 20 percent more data—data recorded during Run 2 while the CMS magnet was turned off. They also recalculated the energies of particles recorded by the detector using more refined calibrations. After integrating these two improvements into their analysis, CMS is still seeing the bump, and it's slightly more pronounced than before.

ATLAS scientists are also delving deeper into this mystery. This week they re-examined data collected during the first run of the LHC to see if this recalcitrant bump would make yet another appearance. Scientists performed two independent searches, each employing a slightly different method to classify and separate the photon pairs. In one analysis, scientists again saw a small excess of photon pairs at 750 GeV. But in the other, they saw nothing out of the ordinary. A further investigation of the 13 TeV data from 2015 shows that the bump is still there, but still not statistically significant.

“The bottom line is that we can’t say anything definitive until we have more data,” says Beate Heinemann, a researcher at the US Department of Energy’s Lawrence Berkeley National Laboratory and deputy spokesperson for the ATLAS experiment. “Therefore we are now focusing on getting ready to record and analyze the large amount of data that the LHC will deliver this year.”

Even with the limited amount of data, physicists are already looking ahead and speculating what this little bump could be if it grows over time. A popular and emerging theory is that it could be the first glimpse of a heavier cousin of the Higgs boson.

“We have discovered one Higgs boson with a mass of 125 GeV,” says Andrei Gritsan, a professor of physics at Johns Hopkins University. “We are trying to understand this boson deeper, but at the same time we are looking for other possible Higgs bosons with higher masses. We are excited about the excess at 750 GeV in one decay channel, but we need to establish if this signal is real and if it appears anywhere else before we can say something about it. All we can do today is hypothesize and speculate.”

If this bump is early evidence of a new boson, theorists predict that it will transform into a wide assortment of particles—not just two photons. For instance, a heavier cousin of the Higgs boson would likely behave like the known 125 GeV Higgs boson, which spits out a pair of Z bosons, a pair of W bosons, or a pair of photons when it decays.

Experimentalists recently grouped Z boson and W boson pairs with approximately the same energy and compared the number of pairs-per-group with the predicted totals, which were generated by thousands of computer simulations.
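The bin-and-compare approach described above can be sketched in a few lines. The event masses and expected totals here are made up for illustration; real analyses use far finer binning and full detector simulation:

```python
# Sketch of binning reconstructed pair masses and comparing each bin's count
# to a simulated expectation. All numbers below are hypothetical.
from collections import Counter

def bin_counts(pair_masses_gev, bin_width=50):
    """Histogram pair masses into fixed-width bins (bin label = lower edge in GeV)."""
    return Counter(int(m // bin_width) * bin_width for m in pair_masses_gev)

observed = bin_counts([705, 730, 748, 752, 760, 790, 810])  # hypothetical events
expected = {700: 3.1, 750: 2.2, 800: 1.4}                   # hypothetical simulated totals

for edge, exp in sorted(expected.items()):
    obs = observed.get(edge, 0)
    print(f"{edge}-{edge + 50} GeV: observed {obs}, expected {exp}")
```

An excess shows up as a bin whose observed count sits persistently above the simulated expectation across independent channels and datasets.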

“Evidence of more pairs than predicted materializing yet again at 750 GeV would be exciting,” says Jim Olsen, a professor of physics at Princeton University. “And then the question would be if and how all the observations are related.”

But after performing these analyses, scientists are reporting that they haven’t observed anything out of the ordinary in any other channels. Scientists are also mapping the energies of Z bosons paired with a photon, a channel that theorists predict could be a goldmine for new physical phenomena. But the first results also show no anomalies.

Physicists are at the very beginning of LHC Run 2 and have collected about one-tenth as much data as they did during Run 1. The new data comes from collisions that are 1.6 times more energetic than Run 1 collisions and opens up a new energy regime that was not previously accessible. But physicists need time for the data to accumulate.

“We’re expecting to get a factor of 30 times more data over the course of the next three years, which will allow us to probe this higher mass range better,” says Bellomo. “We will resume collecting data in late spring and should be able to say a lot more about this bump and other searches by the end of the summer.”
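Why 30 times more data matters follows from a rough rule of thumb for counting experiments: if signal and background both scale with the amount of data, the statistical significance grows like the square root of that amount. The starting significance below is a hypothetical placeholder, not a quoted ATLAS or CMS figure:

```python
# Rule of thumb: significance s/sqrt(b) of a real excess grows like
# sqrt(data). The 2-sigma starting point is hypothetical.
import math

def projected_significance(current_sigma: float, data_factor: float) -> float:
    """Scale a significance by sqrt of the increase in data, assuming the excess is real."""
    return current_sigma * math.sqrt(data_factor)

print(projected_significance(2.0, 30))  # a ~2-sigma hint would grow toward ~11 sigma
```

If instead the bump is a fluctuation, more data will wash it out rather than amplify it, which is why both outcomes are decisive.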
