50 years ago, engineers tried catching commercial planes in nets

A gigantic emergency arresting gear system, capable of stopping the largest four-engined jet aircraft without discomfort to passengers, is being developed for the French Ministry of Transportation. The system consists of a nylon net … which engages the aircraft for the full width of its wingspan. Net and airplane are brought to a slow stop by energy absorbing devices located along the sides of the runway. — Science News, September 28, 1967
Catching commercial airliners in giant nets never took off. However, aircraft carriers have deployed nets since 1931 for emergency landings. In modern versions, nets are linked to energy-absorbers below deck to help bring a plane to a safe stop. Today’s net systems are a big improvement over the original barricade: Aviation pioneer Eugene Ely first landed an airplane on a ship, the USS Pennsylvania, in 1911. His landing relied on sandbag-secured ropes across the deck plus a canvas awning between the plane and the sea.
Editor’s note: This story was corrected on November 6, 2017. The nets used on the aircraft carriers to arrest airplanes were not made of nylon until after nylon became available in 1935.

Oldest traces of a dysentery-causing parasite were found in ancient toilets

Giardia has plagued people for a long time.

The parasite can bring about dysentery — a miserable (and occasionally deadly) mixture of diarrhea, cramps and fever. Scientists have now uncovered traces of the giardia parasite in the remains of two roughly 2,600-year-old toilets once used by the wealthy denizens of Jerusalem. The remains are the oldest known biological evidence of giardia anywhere in the world, researchers report May 25 in Parasitology.

The single-cell parasite Giardia duodenalis can be found today in human guts around the planet. This wasn’t always the case — but working out how pathogens made their debut and moved around is no easy feat (SN: 2/2/22). While some intestinal parasites can be preserved for centuries in the ground, others, like giardia, quickly disintegrate and can’t be spotted under a microscope.
In 1991 and 2019, archaeologists working at two sites in Jerusalem came across stone toilet seats in the remains of mansionlike homes. These “were quite posh toilets” used by “swanky people,” says Piers Mitchell, a paleoparasitologist at the University of Cambridge.

The researchers who originally excavated the soil beneath these toilet seats glimpsed traces of roundworm and other possible intestinal parasites in samples put under a microscope. Mitchell and his colleagues built on that analysis by using antibodies to search for the remains of giardia and two other fragile parasites in the millennia-old decomposed feces under both seats.

There was “plenty of doubt” that giardia was around in Jerusalem at the time because it’s so hard to reconstruct the movement of ancient disease, Mitchell says.

But the find hints that it was a regular presence in the region, says Matthieu Le Bailly, a paleoparasitologist at the University of Bourgogne Franche-Comté in Besançon, France, who was not involved in the study.

The idea that a pathogen like giardia, which spreads via contaminated water and sometimes flies, existed and was possibly widespread in ancient Jerusalem makes a lot of sense, Mitchell says, given the hot, dry, insect-ridden climate around the Iron Age city.

These ants build tall nest hills to help show the way home

Some ants have figured out how to keep from getting lost: Build taller anthills.

Desert ants that live in the hot, flat salt pans of Tunisia spend their days looking for food. Successful grocery runs can take the insects as far as 1.1 kilometers from their nests. So some of these ants build towering hills over their nests that serve as a landmark to guide the way home, researchers report in the July 10 Current Biology.
“I am surprised and fascinated that ants have visual acuity at the distances implied in this work,” says ecologist Judith Bronstein of the University of Arizona in Tucson, who wasn’t involved in the new study. It “also implies that ants regularly assess the complexity of their local habitat and change their decisions based on what they conclude about it.”

Desert ants (Cataglyphis spp.) use a navigation system called path integration, relying on the sun’s position and counting their steps to keep track of where they are relative to their nest (SN: 1/19/17). But this system becomes increasingly unreliable as distance from the nest increases. Like other types of ants, desert ants also rely more generally on sight and smell. But the vast, almost featureless salt pans look nearly the same in every direction.
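Path integration is essentially dead reckoning: keep a running tally of every outbound leg, and the negative of that tally always points home. As a rough illustration only (the function and the numbers are hypothetical, not taken from the study), the bookkeeping can be sketched in a few lines of Python:

```python
import math

def integrate_path(legs):
    """Accumulate a home vector from (heading_radians, n_steps) legs.

    Dead reckoning as desert ants are thought to do it: a compass
    bearing (from the sun's position) plus a step count (an internal
    odometer) for each leg of the outbound trip.
    """
    x = y = 0.0
    for heading, n_steps in legs:
        x += n_steps * math.cos(heading)
        y += n_steps * math.sin(heading)
    # The home vector is the exact opposite of the net displacement.
    distance = math.hypot(x, y)
    home_heading = math.atan2(-y, -x)
    return distance, home_heading

# Hypothetical trip: 100 steps east, then 100 steps north.
dist, bearing = integrate_path([(0.0, 100), (math.pi / 2, 100)])
```

For that trip the home vector is about 141 steps long, pointing southwest. The unreliability the article describes falls out of this scheme naturally: small errors in each heading and step estimate accumulate with every leg, so the home vector drifts further from the truth the longer the trip.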

“We realized that, whenever the ants in salt pans came closer to their nest, they suddenly pinpointed the nest hill … from several meters distance,” says Markus Knaden, a neuroethologist at the Max Planck Institute for Chemical Ecology in Jena, Germany. “This made us think that the hill functions as a nest-defining landmark.”

So Knaden and colleagues captured ants (C. fortis) from nests in the middle of salt pans and from along their shorelines. Only nests from the salt pan interiors had distinct hills, which can be up to 40 centimeters tall, whereas the hills on shoreline nests were lower or barely noticeable.
Next, the team removed any hills and placed the captured insects some distance away from their nests. Ants from the salt pans’ interiors struggled more than shore ants to find home. Since the shore ants were adept at using the shoreline for guidance, they weren’t as affected by the hill removal, the researchers conclude.

The team wanted to know if the ants were deliberately building a taller hill when their surroundings lacked any visible landmark. So, the researchers removed the hills of 16 salt pan nests and installed two 50-centimeter-tall black cylinders apiece near eight of them. The other eight nests were left without any artificial visual aid.
After three days, the researchers found that ants from seven of the unaided nests had rebuilt their hills. But ants from only two of the nests with cylinders had bothered to rebuild.

“These desert ants already told us about path integration and step counting for orientation…. But this business of building your own visual landmark, incredible,” says entomologist John Longino of the University of Utah in Salt Lake City, who wasn’t involved in the research. “Are they sitting down to a council meeting to decide whether they need a bigger landmark? Is this somehow an evolved behavior in this one desert ant species?”

For now, it’s unclear how the ants decide to build, or not to build, a hill. Interestingly, nest building is usually performed by younger ants that are not yet foragers, Knaden says, and so have never experienced the difficulty of finding a nest without a hill. That means there must be an exchange of information between the veteran foraging ants and their novice nest mates, he says.

Bronstein also wonders about the risks of building the taller structures. Such risks “are implied by the fact that the ants don’t build such a structure where it isn’t needed,” she says. But, “for instance, isn’t it a clear cue to ant predators that food can be found there?”

Here’s the real story on jellyfish taking over the world

Jellyfish have gotten a bad rap. In recent years, concerns about rising jellyfish populations in some parts of the world have mushroomed into headlines like “Meet your new jellyfish overlords.” These floating menaces are taking over the world’s oceans thanks to climate change and ocean acidification, the thinking goes, and soon waters will be filled with little more than the animals’ pulsating goo.

It’s a vivid and frightening image, but researchers aren’t at all certain that it’s true. In her new book, Spineless, former marine scientist Juli Berwald sets out to find the truth about the jellyfish takeover. In the process, she shares much more about these fascinating creatures than merely their numbers.
Among a few of the amazing jellyfish facts and tales throughout the book: Jellyfish have astoundingly complex vision for animals without a brain. They are also the most efficient swimmers ever studied, among the most ancient animals surviving on Earth today and some of the most toxic sea creatures (SN: 9/6/14, p. 16).

Rather than merely reciting these facts, Berwald takes readers on a personal journey, tracing how life pulled her away from science after she earned her Ph.D. — and how jellies brought her back. Through the tale of her experiments with a home jellyfish aquarium, she explains jelly biology, from the amazing shape-shifting properties of the mesoglea that forms a jellyfish’s bulk to why so many species are transparent. As she juggles family life with interviews with the world’s leading jellyfish researchers, Berwald also documents her travels to places around the globe where jellyfish and humans intersect, such as Israel’s coral reefs and Japan’s fisheries.
The answer to the question of whether jellyfish populations are on the rise ultimately lies at this intersection, Berwald finds. Marine scientists are split on whether populations are increasing globally. It depends on which data you include, and it’s possible that jellyfish numbers fluctuate naturally on a 20-year cycle. What is clear is that in coastal areas around the world, people have unwittingly created spawning grounds for huge numbers of jellyfish simply by building docks and other structures that quickly multiplying jellyfish polyps can attach to.

In the end, Berwald says, jellyfish became a “vehicle for me to explore the threats to the ocean’s future. They’re a way to start a conversation about things that can seem boring and abstract — acidification, warming, overfishing and coastal development — but that are changing our oceans in fundamental ways.” And that’s more interesting than an ocean full of goo.

New camera on Palomar telescope will seek out supernovas, asteroids and more

A new eye on the variable sky just opened. The Zwicky Transient Facility, a robotic camera designed to rapidly scan the sky nightly for objects that move, flash or explode, took its first image on November 1.

The camera, mounted on a telescope at Caltech’s Palomar Observatory near San Diego, succeeds the Palomar Transient Factory. Between 2009 and 2017, the Palomar Transient Factory caught two separate supernovas hours after they exploded, one in 2011 (SN: 9/24/11, p. 5) and one earlier this year (SN: 2/13/17). It also found the longest-lasting supernova ever, from a star that seems to explode over and over (SN: 11/8/17).

The Zwicky survey will spot similar short-lived events and other cosmic blips, like stars being devoured by black holes (SN: 4/1/17, p. 5), as well as asteroids and comets. But Zwicky will work much faster than its predecessor: It will operate 10 times as fast, cover seven times as much of the sky in a single image and take 2.5 times as many exposures each night. Computers will search the images for any astronomical object that changes from one scan to the next.

The camera is named for Caltech astronomer Fritz Zwicky, who first used the term “supernova” in 1931 to describe the explosions that mark a star’s death (SN: 10/24/13).

Simulating the universe using Einstein’s theory of gravity may solve cosmic puzzles

If the universe were a soup, it would be more of a chunky minestrone than a silky-smooth tomato bisque.

Sprinkled with matter that clumps together due to the insatiable pull of gravity, the universe is a network of dense galaxy clusters and filaments — the hearty beans and vegetables of the cosmic stew. Meanwhile, relatively desolate pockets of the cosmos, known as voids, make up a thin, watery broth in between.

Until recently, simulations of the cosmos’s history haven’t given the lumps their due. The physics of those lumps is described by general relativity, Albert Einstein’s theory of gravity. But that theory’s equations are devilishly complicated to solve. To simulate how the universe’s clumps grow and change, scientists have fallen back on approximations, such as the simpler but less accurate theory of gravity devised by Isaac Newton.
Relying on such approximations, some physicists suggest, could be mucking with measurements, resulting in a not-quite-right inventory of the cosmos’s contents. A rogue band of physicists suggests that a proper accounting of the universe’s clumps could explain one of the deepest mysteries in physics: Why is the universe expanding at an increasingly rapid rate?

The accepted explanation for that accelerating expansion is an invisible pressure called dark energy. In the standard theory of the universe, dark energy makes up about 70 percent of the universe’s “stuff” — its matter and energy. Yet scientists still aren’t sure what dark energy is, and finding its source is one of the most vexing problems of cosmology.

Perhaps, the dark energy doubters suggest, the speeding up of the expansion has nothing to do with dark energy. Instead, the universe’s clumpiness may be mimicking the presence of such an ethereal phenomenon.
Most physicists, however, feel that proper accounting for the clumps won’t have such a drastic impact. Robert Wald of the University of Chicago, an expert in general relativity, says that lumpiness is “never going to contribute anything that looks like dark energy.” So far, observations of the universe have been remarkably consistent with predictions based on simulations that rely on approximations.
As observations become more detailed, though, even slight inaccuracies in simulations could become troublesome. Already, astronomers are charting wide swaths of the sky in great detail, and planning more extensive surveys. To translate telescope images of starry skies into estimates of properties such as the amount of matter in the universe, scientists need accurate simulations of the cosmos’s history. If the detailed physics of clumps is important, then simulations could go slightly astray, sending estimates off-kilter. Some scientists already suggest that the lumpiness is behind a puzzling mismatch of two estimates of how fast the universe is expanding.

Researchers are attempting to clear up the debate by conquering the complexities of general relativity and simulating the cosmos in its full, lumpy glory. “That is really the new frontier,” says cosmologist Sabino Matarrese of the University of Padua in Italy, “something that until a few years ago was considered to be science fiction.” In the past, he says, scientists didn’t have the tools to complete such simulations. Now researchers are sorting out the implications of the first published results of the new simulations. So far, dark energy hasn’t been explained away, but some simulations suggest that certain especially sensitive measurements of how light is bent by matter in the universe might be off by as much as 10 percent.

Soon, simulations may finally answer the question: How much do lumps matter? The idea that cosmologists might have been missing a simple answer to a central problem of cosmology incessantly nags some skeptics. For them, results of the improved simulations can’t come soon enough. “It haunts me. I can’t let it go,” says cosmologist Rocky Kolb of the University of Chicago.

Smooth universe
By observing light from different eras in the history of the cosmos, cosmologists can compute the properties of the universe, such as its age and expansion rate. But to do this, researchers need a model, or framework, that describes the universe’s contents and how those ingredients evolve over time. Using this framework, cosmologists can perform computer simulations of the universe to make predictions that can be compared with actual observations.
After Einstein introduced his theory in 1915, physicists set about figuring out how to use it to explain the universe. It wasn’t easy, thanks to general relativity’s unwieldy, difficult-to-solve suite of equations. Meanwhile, observations made in the 1920s indicated that the universe wasn’t static as previously expected; it was expanding. Eventually, researchers converged on a solution to Einstein’s equations known as the Friedmann-Lemaître-Robertson-Walker metric. Named after its discoverers, the FLRW metric describes a simplified universe that is homogeneous and isotropic, meaning that it appears identical at every point in the universe and in every direction. In this idealized cosmos, matter would be evenly distributed, no clumps. Such a smooth universe would expand or contract over time.
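For readers who want the equation behind this smooth-universe picture: in the FLRW framework the expansion rate follows the Friedmann equation, given here in its standard textbook form (it is not quoted in the article):

```latex
% Friedmann equation for a homogeneous, isotropic (FLRW) universe.
% H is the expansion rate, a(t) the scale factor, \rho the average
% matter-energy density, k the spatial curvature, and \Lambda the
% cosmological constant.
H^2 \equiv \left(\frac{\dot a}{a}\right)^2
    = \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^2}{a^2} \;+\; \frac{\Lambda c^2}{3}
```

The key simplification is that a single average density \(\rho\) stands in for the whole universe; the \(\Lambda\) term is where dark energy enters the standard recipe described below. The backreaction debate is, in essence, about whether replacing the real, lumpy density field with this one average number is accurate enough.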
A smooth-universe approximation is sensible, because when we look at the big picture, averaging over the structures of galaxy clusters and voids, the universe is remarkably uniform. It’s similar to the way that a single spoonful of minestrone soup might be mostly broth or mostly beans, but from bowl to bowl, the overall bean-to-broth ratios match.

In 1998, cosmologists revealed that not only was the universe expanding, but its expansion was also accelerating (SN: 2/2/08, p. 74). Observations of distant exploding stars, or supernovas, indicated that the space between us and them was expanding at an increasing clip. But gravity should slow the expansion of a universe evenly filled with matter. To account for the observed acceleration, scientists needed another ingredient, one that would speed up the expansion. So they added dark energy to their smooth-universe framework.

Now, many cosmologists follow a basic recipe to simulate the universe — treating the cosmos as if it has been run through an imaginary blender to smooth out its lumps, adding dark energy and calculating the expansion via general relativity. On top of the expanding slurry, scientists add clumps and track their growth using approximations, such as Newtonian gravity, which simplifies the calculations.

In most situations, Newtonian gravity and general relativity are near-twins. Throw a ball while standing on the surface of the Earth, and it doesn’t matter whether you use general relativity or Newtonian mechanics to calculate where the ball will land — you’ll get the same answer. But there are subtle differences. In Newtonian gravity, matter directly attracts other matter. In general relativity, gravity is the result of matter and energy warping spacetime, creating curves that alter the motion of objects (SN: 10/17/15, p. 16). The two theories diverge in extreme gravitational environments. In general relativity, for example, hulking black holes produce inescapable pits that reel in light and matter (SN: 5/31/14, p. 16). The question, then, is whether the difference between the two theories has any impact in lumpy-universe simulations.

Most cosmologists are comfortable with the status quo simulations because observations of the heavens seem to fit neatly together like interlocking jigsaw puzzle pieces. Predictions based on the standard framework agree remarkably well with observations of the cosmic microwave background — ancient light released when the universe was just 380,000 years old (SN: 3/21/15, p. 7). And measurements of cosmological parameters — the fraction of dark energy and matter, for example — are generally consistent, whether they are made using the light from galaxies or the cosmic microwave background.

However, the reliance on Newton’s outdated theory irks some cosmologists, creating a lingering suspicion that the approximation is causing unrecognized problems. And some cosmological question marks remain. Physicists still puzzle over what makes up dark energy, along with another unexplained cosmic constituent, dark matter, an additional kind of mass that must exist to explain observations of how galaxies rotate and how they move within clusters. “Both dark energy and dark matter are a bit of an embarrassment to cosmologists, because they have no idea what they are,” says cosmologist Nick Kaiser of École Normale Supérieure in Paris.
Dethroning dark energy
Some cosmologists hope to explain the universe’s accelerating expansion by fully accounting for the universe’s lumpiness, with no need for the mysterious dark energy.

These researchers argue that clumps of matter can alter how the universe expands, when the clumps’ influence is tallied up over wide swaths of the cosmos. That’s because, in general relativity, the expansion of each local region of space depends on how much matter is within. Voids expand faster than average; dense regions expand more slowly. Because the universe is mostly made up of voids, this effect could produce an overall expansion and potentially an acceleration. Known as backreaction, this idea has lingered in obscure corners of physics departments for decades, despite many claims that backreaction’s effect is small or nonexistent.

Backreaction continues to appeal to some researchers because they don’t have to invent new laws of physics to explain the acceleration of the universe. “If there is an alternative which is based only upon traditional physics, why throw that away completely?” Matarrese asks.

Most cosmologists, however, think explaining away dark energy based solely on the universe’s lumps is unlikely. Previous calculations have indicated that any effect would be too small to account for dark energy and would produce an acceleration that changes over time in a way that disagrees with observations.

“My personal view is that it’s a much smaller effect,” says astrophysicist Hayley Macpherson of Monash University in Melbourne, Australia. “That’s just basically a gut feeling.” Theories that include dark energy explain the universe extremely well, she points out. How could that be if the whole approach is flawed?

New simulations by Macpherson and others that model how lumps evolve in general relativity may be able to gauge the importance of backreaction once and for all. “Up until now, it’s just been too hard,” says cosmologist Tom Giblin of Kenyon College in Gambier, Ohio.

To perform the simulations, researchers needed to get their hands on supercomputers capable of grinding through the equations of general relativity as the simulated universe evolves over time. Because general relativity is so complex, such simulations are much more challenging than those that use approximations, such as Newtonian gravity. But, a seemingly distinct topic helped lay some of the groundwork: gravitational waves, or ripples in the fabric of spacetime.
The Advanced Laser Interferometer Gravitational-Wave Observatory, LIGO, searches for the tremors of cosmic dustups such as colliding black holes (SN: 10/28/17, p. 8). In preparation for this search, physicists honed their general relativity skills on simulations of the spacetime storm kicked up by black holes, predicting what LIGO might see and building up the computational machinery to solve the equations of general relativity. Now, cosmologists have adapted those techniques and unleashed them on entire, lumpy universes.

The first lumpy universe simulations to use full general relativity were unveiled in the June 2016 Physical Review Letters. Giblin and colleagues reported their results simultaneously with Eloisa Bentivegna of the University of Catania in Italy and Marco Bruni of the University of Portsmouth in England.

So far, the simulations have not been able to account for the universe’s acceleration. “Nearly everybody is convinced [the effect] is too small to explain away the need for dark energy,” says cosmologist Martin Kunz of the University of Geneva. Kunz and colleagues reached the same conclusion in their lumpy-universe simulations, which have one foot in general relativity and one in Newtonian gravity. They reported their first results in Nature Physics in March 2016.

Backreaction aficionados still aren’t dissuaded. “Before saying the effect is too small to be relevant, I would, frankly, wait a little bit more,” Matarrese says. And the new simulations have potential caveats. For example, some simulated universes behave like an old arcade game — if you walk to one edge of the universe, you cross back over to the other side, like Pac-Man exiting the right side of the screen and reappearing on the left. That geometry would suppress the effects of backreaction in the simulation, says Thomas Buchert of the University of Lyon in France. “This is a good beginning,” he says, but there is more work to do on the simulations. “We are in infancy.”

Different assumptions in a simulation can lead to disparate results, Bentivegna says. As a result, she doesn’t think that her lumpy, general-relativistic simulations have fully closed the door on efforts to dethrone dark energy. For example, tricks of light might be making it seem like the universe’s expansion is accelerating, when in fact it isn’t.

When astronomers observe far-away sources like supernovas, the light has to travel past all of the lumps of matter between the source and Earth. That journey could make it look like there’s an acceleration when none exists. “It’s an optical illusion,” Bentivegna says. She and colleagues see such an effect in a simulation reported in March in the Journal of Cosmology and Astroparticle Physics. But, she notes, this work simulated an unusual universe, in which matter sits on a grid — not a particularly realistic scenario.

For most other simulations, the effect of optical illusions remains small. That leaves many cosmologists, including Giblin, even more skeptical of the possibility of explaining away dark energy: “I feel a little like a downer,” he admits.
Surveying the skies
Subtle effects of lumps could still be important. In Hans Christian Andersen’s “The Princess and the Pea,” the princess felt a tiny pea beneath an impossibly tall stack of mattresses. Likewise, cosmologists’ surveys are now so sensitive that even if the universe’s lumps have a small impact, estimates could be thrown out of whack.

The Dark Energy Survey, for example, has charted 26 million galaxies using the Victor M. Blanco Telescope in Chile, measuring how the light from those galaxies is distorted by the intervening matter on the journey to Earth. In a set of papers posted online August 4 at arXiv.org, scientists with the Dark Energy Survey reported new measurements of the universe’s properties, including the amount of matter (both dark and normal) and how clumpy that matter is (SN: 9/2/17, p. 32). The results are consistent with those from the cosmic microwave background — light emitted billions of years earlier.

To make the comparison, cosmologists took the measurements from the cosmic microwave background, early in the universe, and used simulations to extrapolate to what galaxies should look like later in the universe’s history. It’s like taking a baby’s photograph, precisely computing the number and size of wrinkles that should emerge as the child ages and finding that your picture agrees with a snapshot taken decades later. The matching results so far confirm cosmologists’ standard picture of the universe — dark energy and all.

“So far, it has not yet been important for the measurements that we’ve made to actually include general relativity in those simulations,” says Risa Wechsler, a cosmologist at Stanford University and a founding member of the Dark Energy Survey. But, she says, for future measurements, “these effects could become more important.” Cosmologists are edging closer to Princess and the Pea territory.

Those future surveys include the Dark Energy Spectroscopic Instrument, DESI, set to kick off in 2019 at Kitt Peak National Observatory near Tucson; the European Space Agency’s Euclid satellite, launching in 2021; and the Large Synoptic Survey Telescope in Chile, which is set to begin collecting data in 2023.

If cosmologists keep relying on simulations that don’t use general relativity to account for lumps, certain kinds of measurements of weak lensing — the bending of light due to matter acting like a lens — could be off by up to 10 percent, Giblin and colleagues reported at arXiv.org in July. “There is something that we’ve been ignoring by making approximations,” he says.

That 10 percent could screw up all kinds of estimates, from how dark energy changes over the universe’s history to how fast the universe is currently expanding, to the calculations of the masses of ethereal particles known as neutrinos. “You have to be extremely certain that you don’t get some subtle effect that gets you the wrong answers,” Geneva’s Kunz says, “otherwise the particle physicists are going to be very angry with the cosmologists.”

Some estimates may already be showing problem signs, such as the conflicting estimates of the cosmic expansion rate (SN: 8/6/16, p. 10). Using the cosmic microwave background, cosmologists find a slower expansion rate than they do from measurements of supernovas. If this discrepancy is real, it could indicate that dark energy changes over time. But before jumping to that conclusion, there are other possible causes to rule out, including the universe’s lumps.

Until the issue of lumps is smoothed out, scientists won’t know how much lumpiness matters to the cosmos at large. “I think it’s rather likely that it will turn out to be an important effect,” Kolb says. Whether it explains away dark energy is less certain. “I want to know the answer so I can get on with my life.”

50 years ago, folate deficiency was linked to birth defects

Pregnant women who do not have enough folic acid — a B vitamin — in their bodies can pass the deficiency on to their unborn children. It may lead to retarded growth and congenital malformation, according to Dr. A. Leonard Luhby…. “Folic acid deficiency in pregnant women could well constitute a public health problem of dimensions we have not originally recognized,” he says. — Science News. December 9, 1967

Update
Folic acid — or folate — can prevent brain and spinal cord defects in developing fetuses. Since the U.S. Food and Drug Administration required that all enriched grain products contain the vitamin starting in 1998, birth defects have been prevented in about 1,300 babies each year. But some women still don’t get enough folate, while others may be overdoing it. About 10 percent of women may ingest more than the upper limit of 1,000 micrograms daily — about 2.5 times the recommended amount, a 2011 study found. Too much folate may increase a woman’s risk for certain cancers and interfere with some epilepsy drugs.

Why science still can’t pinpoint a mass shooter in the making

Immediately after a 19-year-old shot and killed 17 people and wounded 17 others at a Florida high school on Valentine’s Day, people leaped to explain what had caused the latest mass slaughter.

By now, it’s a familiar drill: Too many readily available guns. Too much untreated mental illness. Too much warped masculinity. Don’t forget those shoot-’em-up video games and movies. Add (or repeat, with voice raised) your own favorite here.

Now the national debate has received an invigorated dose of activism. Inspired by students from the targeted Florida high school, as many as 500,000 people are expected to rally against gun violence and in favor of stricter gun laws on March 24 in Washington, D.C., with sister marches taking place in cities across the world. But a big problem haunts the justifiable outrage over massacres of innocents going about their daily affairs: Whatever we think we know about school shootings, or mass public shootings in general, is either sheer speculation or wrong. A science of mass shootings doesn’t exist.

“There is little good research on what are probably a host of problems contributing to mass violence,” says criminologist Grant Duwe of the Minnesota Department of Corrections in St. Paul. Duwe has spent more than two decades combing through federal crime records and newspaper accounts to track trends in mass killings.
Perhaps this dearth of data is no surprise. Research on any kind of gun violence gets little federal funding (SN Online: 3/9/18; SN: 5/14/16, p. 16). Criminologist James Alan Fox of Northeastern University in Boston has argued for more than 20 years that crime researchers mostly ignore mass shootings. Some of these researchers assume that whatever causes people to commit any form of murder explains mass shootings. Others regard mass killings as driven by severe mental disorders, thus falling outside the realm of crime studies.

When a research vacuum on a matter of public safety meets a 24-hour news cycle juiced up on national anguish, a thousand speculations bloom. “Everybody’s an expert on this issue, but we’re relying on anecdotes,” says sociologist Michael Rocque of Bates College in Lewiston, Maine.

Rocque and Duwe published a review of what’s known about reasons for mass public shootings, sometimes called rampage shootings, in the February Current Opinion in Psychology. Their conclusion: not much. Scientific ignorance on this issue is especially concerning given that Rocque and Duwe describe a slight, but not unprecedented, recent uptick in the national rate of rampage shootings.
Shooting stats
Defining mass public shootings to track their frequency is tricky. A consensus among researchers is emerging that these events occur in public places, include at least four people killed by gunshots within a 24-hour period and are not part of a robbery or any other separate crime, Rocque and Duwe say. Such incidents include workplace and school shootings.
Overall, mass public shootings are rare, Duwe says, though intense media coverage may suggest the opposite. Even less obvious is that rampage shootings have been occurring for at least 100 years.

Using Federal Bureau of Investigation homicide reports, Congressional Research Service data on mass shootings and online archives of news accounts about multiple murders, Duwe has tracked U.S. rates of mass public shootings from 1915 to 2017.

He has identified a total of 185 such events through 2017, 150 of which have occurred since 1966. (In 2016, he published results up to 2013 in the Wiley Handbook of the Psychology of Mass Shootings.) In the earliest known case, from 1915, a Georgia man shot five people dead and wounded 32 others in the street after killing an attorney he blamed for financial losses. Another lawyer, who came to the crime scene upon hearing gunshots and was wounded by a bullet, ended the rampage when he grabbed a pistol from a hardware store and killed the shooter.

What stands out more than a century later is that, contrary to popular opinion, mass public shooting rates have not ballooned to record highs. While the average rate of these crimes has increased since 2005, it’s currently no greater than rates for some earlier periods. Crime trends are usually calculated as rates per 100,000 people for, say, robberies and assaults. But because of the small number of mass public shootings, Duwe calculates annual rates per 100 million people in the United States.
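Duwe's denominator choice is simple to reproduce. The sketch below (illustrative numbers, not his actual dataset) shows why per-100-million makes rare events legible: the same arithmetic as a per-100,000 crime rate, just scaled so annual counts in the single digits yield rates near 1.

```python
def rate_per_100_million(shootings: int, population: int) -> float:
    """Annual mass public shooting rate per 100 million people."""
    return shootings / (population / 100_000_000)

# Illustrative: 4 shootings in a year, in a population of 320 million
print(round(rate_per_100_million(4, 320_000_000), 2))  # 1.25
```

With the standard per-100,000 scaling, that same year would read as a rate of 0.00125, a number too small to compare at a glance across decades.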

The average annual rate of mass public shootings since 2010 is about 1.44 per 100 million people. That roughly equals the 1990s rate of 1.41, Duwe finds.

The average annual rate from 1988 to 1993 reached 1.52, about the same as the 1.51 rate from 2007 to 2012. After dropping to just below 1 per 100 million people in 2013 and 2014, rates increased to nearly 1.3 the next three years.

From 1994 to 2004, rates mostly hovered around 1 per 100 million people or below, but spiked to over 2.5 in 1999. That’s the year two teens killed 13 people at Columbine High School in Colorado.

In contrast, rates were minuscule from 1950 to 1965, when only three mass public shootings were recorded. The average annual rate for 1970 to 1979 reached 0.52, based on 13 mass public shootings.

Numbers of people killed and wounded per shooting incident have risen in the last decade, though. Two events in 2012 were particularly horrific. Shootings at a movie theater in Aurora, Colo., and an elementary school in Newtown, Conn., resulted in 40 murders, many of them children, and 60 nonfatal gunshot wounds. Whether this trend reflects an increasing use of guns with large-capacity magazines or other factors "is up for grabs," Duwe says.
The unknowns
No good evidence exists that either limiting or loosening gun access would reduce mass shootings, Rocque says. Virtually no research has examined whether a federal ban on assault weapons from 1994 to 2004 contributed to the relatively low rate of mass public shootings during that period. The same questions apply to concealed-carry laws, promoted as a way to deter rampage killers. As a gun owner and longtime hunter in his home state of Maine, Rocque calls for “an evidence-based movement” to establish links between gun laws and trends in mass shootings.

Mental illness also demands closer scrutiny, Duwe says. Of 160 mass public shooters from 1915 to 2013, about 60 percent had been assigned a psychiatric diagnosis or had shown signs of serious mental illness before the attack, Duwe has found. In general, mental illness is not linked to becoming violent. But, he says, many mass shooters are tormented and paranoid individuals who want to end their painful lives after evening the score with those they feel have wronged them.

Masculinity also regularly gets raised as a contributor to mass public shootings. It’s a plausible idea, since males committed all but one of the tragedies in Duwe’s review. Sociologist Michael Kimmel of Stony Brook University in New York contends that a sense of wounded masculinity as a result of various life failures inspires rage and even violence. But researchers have yet to examine how any facet of masculinity plays into school or workplace shootings, Rocque says.

Although school shooters often report feeling a desperate need to make up for having been inadequate as men, many factors contribute to their actions, argues clinical psychologist Peter Langman. Based in Allentown, Pa., Langman has interviewed and profiled several dozen school shooters in the United States and other countries.
He divides perpetrators into three psychological categories: psychopathic (lacking empathy and concern for others), psychotic (experiencing paranoid delusions, hearing voices and having poor social skills) and traumatized (coming from families marked by drug addiction, sexual abuse and other severe problems).

But only a few of the millions of people who qualify for those categories translate their personal demons into killing sprees. Any formula to tag mass shooters in the making will inevitably round up lots of people who would never pose a deadly threat.

“There is no good evidence on what differentiates a bitter, aggrieved man from a bitter, aggrieved and dangerous man,” says psychologist Benjamin Winegard of Carroll College in Helena, Mont.

Nor does any published evidence support claims that being a bully or a victim of bullying, or watching violent video games and movies, leads to mass public shootings, Winegard contends. Bullying affects a disturbingly high proportion of youngsters and has been linked to later anxiety and depression (SN: 5/30/15, p. 12) but not to later violence. In laboratory studies, youngsters who play violent computer games or watch violent videos generally don’t become more aggressive or violent in experimental situations. Investigators have found that some school shooters, including the Newtown perpetrator, preferred playing nonviolent video games, Winegard says.

He and a colleague presented this evidence in the Wiley Handbook of the Psychology of Mass Shootings. Northeastern’s Fox also coauthored a chapter in that publication.

Still, a small but tragic group of kids lead lives that somehow turn them into killers of classmates or random strangers (SN: 5/27/06, p. 328). If some precise mix of, say, early brain damage, social ineptitude, paranoia and fury over life’s unfair twists cooks up mass killers, scientists don’t know the toxic recipe. And it won’t be easy to come up with one given the small number of mass public shooters to study.

Duwe recommends that researchers first do a better job of documenting the backgrounds of individual mass shooters and any events or experiences that may have precipitated their deadly actions. Then investigators can address broader social influences on mass shootings, including gun legislation and media coverage.

But more than a century after a distraught Georgia man mowed down six of his fellow citizens, research on mass violence still takes a backseat to public fear and outrage. “If we’re bemoaning the state of research,” Duwe says, “we have no one to blame but ourselves.”

Live heart cells make this material shift color like a chameleon

To craft a new color-switching material, scientists have again taken inspiration from one of nature’s masters of disguise: the chameleon.

Thin films made of heart cells and hydrogel change hues when the films shrink or stretch, much like chameleon skin. This material, described online March 28 in Science Robotics, could be used to test new medications or possibly to build camouflaging robots.

The material is made of a paper-thin hydrogel sheet engraved with nanocrystal patterns, topped with a layer of living heart muscle cells from rats. These cells contract and expand — just as they would inside an actual rat heart to make it beat — causing the underlying hydrogel to shrink and stretch too. That movement changes the way light bounces off the etched crystals, making the material reflect more blue light when it contracts and more red light when it's relaxed.
This design is modeled after nanocrystals embedded in chameleon skin, which also reflect different colors of light when stretched (SN Online: 3/13/15).

When researchers treated the material with a drug normally used to boost heart rate, the films changed color more quickly — indicating the heart cells were pulsating more rapidly. That finding suggests the material could help drug developers monitor how heart cells react to new medications, says study coauthor Luoran Shang, a physicist at Southeast University in Nanjing, China. These kinds of films could also be used to make color-changing skins for soft robots, Shang says.

A dozen new black holes found in Milky Way’s center

The center of the Milky Way may be abuzz with black holes. For the first time, a dozen small black holes have been spotted within the inner region of the galaxy in an area spanning just a few light-years — and there could be thousands more.

Astrophysicist Charles Hailey of Columbia University and his colleagues spotted the black holes thanks to the holes’ interactions with stars slowly spiraling inward, the team reports in Nature on April 4. Isolated black holes emit no light, but black holes stealing material from orbiting stars will heat that material until it emits X-rays.
In 12 years of telescope data from NASA’s orbiting Chandra X-ray Observatory, Hailey and colleagues found 12 objects emitting the right X-ray energy to be black holes with stellar companions. Based on theoretical predictions of how many black holes are paired with stars, there should be up to 20,000 invisible solo black holes just in that small part of the galaxy.
The discovery follows decades of astronomers searching for small black holes in the galactic center, where a supermassive black hole lives (SN: 3/4/17, p. 8). Theory predicted that the galaxy should contain millions or even 100 million black holes overall, with a glut of black holes piled up near the center (SN: 9/16/17, p. 7). But none had been found.
“It was always kind of a mystery,” Hailey says. “If there’s so many that are supposed to be jammed into the central parsec [about 3.26 light-years], why haven’t we seen any evidence?” Finding the 12 was “really hard,” he admits.

It’s unclear how the black holes got to the galaxy’s center. Gravity could have tugged them toward the supermassive black hole. Or a new theory from Columbia astronomer Aleksey Generozov suggests black holes could be born in a disk around the supermassive black hole.

The researchers ruled out other objects emitting X-rays, such as neutron stars and white dwarfs, but acknowledged that up to half of the sources they found could be fast-spinning stellar corpses called millisecond pulsars rather than black holes. That could add to the debate over whether a mysterious excess in gamma rays at the galactic center is from pulsars or dark matter (SN: 12/23/17, p. 12).

“The theorists are going to have to slug it out and figure out what’s going on,” Hailey says.