Public, doctors alike confused about food allergies

Our grasp of food allergy science is as jumbled as a can of mixed nuts. While there are tantalizing clues about how food allergies emerge and might be prevented, misconceptions are plentiful and broad conclusions are lacking, according to a new report by the National Academies of Sciences, Engineering and Medicine.

As a result, both the general public and medical community are confused and ill-informed about food allergies and what to do about them. Most prevention strategies and many tests used to diagnose a food allergy aren’t supported by scientific evidence and should be abandoned, the 562-page report concludes.
“We are much more in the dark than we thought,” says Virginia Stallings, a coeditor of the new report, released November 30.

While solid data are hard to come by, the report notes, estimates suggest that 12 million to 15 million Americans suffer from food allergies. Common culprits include peanuts, milk, eggs, fish, shellfish, sesame, wheat and soy.

Food allergies should be distinguished from food intolerances; the two are often confused by the public and practitioners, says Stallings, a pediatrician and research director of the nutrition center at the Children’s Hospital of Philadelphia. Strictly defined food allergies, the primary focus of the report, arise from a specific immune response to even a small amount of the allergen; they produce effects such as hives, swelling, vomiting, diarrhea and, most crucially, anaphylaxis, a severe, potentially deadly allergic reaction. These effects reliably occur within two hours every time a person ingests that food. Allergic reactions that fall outside this strict definition and food-related intolerances, such as gastrointestinal distress after ingesting lactose, are a legitimate public health concern. But the mechanisms behind them are probably very different from those of the more strictly defined food allergies, as are the outcomes, says Stallings.

Anyone suspecting a food allergy should see a specialist. If medical history and preliminary results hint at problems, then the gold standard diagnostic test should be applied: the oral food challenge. This test exposes an individual to small amounts of the potentially offending food while under supervision. Doctors and others in health care should abandon many unproven tests, such as ones that analyze gastric juices or measure skin’s electrical resistance, the report concludes.

Regarding prevention, research has borne a little fruit: The authors recommend that parents give infants foods that contain potential allergens. This recommendation is largely based on peanut allergy research suggesting early exposure is better than late (SN: 3/21/2015, p. 15). There’s little to no evidence supporting virtually any other behavior thought to prevent food allergies, such as taking vitamin D supplements or avoiding allergens while pregnant or breastfeeding.
While additional rigorous long-term studies are needed to better understand why food allergies arise, the report addresses many issues that society can confront in the meantime. Industry needs to develop a low-dose (0.075 milligrams) epinephrine injector to treat infants who experience food allergy anaphylaxis; the U.S. Food and Drug Administration, Department of Agriculture and the food manufacturing industry need to revamp food labeling so it reflects allergy risks; and relevant agencies should establish consistent guidelines for schools and airplanes that include first-aid training and on-site epinephrine supplies.

“This report is mammoth and very impressive,” says Anita Kozyrskyj, whose research focuses on the infant gut microbiome. Kozyrskyj, of the University of Alberta in Canada, presented research to the report’s authors while they were gathering evidence. She says the report identifies issues that can help guide the research community. But its real value is in the recommendations for parents, schools, caregivers and health care providers who are dealing with food allergies in the here and now.

First spider superdads discovered

The first normally solitary spider to win Dad of the Year sets up housekeeping in a web above his offspring and often ends up as their sole defender and single parent.

Moms handle most parental care known in spiders, says Rafael Rios Moura at the Federal University of Uberlândia in Brazil. But either or both parents care for egg sacs and spiderlings in the small Manogea porracea species he and colleagues studied in a eucalyptus plantation. The dad builds a dome-shaped web above the mom’s web, and either parent will fight hungry invaders looking for baby-spider lunch. In webs with no parents, only about four spiderlings survived per egg sac. But with dad, mom or both on duty, survival more than doubled, the researchers report in the January 2017 Animal Behaviour.
“To the best of my knowledge, there really aren’t other examples where male spiders step up to care for young or eggs,” says Linda Rayor of Cornell University, who has studied spider maternal care. In a group-living Stegodyphus species, some of the males in a communal web will attack intruders, but Manogea dads do much more. They switch from solitary life to a dad-web upstairs, brush rainwater off egg sacs and share defense, sometimes at the cost of their own lives.
Many male web-building spiders stop feeding as adults because they’re out searching for mates instead of catching food with their web, Moura says. Manogea males, however, stick with a female they mated with and build a new food-catching web. Now Moura would like to know whether such commitment makes males unusually choosy about females, he says.

To predators, females “must be very delicious,” Moura says. In the wild he found that many females had disappeared, probably eaten, by the end of the breeding season, leaving dads as sole protectors of 68 percent of the egg sacs.

That high female mortality could have been important for the evolution of the dads’ caretaking, says behavioral ecologist Eric Yip of Penn State. Just why this species has such high female mortality puzzles him, though. Females, geared up for egg-laying, have rich nutrient stores. Yet, he says, “that’s generally true for all spiders — that females are going to be more nutritious and males are going to be mostly legs.”

Magnetic stars could have created LIGO’s massive black holes

To create a heavy black hole, it might help to start with a massive magnetic star.

Strong magnetic fields could help stem the flow of gas from a heavyweight star, leaving behind enough material to form hefty black holes, a new study suggests. A pair of such magnetic stars could be responsible for giving birth to the black hole duo that created recently detected gravitational waves, researchers report online December 1 in Monthly Notices of the Royal Astronomical Society.
The shake-up in spacetime that was picked up by the Advanced Laser Interferometer Gravitational-Wave Observatory, or LIGO, in 2015 came from a collision between two black holes weighing about 29 and 36 times the mass of the sun (SN: 3/5/16, p. 6). Such plump black holes were surprising. The creation of a big black hole requires the explosive death of a gargantuan star. But weighty stars are so bright that their light blows gas into space.

“These massive stars can lose up to half their mass to their dense stellar winds,” says study coauthor Véronique Petit, an astrophysicist at Florida Institute of Technology in Melbourne. That leaves only enough mass to make a more modest black hole.

Having a paucity of elements heavier than helium is one way a massive star might retain gas. Atoms such as carbon, oxygen and iron present large targets to the radiation streaming from a star. Photons nudge these atoms along, generating strong stellar winds. A lack of heavy elements could keep these winds in check.

Petit and colleagues have proposed another idea: intense magnetic fields that might redirect escaping gas back onto the star. Observations in recent years have led to the discovery that about 10 percent of stellar heavyweights have powerful magnetic fields, some exceeding 10,000 gauss (the sun’s magnetic field is, on average, closer to 1 gauss).

Computer simulations allowed researchers to see how much mass a star could retain if it were blanketed by magnetic fields. Magnetism is an effective levee, they found. A magnetic star that starts off with 80 times as much mass as the sun, for example, ends its life about 20 suns heavier than a similarly massive one that’s not magnetic.
“This is an interesting alternative hypothesis for how stars can end up holding onto more of their mass, so they can form such heavy black holes,” says Vicky Kalogera, an astrophysicist at Northwestern University in Evanston, Ill. But, she cautions, “the mechanism is somewhat speculative.” Astronomers don’t have a good handle yet on how magnetic fields change as a star evolves, she says, particularly as the star approaches the end of its life.

“It’s going to be hard to test our hypothesis,” Petit says. Pinpointing the host galaxy of a future collision between obese black holes might help, but that’s fraught with ambiguity. If the galaxy is rich in heavy elements, then perhaps magnetic fields are needed to hold back the flow of gas from gigantic stars. But that doesn’t mean the black holes were born in that environment. They could also have formed early in the universe, says Petit, when their galaxy had fewer heavy elements, in which case magnetic fields might not be necessary.

Genome clues help explain the strange life of seahorses

A seahorse’s genetic instruction book is giving biologists a few insights into the creature’s odd physical features and rare parenting style.

Researchers decoded a male tiger tail seahorse’s (Hippocampus comes) genome and compared it with the genomes of other seahorses and ray-finned fishes. The analysis revealed a bevy of missing genes and other genetic elements responsible for enamel and fin formation. The absence of these genes may explain seahorses’ tubelike snouts, small toothless mouths, armored bodies and flexible square tails, the team reports online December 14 in Nature.

Although H. comes may be short a few genes, the seahorse has a surplus of other genes important for male pregnancy — a trait unique to seahorses, sea dragons and pipefish. These genetic differences suggest the tiger tail seahorse has evolved more quickly than its relatives, the researchers conclude.

New footprint finds suggest range of body sizes for Lucy’s species

Famous footprints of nearly 3.7-million-year-old hominids, found in 1976 at Tanzania’s Laetoli site, now have sizable new neighbors.

While excavating small pits in 2015 to evaluate the impact of a proposed field museum at Laetoli, researchers uncovered comparably ancient hominid footprints about 150 meters from the original discoveries. The new finds reveal a vast range of body sizes for ancient members of the human evolutionary family, reports an international team led by archaeologists Fidelis Masao and Elgidius Ichumbaki, both of the University of Dar es Salaam in Tanzania.
A description of the new Laetoli footprints appears online December 14 in eLife.

Scientists exposed 14 hominid footprints, made by two individuals as they walked across wet volcanic ash. More than 500 footprints of ancient horses, rhinos, birds and other animals dotted the area around the hominid tracks. Like previously unearthed tracks of three individuals who apparently strode across the same layer of soft ash at the same time, the latest footprints were probably made by members of Australopithecus afarensis, the team says. Best known for Lucy, a partial skeleton discovered in Ethiopia in 1974, A. afarensis inhabited East Africa from around 4 million to 3 million years ago.

All but one of the 14 hominid impressions come from the same individual. Based on footprint dimensions, the researchers estimate that this presumed adult male — nicknamed Chewie in honor of the outsized Star Wars character Chewbacca — stood about 5 feet 5 inches tall and weighed nearly 100 pounds. That makes him the tallest known A. afarensis. The team calculates that the remaining hominid footprint was probably made by a 4-foot-9-inch female who weighed roughly 87 pounds. Stature estimates based on the other three Laetoli footprint tracks fall below that of the ancient female.

Lucy lived later than the Laetoli crowd, around 3.2 million years ago, and was about 3 ½ feet tall.
If Laetoli’s five impression-makers were traveling together, “we can suppose that the Laetoli social group was similar to that of modern gorillas, with one large male and a harem of smaller females and perhaps juveniles,” says paleontologist and study coauthor Marco Cherin of the University of Perugia in Italy.

Chewie’s stature challenges a popular assumption that hominid body sizes abruptly increased with the emergence of the Homo genus, probably shortly after A. afarensis died out, Cherin adds.

The new paper presents reasonable stature estimates based on the Laetoli footprints, but “we don’t have a firm idea of how foot size was related to overall body size in Australopithecus,” says evolutionary biologist Kevin Hatala of Chatham University in Pittsburgh. Masao’s group referred to size data from present-day humans to calculate heights and weights of A. afarensis footprint-makers. That approach “could lead to some error,” Hatala says.

Stature estimates based on footprints face other obstacles, says paleoanthropologist Yohannes Haile-Selassie of the Cleveland Museum of Natural History. For instance, some tall individuals have small feet and short folks occasionally have long feet. It’s also unclear whether the new footprints and those from 1976 represent a single group, or if some smaller footprints were also made by males, Haile-Selassie adds. Cherin’s proposal that large A. afarensis males controlled female harems “is a bit of a stretch,” Haile-Selassie says.

The new report doesn’t document surprisingly large size differences among members of Lucy’s kind, Haile-Selassie adds. A. afarensis fossils previously excavated in Ethiopia include a partial male skeleton now estimated by Haile-Selassie and his colleagues to have been only about three inches shorter than Chewie’s reported height (SN: 7/17/10, p. 5).

Baby starfish whip up whirlpools to snag a meal

A baby starfish scoops up snacks by spinning miniature whirlpools. These vortices catch tasty algae and draw them close so the larva can slurp them up, scientists from Stanford University report December 19 in Nature Physics.

Before starfish take on their familiar shape, they freely swim ocean waters as millimeter-sized larvae. To swim around on the hunt for food, the larvae paddle the water with hairlike appendages called cilia. But, the scientists found, starfish larvae also adjust the orientation of these cilia to fine-tune their food-grabbing vortices.

Scientists studied larvae of the bat star (Patiria miniata), a starfish found on the U.S. Pacific coast, by observing their activities in seawater suffused with tiny beads that traced the flow of liquid. Too many swirls can slow a larva down, the scientists found, so the baby starfish adapts to the task at hand, creating fewer vortices while swimming and whipping up more of them when stopping to feed.

Hunter-gatherers were possibly first to call Tibetan Plateau home

People hunted and foraged year-round in the thin air of China’s Tibetan Plateau at least 7,400 to 8,400 years ago, a new study suggests. And permanent settlers of the high-altitude region might even have arrived as early as 12,000 to 13,000 years ago.

Three lines of dating evidence indicate that humans occupied the central Tibetan Plateau’s Chusang site, located more than 4,000 meters above sea level, at least 2,200 years earlier than previously thought, say geologist Michael Meyer of the University of Innsbruck in Austria and colleagues. Their report, published in the Jan. 6 Science, challenges the idea that the Tibetan Plateau lacked permanent settlers until farming groups arrived around 5,200 years ago.
“Hunter-gatherers permanently occupied the Tibetan Plateau by around 8,000 years ago, which coincided with a strong monsoon throughout Asia that created wet conditions on the plateau,” Meyer says.

These early permanent residents hunted animals such as wild yaks and foraged for edible plants, including berries from sea buckthorn shrubs, in nearby river valleys at elevations more than 3,600 meters above sea level, Meyer suspects. Brief summer forays to Chusang would have been difficult for people living below 3,300 meters above sea level, he adds. Even when mountain passes were clear of heavy snowfall and expanding valley glaciers, round trips from low altitudes to the central Tibetan Plateau would have taken 41 to 70 days, Meyer’s team estimates.

Researchers discovered Chusang in 1998. The site consists of 19 human hand- and footprints on the surface of a fossilized sheet of travertine, a form of limestone deposited there by water from a hot spring.
The new age estimates for Chusang come from three measures: the decay rate of forms of radioactive thorium and uranium in travertine sampled in and around the prints; determinations of the time since quartz crystals extracted from the travertine were last exposed to sunlight; and radiocarbon measures of sediment and microscopic plant remains found on the travertine slab’s surface.
Signs of long-term camping at Chusang have yet to turn up, but extensive excavations of the site have not been conducted, Meyer says. His group found chipped rocks and other stone tool-making debris at two spots near Chusang’s hot springs. These finds are undated.

Previous research has suggested that hunter-gatherers occasionally reached the Tibetan Plateau’s northern edge by around 12,000 years ago (SN: 7/7/01, p. 7), and again from about 8,000 to 6,000 years ago, says archaeologist Loukas Barton of the University of Pittsburgh, who wasn’t involved in the study. But the new discoveries at Chusang may not necessarily point to permanent residence there. Those early arrivals likely spent a single summer or a few consecutive years at most on the plateau, Barton says. “That would not constitute a peopling of a region any more than our 1969 visit to the moon did,” he says.

Archaeological finds indicate that human populations expanded on the Tibetan Plateau between around 5,200 and 3,600 years ago, Barton says. Those groups cultivated barley and wheat at high altitudes and herded domesticated sheep and perhaps yaks, he says.

Before that time, Chusang might have supported a year-round occupation, says archaeologist David Rhode of the Desert Research Institute in Reno, Nev., who wasn’t involved in the study. But the site could easily have been occupied seasonally, he says. Unlike Meyer, Rhode estimates that Chusang was about a two-week walk from some lower-altitude campsites. “That’s not far at all for a human forager.”

New dates for Chusang also raise the possibility that rare gene variants that aid survival in high-altitude, oxygen-poor locales first evolved among hunter-gatherers on the Tibetan Plateau, Meyer says. But both Barton and Rhode doubt it.

How mice use their brain to hunt

The part of the brain that governs emotions such as fear and anxiety also helps mice hunt. That structure, the amygdala, orchestrates a mouse’s ability to both stalk a cricket and deliver a fatal bite, scientists report January 12 in Cell.

Scientists made select nerve cells in mice’s brains sensitive to light, and then used lasers to activate specific groups of those cells. By turning different cells on and off, the researchers found two separate sets of nerve cells relaying hunting-related messages from the amygdala’s central nucleus. One set controlled the mice’s ability to chase their prey. The other affected their ability to deliver a solid chomp and kill a cricket.
“They’ve found these two behaviors — that are part of something we think of as being very complex — are controlled by these two circuits,” says Cris Niell, a neuroscientist at the University of Oregon in Eugene who wasn’t part of the study. “You flip a switch to chase, you flip a switch to attack.”

Ramping both of those circuits up to high power at the same time even led mice to chase and capture a tiny bug-shaped robot that they would normally ignore or avoid.

“The central amygdala has been conceptualized as a center for emotion and fear and threat detection,” says study coauthor Ivan de Araujo, a neuroscientist at the John B. Pierce Laboratory in New Haven, Conn. Now, it seems that the structure also controls the relatively complex task of hunting.

Scientists don’t know how the new function relates to the amygdala’s better-known role as an emotional control center. But the amygdala does help control heart rate and blood pressure, which shift in emotionally charged situations but also need to be regulated when an animal is pursuing prey, de Araujo says.

The study also shows how even a complex task like hunting can be coordinated by different groups of very specialized nerve cells, or neurons, working together. In this case, one set of neurons formed a signaling pathway that controlled chasing, while another controlled biting. Together, those neurons helped the mice grab dinner.
“I think over the years we’ve become progressively more surprised by the behavioral specificity of these particular pathways,” says Anthony Leonardo, a neuroscientist at the Howard Hughes Medical Institute’s Janelia Research Campus in Ashburn, Va. “Certainly the evidence is mounting for a very strongly specific role for neurons.” Leonardo has found similarly specialized neurons in the dragonfly brain, with groups of neurons that run in parallel to each other controlling different types of movements.

Next, de Araujo says, his lab hopes to figure out what flips the neural switches in a mouse’s brain — how seeing or smelling potential prey triggers the amygdala to send the critter after a meal.

Petrified tree rings tell ancient tale of sun’s behavior

The sun has been in the same routine for at least 290 million years, new research suggests.

Ancient tree rings from the Permian period record a roughly 11-year cycle of wet and dry periods, climate fluctuations caused by the ebbing and flowing of solar activity, researchers propose January 9 in Geology. The discovery would push back the earliest evidence of today’s 11-year solar cycle by tens of millions of years.

“The sun has apparently been doing what it’s been doing today for a long time,” says Nat Gopalswamy, a solar scientist at NASA’s Goddard Space Flight Center in Greenbelt, Md., who was not involved in the study.
About every 11 years, the sun’s brightness and the frequency of sunspots and solar flares complete one round of waxing and waning. These solar changes alter the intensity of sunlight reaching Earth and, some scientists hypothesize, may affect the composition of the stratosphere and rates of cloud formation. Those effects could alter rainfall rates, which in turn influence tree growth.

Ancient trees may hold clues to similar cycles from long ago. In what is now southeast Germany, volcanic eruptions buried an ancient forest under debris roughly 290 million years ago. Paleontologists Ludwig Luthardt and Ronny Rößler of the Natural History Museum in Chemnitz, Germany, identified tree rings in the fossilized remains of the trees.

Measuring the widths of the rings, which show how much the plants grew each year, the researchers discovered a cycle in growth rates lasting 10.62 years on average. That cycle, they propose, reflects years-long rises and falls in annual rainfall caused by the solar cycle; its average length falls within the 10.44- to 11.16-year range of the sunspot cycle seen over the last few hundred years.

Whether the solar and tree ring cycles are connected isn’t certain, says paleoclimatologist Adam Csank of the University of Nevada, Reno. Many studies suggest that it is not possible to clearly identify sunspot cycles in modern tree ring records, he notes. Other changes in Earth’s climate system or periodic insect outbreaks might contribute to tree ring widths, he says.

50 years ago, methadone made a rosy debut

Heroin cure works

[T]he drug methadone appears to have fulfilled its promise as an answer to heroin addiction. Some 276 hard-core New York addicts … have lost their habits and none have returned to heroin — a 100 percent success rating. Methadone, a synthetic narcotic, acts by blocking the euphoric effect of opiates. Addicts thus get nothing from heroin and feel no desire to take it. — Science News, February 4, 1967

UPDATE:
The U.S. Food and Drug Administration approved methadone as a treatment for opiate addiction in 1972 but quickly recognized that it was no panacea. That same year, policy makers worried that methadone would produce addicts — as patients got high off the treatment itself (SN: 10/28/72, p. 277). Methadone can be deadly: In 2014, 3,400 people died of methadone overdoses. Although methadone is still used, drugs such as buprenorphine and naltrexone have joined the treatment arsenal for opiate addiction.