In 1967, LSD was briefly labeled a breaker of chromosomes

Two New York researchers have found the hallucinogenic drug will markedly increase the rate of abnormal change in chromosomes. [Scientists] tested LSD on cell cultures from the blood of two healthy individuals … [and] also found similar abnormal changes in the blood of a schizophrenic patient who had been treated with [LSD]. The cell cultures showed a two-fold increase in chromosomal breaks over the normal rate. — Science News, April 1, 1967

Update
Psychedelic-era reports that LSD damages chromosomes got lots of press but fell apart within a few years. A review in Science in 1971 concluded that ingesting moderate doses of LSD caused no detectable genetic damage. Researchers are still trying to figure out the molecular workings of the drug. Recent evidence suggests that the substance gets trapped in a pocket of the receptor for serotonin, a key chemical messenger in the brain. Its prolonged stay may explain why LSD trips can last up to a day or more (SN: 3/4/17, p. 16).

Neandertals had an eye for patterns

Neandertals knew how to kick it up a couple of notches. Between 38,000 and 43,000 years ago, these close evolutionary relatives of humans added two notches to five previous incisions on a raven bone to produce an evenly spaced sequence, researchers say.

This visually consistent pattern suggests Neandertals either had an eye for pleasing-looking displays or saw some deeper symbolic meaning in the notch sequence, archaeologist Ana Majkić of the University of Bordeaux, France, and her colleagues report March 29 in PLOS ONE.

Notches added to the bone, unearthed in 2005 at a Crimean rock shelter that previously yielded Neandertal bones, were shallower and more quickly dashed off than the original five. But the additions were carefully placed, resulting in relatively equal spacing of all the notches.

Although bone notches may have had a practical use, such as fixing thread on an eyeless needle, the even spacing suggests Neandertals had a deeper meaning in mind — or at least knew what looked good.

Previous discoveries suggest Neandertals made eagle-claw necklaces and other personal ornaments, possibly for use in rituals (SN: 4/18/15, p. 7).

Event Horizon Telescope to try to capture images of elusive black hole edge

The Milky Way’s black hole may finally get its close-up.

Beginning on April 5, scientists with the Event Horizon Telescope will attempt to zoom in on a never-before-imaged realm: a black hole’s event horizon. That’s the boundary at which gravity’s pull becomes so strong that nothing can escape.

In the telescope’s cross hairs are two supermassive black holes, one at the center of the Milky Way, the other in the nearby galaxy M87. Scientists hope to capture the light emitted by a halo of gas that swirls just outside the event horizon as the black hole swallows it up.

The Event Horizon Telescope is not one telescope, but eight radio observatories linked together into a massive network that spans the globe. The new observations will be the first that include the ultrasensitive Atacama Large Millimeter/submillimeter Array in Chile’s Atacama Desert, increasing the possibility that the image will reveal new details. Astronomers will take data for five nights within a 10-day period.

This is no Polaroid picture, though — it will be months before the data have been crunched and the portrait is ready for prime time.

Einstein’s latest anniversary marks the birth of modern cosmology

First of two parts

Sometimes it seems like every year offers an occasion to celebrate some sort of Einstein anniversary.

In 2015, everybody lauded the 100th anniversary of his general theory of relativity. Last year, scientists celebrated the centennial of his prediction of gravitational waves — by reporting the discovery of gravitational waves. And this year marks the centennial of Einstein’s paper establishing the birth of modern cosmology.

Before Einstein, cosmology was not very modern at all. Most scientists shunned it. It was regarded as a matter for philosophers or possibly theologians. You could do cosmology without even knowing any math.

But Einstein showed how the math of general relativity could be applied to the task of describing the cosmos. His theory offered a way to study cosmology precisely, with a firm physical and mathematical basis. Einstein provided the recipe for transforming cosmology from speculation to a field of scientific study.

“There is little doubt that Einstein’s 1917 paper … set the foundations of modern theoretical cosmology,” Irish physicist Cormac O’Raifeartaigh and colleagues write in a new analysis of that paper.

Einstein had pondered the implications of his new theory for cosmology even before he had finished it. General relativity was, after all, a theory of space and time — all of it. Einstein showed that gravity — the driving force sculpting the cosmic architecture — was simply the distortion of spacetime geometry generated by the presence of mass and energy. (He constructed an equation to show how spacetime geometry, on the left side of the equation, was determined by the density of mass-energy, the right side.) Since spacetime and mass-energy account for basically everything, the entire cosmos ought to behave as general relativity’s equation required.
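Written in the modern textbook form of general relativity (conventions for signs and constants vary, and Einstein’s own notation differed), that equation reads

    R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu}

where the left-hand side encodes the curvature of spacetime, the geometry, and the right-hand side encodes the density of mass and energy (G is Newton’s gravitational constant, c the speed of light).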

Newton’s law of gravity had posed problems in that regard. If every mass attracted every other mass, as Newton had proclaimed, then all the matter in the universe ought to have just collapsed itself into one big blob. Newton suggested that the universe was infinite, filled with matter, so that attraction inward was balanced by the attraction of matter farther out. Nobody really bought that explanation, though. For one thing, it required a really precise arrangement: One star out of place, and the balance of attractions disappears and the universe collapses. It also required an infinity of stars, making it impossible to explain why it’s dark at night. (There would be a star out there along every line of sight at all times.)

Einstein hoped his theory of gravity would resolve the cosmic paradoxes of Newtonian gravity. So in early 1917, less than a year after his complete paper on the general theory was published, he delivered a short paper to the Prussian Academy of Sciences outlining the implications of his theory for cosmology.

In that paper, titled “Cosmological Considerations in the General Theory of Relativity,” he started by noting the problems posed by using Newton’s gravity to describe the universe. Einstein showed that Newton’s gravity would require a finite island of stars sitting in an infinite space. But over time such a collection of stars would evaporate. That problem could be avoided, Einstein said, if the universe were not infinite but finite: big, sure, but curved in such a way that it closed on itself, like a sphere.

Einstein’s mathematical challenge was to show that such a finite cosmic spacetime would be static and stable. (In those days nobody knew that the universe was expanding.) He assumed that on a large enough scale, the distribution of matter in this universe could be considered uniform. (Einstein said it was like viewing the Earth as a smooth sphere for most purposes, even though its terrain is full of complexities on smaller distance scales.) Matter’s effect on spacetime curvature would therefore be pretty much constant, and the universe’s overall condition would be unchanging.

All this made sense to Einstein because he had a limited view of what was actually going on in the cosmos. Like many scientists in those days, he believed the universe was basically just the Milky Way galaxy. All the known stars moved fairly slowly, consistent with his belief in a spherical cosmos with uniformly distributed mass. Unfortunately, general relativity’s math didn’t work if that was the case — it suggested the universe would not be stable. Einstein realized, though, that his view of the static spherical universe would succeed if he added a term to his original equation.

In fact, there were good reasons to include the term anyway. O’Raifeartaigh and colleagues point out that in his earlier work on general relativity, Einstein remarked in a footnote that his equation technically permitted the inclusion of an additional term. That didn’t seem to matter at the time. But in his cosmology paper, Einstein found that it was just the thing his equation needed to describe the universe properly (as Einstein then supposed the universe to be). So he added that factor, designated by the Greek letter lambda, to the left-hand side of his basic general relativity equation.
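In the modern form of the field equation, that addition appears as a constant, now usually written with a capital lambda, multiplying the metric on the geometry side (Einstein’s 1917 notation and sign conventions differ slightly):

    R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu}

A sufficiently small, positive lambda acts on cosmic scales as a gentle repulsion that can balance gravity’s pull and keep a static, spherical universe from collapsing.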

“That term is necessary only for the purpose of making possible a quasi-static distribution of matter, as required by the fact of the small velocities of the stars,” Einstein wrote in his 1917 paper. As long as the magnitude of this new term on the geometry side of the equation was small enough, it would not alter the theory’s predictions for planetary motions in the solar system.

Einstein’s 1917 paper demonstrated the mathematical effectiveness of lambda (also called the “cosmological constant”) but did not say much about its physical interpretation. In another paper, published in 1918, he commented that lambda represented a negative mass density — it played “the role of gravitating negative masses which are distributed all over the interstellar space.” Negative mass would counter the attractive gravity and prevent all the matter in Einstein’s spherical finite universe from collapsing.

As everybody now knows, though, there is no danger of collapse, because the universe is not static to begin with, but rather is rapidly expanding. After Edwin Hubble had established such expansion, Einstein abandoned lambda as unnecessary (or at least, set it equal to zero in his equation). Others built on Einstein’s foundation to derive the math needed to make sense of Hubble’s discovery, eventually leading to the modern view of an expanding universe initiated by a Big Bang explosion.

But in the 1990s, astronomers discovered that the universe is not only expanding, it is expanding at an accelerating rate. Such acceleration requires a mysterious driving force, nicknamed “dark energy,” exerting negative pressure in space. Many experts believe Einstein’s cosmological constant, now interpreted as a constant amount of energy with negative pressure infusing all of space, is the dark energy’s true identity.

Einstein might not have been surprised by all of this. He realized that only time would tell whether his lambda would vanish to zero or play a role in the motions of the heavens. As he wrote in 1917 to the Dutch physicist-astronomer Willem de Sitter: “One day, our actual knowledge of the composition of the fixed-star sky, the apparent motions of fixed stars, and the position of spectral lines as a function of distance, will probably have come far enough for us to be able to decide empirically the question of whether or not lambda vanishes.”

Hawk moths convert nectar into antioxidants

Hawk moths have a sweet solution to muscle damage.

Manduca sexta moths dine solely on nectar, but the sugary liquid does more than fuel their bodies. The insects convert some of the sugars into antioxidants that protect the moths’ hardworking muscles, researchers report in the Feb. 17 Science.

When animals expend a lot of energy, as hawk moths do when they rapidly beat their wings to hover at a flower, their bodies produce reactive molecules that attack muscle and other cells. Humans and other animals eat foods containing antioxidants that neutralize the harmful molecules. But the moths’ sole food source — nectar — contains few, if any, antioxidants.

So the insects make their own. They send some of the nectar sugars through an alternative metabolic pathway to make antioxidants instead of energy, says study coauthor Eran Levin, an entomologist now at Tel Aviv University. Levin and colleagues say this mechanism may have allowed nectar-loving animals to evolve into powerful, energy-intensive fliers.

Immune cells play surprising role in steady heartbeat

Immune system cells may help your heart keep the beat. These cells, called macrophages, usually protect the body from invading pathogens. But a new study published April 20 in Cell shows that in mice, the immune cells help electricity flow between muscle cells to keep the organ pumping.

Macrophages squeeze in between heart muscle cells, called cardiomyocytes. These muscle cells rhythmically contract in response to electrical signals, pumping blood through the heart. By “plugging in” to the cardiomyocytes, macrophages help the heart cells receive the signals and stay on beat.

Researchers have known for a couple of years that macrophages live in healthy heart tissue. But their specific functions “were still very much a mystery,” says Edward Thorp, an immunologist at Northwestern University’s Feinberg School of Medicine in Chicago. He calls the study’s conclusion that macrophages electrically couple with cardiomyocytes “paradigm shifting.” It highlights “the functional diversity and physiologic importance of macrophages, beyond their role in host defense,” Thorp says.

Matthias Nahrendorf, a cell biologist at Harvard Medical School, stumbled onto this electrifying find by accident.

Curious about how macrophages impact the heart, he tried to perform a cardiac MRI on a mouse genetically engineered to not have the immune cells. But the rodent’s heartbeat was too slow and irregular to perform the scan.

These symptoms pointed to a problem in the mouse’s atrioventricular node, a bundle of muscle fibers that electrically connects the upper and lower chambers of the heart. Humans with AV node irregularities may need a pacemaker to keep their heart beating in time. In healthy mice, researchers discovered macrophages concentrated in the AV node, but what the cells were doing there was unknown.

Isolating a heart macrophage and testing it for electrical activity didn’t solve the mystery. But when the researchers coupled a macrophage with a cardiomyocyte, the two cells began communicating electrically. That’s important, because the heart muscle cells contract thanks to electrical signals.

Cardiomyocytes have an imbalance of ions. In the resting state, there are more positive ions outside the cell than inside; when a cardiomyocyte receives an electrical signal from a neighboring heart cell, that distribution flips. This momentary change causes the cell to contract and send the signal on to the next cardiomyocyte.

Scientists previously thought that cardiomyocytes managed this electrical shift, called depolarization, entirely on their own. But Nahrendorf and his team found that macrophages aid in the process. Using a protein, a macrophage hooks up to a cardiomyocyte. The protein directly connects the interiors of the two cells, allowing the macrophage to transfer positive charge to the cardiomyocyte, a boost something like one from a jumper cable. That makes it easier for the heart cells to depolarize and trigger the heart contraction, Nahrendorf says.

“With the help of the macrophages, the conduction system becomes more reliable, and it is able to conduct faster,” he says.

Nahrendorf and colleagues found macrophages within the AV node in human hearts as well but don’t know if the cells play the same role in people. The next step is to confirm that role and explore whether the immune cells could be behind heart problems like arrhythmia, says Nahrendorf.

Long naps lead to less night sleep for toddlers

Like most moms and dads, my time in the post-baby throes of sleep deprivation is a hazy memory. But I do remember feeling instant rage upon hearing a popular piece of advice for how to get my little one some shut-eye: “sleep begets sleep.” The rule’s reasoning is unassailable: To get some sleep, my baby just had to get some sleep. Oh. So helpful. Thank you, lady in the post office and entire Internet.

So I admit to feeling some satisfaction when I came across a study that found an exception to the “sleep begets sleep” rule. The study quite reasonably suggests there is a finite amount of sleep to be had, at least for the 50 Japanese 19-month-olds tracked by researchers.

The researchers used activity monitors to record a week’s worth of babies’ daytime naps, nighttime sleep and activity patterns. The results, published June 9, 2016, in Scientific Reports, showed a trade-off: The longer the nap, the shorter the night sleep, the researchers found. And naps that stretched late into the afternoon seemed to push back bedtime.

In this study, naps didn’t affect the total amount of sleep each child got. Instead, the distribution of sleep across day and night changed. That means you probably can’t tinker with your toddler’s nap schedule without also tinkering with her nighttime sleep. In a way, that’s reassuring: It makes it harder to screw up the nap in a way that leads to a sleep-deprived child. If daytime sleep is lacking, your child will probably make up for it at night.

A sleeping child looks blissfully relaxed, but beneath that quiet exterior, the body is doing some incredible work. New concepts and vocabulary get stitched into the brain. The immune system hones its ability to bust germs. And limbs literally stretch. Babies grew longer in the four days right after they slept more than normal, scientists reported in Sleep in 2011. Scientists don’t yet know if this important work happens selectively during naps or night sleep.

Right now, both my 4-year-old and 2-year-old take post-lunch naps (and on the absolute best of days, those naps occur in glorious tandem). Their siestas probably push their bedtimes back a bit. But that’s OK with all of us. Long spring and summer days make it hard for my girls to go to sleep at 7:30 p.m. anyway. The times I’ve optimistically tried an early bedtime, my younger daughter insists I look out the window to see the obvious: “The sky is awake, Mommy.”

Here’s how an asteroid impact would kill you

It won’t be a tsunami. Nor an earthquake. Not even the crushing impact of the space rock. No, if an asteroid kills you, gusting winds and shock waves from falling and exploding space rocks will most likely be to blame. That’s one of the conclusions of a recent computer simulation effort that investigated the fatality risks of more than a million possible asteroid impacts.

In one extreme scenario, a simulated 200-meter-wide space rock whizzing at 20 kilometers per second whacked London, killing more than 8.7 million people. Nearly three-quarters of that doomsday scenario’s lethality came from winds and shock waves, planetary scientist Clemens Rumpf and colleagues report online March 27 in Meteoritics & Planetary Science.

In a separate report, the researchers looked at 1.2 million potential impactors up to 400 meters across striking around the globe. Winds and shock waves caused about 60 percent of the total deaths from all the asteroids, the team’s simulations showed. Impact-generated tsunamis, which many previous studies suggested would be the top killer, accounted for only around one-fifth of the deaths, Rumpf and colleagues report online April 19 in Geophysical Research Letters.

“These asteroids aren’t an everyday concern, but the consequences can be severe,” says Rumpf, of the University of Southampton in England. Even asteroids that explode before reaching Earth’s surface can generate high-speed wind gusts, shock waves of pressure in the atmosphere and intense heat. Those rocks big enough to survive the descent pose even more hazards, spawning earthquakes, tsunamis, flying debris and, of course, gaping craters.

While previous studies typically considered each of these mechanisms individually, Rumpf and colleagues assembled the first assessment of the relative deadliness of the various effects of such impacts. The estimated hazard posed by each effect could one day help leaders make one of the hardest calls imaginable: whether to deflect an asteroid or let it hit, says Steve Chesley, a planetary scientist at NASA’s Jet Propulsion Laboratory in Pasadena, Calif., who was not involved with either study.

The 1.2 million simulated impactors each fell into one of 50,000 scenarios, which varied in location, speed and angle of strike. Each scenario was run with 24 different asteroid sizes, ranging from 15 to 400 meters across. Asteroids in nearly 36,000 of the scenarios, or around 72 percent, descended over water.

The deadliness assessment began with a map of human populations and numerical simulations of the energies unleashed by falling asteroids. Those energies were then used alongside existing casualty data from studies of extreme weather and nuclear blasts to calculate the deadliness of the asteroids’ effects at different distances. Rumpf and his team focused on short-term impact effects, rather than long-term consequences such as climate change triggered by dust blown into the atmosphere.

(The kill count of each effect was calculated independently of the other effects, meaning people who could have died of multiple causes were counted multiple times. This double counting allows for a better comparison across effects, Rumpf says, but it does give deaths near the impact site more weight in calculations.)
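As a concrete illustration of that bookkeeping, here is a minimal Python sketch of an independent per-effect tally. The effect list, function names and lethality curves are invented for illustration only; the actual study drew its vulnerability estimates from casualty data on extreme weather and nuclear blasts.

# Toy sketch of the per-effect casualty bookkeeping described above.
# All names and numbers here are hypothetical, not taken from the study.

EFFECTS = ["wind", "shock_wave", "heat", "tsunami", "earthquake", "debris", "cratering"]

def casualties_by_effect(population_by_distance, lethality_curves):
    """Tally expected deaths for each impact effect separately.

    population_by_distance: {distance_km: number of people at that distance}
    lethality_curves: {effect: function(distance_km) -> fraction killed}
    Each effect is summed on its own, so a person exposed to several lethal
    effects is counted once per effect (the double counting noted above).
    """
    totals = {effect: 0.0 for effect in EFFECTS}
    for distance_km, people in population_by_distance.items():
        for effect in EFFECTS:
            totals[effect] += people * lethality_curves[effect](distance_km)
    return totals

# Invented example: a town 5 km from the impact site and a city 50 km away,
# with the same crude linear falloff in lethality used for every effect.
population = {5: 20_000, 50: 1_000_000}
curves = {e: (lambda d, e=e: max(0.0, 0.5 - 0.01 * d)) for e in EFFECTS}
print(casualties_by_effect(population, curves))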

While the most deadly impact killed around 117 million people, many asteroids posed no threat at all, the simulations revealed. More than half of asteroids smaller than 60 meters across — and all asteroids smaller than 18 meters across — caused zero deaths. Rocks smaller than 56 meters wide didn’t even make it to Earth’s surface before exploding in an airburst. Those explosions could still be deadly, though, generating intense heat that burns skin, high-speed winds that hurl debris and pressure waves that rupture internal organs, the team found.

Tsunamis became the dominant killer for water impacts, accounting for around 70 to 80 percent of the total deaths from each impact. Even with the tsunamis, though, water impacts were only a fraction as deadly on average as land-hitting counterparts. That’s because impact-generated tsunamis are relatively small and quickly lose steam as they traverse the ocean, the researchers found.

Land impacts, on the other hand, cause considerable fatalities through heat, wind and shock waves and are more likely to hit near large population centers. For all asteroids big enough to hit the land or water surface, heat, wind and shock waves continued to cause the most casualties overall. Land-based effects, such as earthquakes and blast debris, resulted in less than 2 percent of total deaths.

Deadly asteroid impacts are rare, though, Rumpf says. Most space rocks bombarding Earth are tiny and harmlessly burn up in the atmosphere. Bigger meteors such as the 20-meter-wide rock that lit up the sky and shattered windows around the Russian city of Chelyabinsk in 2013 only frequent Earth about once a century (SN Online: 2/15/13). Impacts capable of inducing extinctions, like the at least 10-kilometer-wide impactor blamed for the end of the dinosaurs 66 million years ago (SN: 2/4/17, p. 16), are even rarer, striking Earth roughly every 100 million years.

But asteroid impacts are scary enough that today’s astronomers scan the sky with automated telescopes scouting for potential impactors. So far, they’ve cataloged 27 percent of space rocks 140 meters or larger estimated to be whizzing through the solar system. Other scientists are crunching the numbers on ways to divert an earthbound asteroid. Proposals include whacking the asteroid like a billiard ball with a high-speed spacecraft or frying part of the asteroid’s surface with a nearby nuclear blast so that the vaporized material propels the asteroid away like a jet engine.

The recent research could offer guidance on how people should react to an oncoming impactor: whether to evacuate or shelter in place, or to scramble to divert the asteroid. “If the asteroid’s in a size range where the damage will be from shock waves or wind, you can easily shelter in place a large population,” Chesley says. But if the heat generated as the asteroid falls, impacts or explodes “becomes a bigger threat, and you run the risk of fires, then that changes the response of emergency planners,” he says.

Making those tough decisions will require more information about compositions and structures of the asteroids themselves, says Lindley Johnson, who serves as the planetary defense officer for NASA in Washington, D.C. Those properties in part determine an asteroid’s potential devastation, and the team didn’t consider how those characteristics might vary, Johnson says. Several asteroid-bound missions are planned to answer such questions, though the recent White House budget proposal would defund a NASA project to reroute an asteroid into the moon’s orbit and send astronauts to study it (SN Online: 3/16/17).

In the case of a potential impact, making decisions based on the average deaths presented in the new study could be misleading, warns Gareth Collins, a planetary scientist at Imperial College London. A 60-meter-wide impactor, for instance, caused on average about 6,300 deaths in the simulations. Just a handful of high-fatality events inflated that average, though, including one scenario that resulted in more than 12 million casualties. In fact, most impactors of that size struck away from population centers and killed no one. “You have to put it in perspective,” Collins says.

Why create a model of mammal defecation? Because everyone poops

An elephant may be hundreds of times larger than a cat, but when it comes to pooping, it doesn’t take the elephant hundreds of times longer to heed nature’s call. In fact, both animals will probably get the job done in less than 30 seconds, a new study finds.

Humans would probably fit in that time frame too, says Patricia Yang, a mechanical engineering graduate student at the Georgia Institute of Technology in Atlanta. That’s because elephants, cats and people all excrete cylindrical poop. The size of all those animals varies, but so does the thickness of the mucus lining in each animal’s large intestine, so no matter the mammal, everything takes about the same time — an average of 12 seconds — to come out, Yang and her colleagues conclude April 25 in Soft Matter.

But the average poop time is not the real takeaway here (though it will make a fabulous answer to a question on Jeopardy one day). Previous studies on defecation have largely come from the world of medical research. “We roughly know how it happened, but not the physics of it,” says Yang.

Looking more closely at those physical properties could prove useful in a number of ways. For example, rats are often good models for humans in disease research, but they aren’t when it comes to pooping because rats are pellet poopers. (They’re not good models for human urination, either, because their pee comes out differently than ours, in high-speed droplets instead of a stream.)

Also, since the thickness of the mucus lining is dependent on animal size, it would be better to find a more human-sized stand-in. Such work could help researchers find new treatments for constipation and diarrhea, in which the mucus lining plays a key role, the researchers note.

Animal defecation may seem like an odd topic for a mechanical engineer to take on, but Yang notes that the principles of fluid dynamics apply inside the body and out. Her previous research includes a study on animal urination, finding that, as with pooping, the time it takes for mammals to pee also falls within a small window. (The research won her group an Ig Nobel Prize in 2015.)

And while many would find this kind of research disgusting, Yang does not. “Working with poop is not that bad, to be honest,” she says. “It’s not that smelly.” Plus, she gets to go to the zoo and aquarium for her research rather than be stuck in the lab.

But the research does involve a lot of poop — and watching it fall. For the study, the researchers timed how long it took animals to defecate and calculated the velocity of the feces of 11 species. They filmed dogs at a park and elephants, giant pandas and warthogs at Zoo Atlanta. They also dug up 19 YouTube videos of mammals defecating. Surprisingly, there are a lot of those videos available, though not many were actually good for the research. “We wanted a complete event, from beginning to end,” Yang notes. Apparently not everyone interested in pooping animals bothers to capture the full fall.

The researchers also examined feces from dozens of mammal species. (They fall into two classes: Carnivores defecate “sinkers,” since their feces are full of heavy indigestible ingredients like fur and bones. Herbivores defecate less-dense “floaters.”) And they considered the thickness and viscosity of the mucus that lines mammals’ intestines and helps everything move along, as well as the rectal pressure that pushes the material. All this information went into a mathematical model of mammal defecation — which revealed the importance of the mucus lining.

Yang isn’t done with this line of research. The model she and her colleagues created applies only to mammals that poop like we do. There’s still the pellet poopers, like rats and rabbits, and wombats, whose feces look like rounded cubes. “I would like to complete the whole set,” she says. And, “if you’ve got a good team, it’s fun.”

When it’s hot, plants become a surprisingly large source of air pollution

Planting trees is often touted as a strategy to make cities greener, cleaner and healthier. But during heat waves, city trees actually boost air pollution levels. When temperatures rise, as much as 60 percent of ground-level ozone is created with the help of chemicals emitted by urban shrubbery, researchers report May 17 in Environmental Science & Technology.

While the findings seem counterintuitive, “everything has multiple effects,” says Robert Young, an urban planning expert at the University of Texas at Austin, who was not involved with the study. The results, he cautions, do not mean that programs focused on planting trees in cities should stop. Instead, more stringent measures are needed to control other sources of air pollution, such as vehicle emissions.

Benefits of city trees include helping reduce stormwater runoff, providing cooling shade and converting carbon dioxide to oxygen. But research has also shown that trees and other shrubs release chemicals that can interact with their surrounding environment, producing polluted air. One such chemical, isoprene, can react with human-made compounds, such as nitrogen oxides, to form ground-level ozone, a colorless gas that can be hazardous to human health. Monoterpenes and sesquiterpenes also react with nitrogen oxides, and when they do, lots of tiny particles, similar to soot, build up in the air. In cities, cars and trucks are major sources of these oxides.

In the new study, Galina Churkina of Humboldt University of Berlin and colleagues compared simulations of chemical concentrations emitted from plants in the Berlin-Brandenburg metropolitan area. The researchers focused on two summers: 2006, when there was a heat wave, and 2014, when temperatures were more typical.

At normal daily maximum summer temperatures, roughly 25° Celsius on average, plants’ chemical emissions accounted for about 6 to 20 percent of ozone formation in the simulations. At peak temperatures during the heat wave, when readings soared above 30° Celsius, plant emissions spiked, boosting their share of ozone formation to as much as 60 percent. Churkina says she and colleagues were not surprised to see the seemingly contrary relationship between plants and pollution. “Its magnitude was, however, quite amazing,” she says.

The results, she notes, suggest that campaigns to add trees to urban spaces can’t be done in isolation. Adding trees will improve quality of life only if such campaigns are combined with the radical reduction of pollution from motorized vehicles and the increased use of clean energy sources, she says.