Science News

Humans Have Compressed Millions of Years of Natural Change Into Mere Centuries

Many numbers are swirling around the climate negotiations at the UN climate summit in Glasgow, COP26. These include global warming targets of 1.5°C and 2.0°C, recent warming of 1.1°C, the remaining CO2 budget of 400 billion tonnes, and the current atmospheric CO2 concentration of 415 parts per million.

 

It’s often hard to grasp the significance of these numbers. But the study of ancient climates can give us an appreciation of their scale compared to what has occurred naturally in the past. Our knowledge of ancient climate change also allows scientists to calibrate their models and therefore improve predictions of what the future may hold.

Recent work, summarized in the latest report of the Intergovernmental Panel on Climate Change (IPCC), has allowed scientists to refine their understanding and measurement of past climate changes.

These changes are recorded in rocky outcrops, sediments from the ocean floor and lakes, in polar ice sheets, and in other shorter-term archives such as tree rings and corals.

As scientists discover more of these archives and get better at using them, we have become increasingly able to compare recent and future climate change with what has happened in the past, and to provide important context to the numbers involved in climate negotiations.

For instance, one headline finding in the IPCC report was that global temperature (currently 1.1°C above the pre-industrial baseline) is higher than at any time in at least the past 120,000 years or so.

 

That’s because the last warm period between ice ages peaked about 125,000 years ago – in contrast to today, warmth at that time was driven not by CO2, but by changes in Earth’s orbit and spin axis.

Another finding regards the rate of current warming, which is faster than at any time in the past 2,000 years – and probably much longer.

But it is not only past temperature that can be reconstructed from the geological record. For instance, tiny gas bubbles trapped in Antarctic ice can record atmospheric CO2 concentrations back to 800,000 years ago. Beyond that, scientists can turn to microscopic fossils preserved in seabed sediments.

Chemical properties of these fossils (such as the elements that make up their shells) are related to how much CO2 was in the ocean when the fossilized organisms were alive, which in turn reflects how much was in the atmosphere.

As scientists get better at using these “proxies” for atmospheric CO2, recent work has shown that the current atmospheric CO2 concentration of around 415 parts per million (compared with 280 ppm before industrialization in the early 1800s) is greater than at any time in at least the past 2 million years.

 

Other climate variables can also be compared to past changes. These include the greenhouse gases methane and nitrous oxide (now greater than at any time in at least 800,000 years), late summer Arctic sea ice area (smaller than at any time in at least the past 1,000 years), glacier retreat (unprecedented in at least 2,000 years), sea level (rising faster than at any point in at least 3,000 years), and ocean acidity (unusually acidic compared to the past 2 million years).

In addition, changes predicted by climate models can be compared to the past. For instance, an “intermediate” emissions scenario will likely lead to global warming of between 2.3°C and 4.6°C by the year 2300, which is similar to the mid-Pliocene warm period of about 3.2 million years ago.

Extremely high emissions would lead to warming of somewhere between 6.6°C and 14.1°C, which just overlaps with the warmest period since the demise of the dinosaurs – the “Paleocene-Eocene Thermal Maximum” kicked off by massive volcanic eruptions about 55 million years ago.

As such, humanity is currently on the path to compressing millions of years of temperature change into just a couple of centuries.

The distant past can help predict the near future

For the first time, the latest IPCC report uses ancient time periods to refine projections of climate change. In previous reports, future projections were produced simply by averaging results from all climate models, with their spread used as a measure of uncertainty.

But for this new report, temperature, rainfall and sea level projections relied more heavily on the models that did the best job of simulating known past climate changes.

Part of this process was based on each individual model’s “climate sensitivity” – the amount it warms when atmospheric CO2 is doubled. The “correct” value (and uncertainty range) of sensitivity is known from several independent lines of evidence. One comes from certain times in the ancient past when global temperature changes were driven by natural changes in CO2 – caused, for example, by volcanic eruptions or by changes in the amount of carbon removed from the atmosphere as rocks erode away.

Combining estimates of ancient CO2 and temperature therefore allows scientists to estimate the “correct” value of climate sensitivity, and so refine their future projections by relying more heavily on those models with more accurate climate sensitivities.
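The calculation described above can be sketched numerically. A minimal illustration, assuming temperature responds logarithmically to CO2; the function name and example numbers here are hypothetical, not taken from any cited study:

```python
import math

def sensitivity_estimate(delta_t, co2_then_ppm, co2_ref_ppm):
    """Warming per CO2 doubling, assuming temperature scales with log2(CO2).

    A deliberately simplified sketch: real estimates also account for
    non-CO2 forcings, slow feedbacks, and dating uncertainties.
    """
    doublings = math.log2(co2_then_ppm / co2_ref_ppm)
    return delta_t / doublings

# Illustrative (not sourced) numbers: a paleo interval ~3 °C warmer than
# pre-industrial, at ~400 ppm CO2 against a 280 ppm baseline.
print(round(sensitivity_estimate(3.0, 400, 280), 1))  # ≈ 5.8 °C per doubling
```

The point is only the shape of the calculation: combine a reconstructed temperature change with a reconstructed CO2 change, expressed in doublings, to back out a sensitivity against which models can be compared.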

Overall, past climates show us that recent changes across all aspects of the Earth system are unprecedented in at least thousands of years.

Unless emissions are reduced rapidly and dramatically, global warming will reach a level that has not been seen for millions of years. Let’s hope those attending COP26 are listening to messages from the past. The Conversation

Dan Lunt, Professor of Climate Science, University of Bristol and Darrell Kaufman, Professor of Earth and Environmental Sciences, Northern Arizona University.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 

Local

Indian Coast Guard to get three more pollution control vessels to enhance capabilities

Panaji: Three more pollution control vessels (PCVs) will be added to the Indian Coast Guard’s (ICG) fleet to strengthen its marine pollution response, Union Defence Secretary Ajay Kumar said on Tuesday.

Speaking to reporters on the sidelines of the 8th National Pollution Response Exercise currently taking place in Goa, Kumar said that India is also willing to help friendly countries in upgrading their capabilities.

Around 19 friendly countries are participating in the exercise.

The Union government is continuously trying to upgrade the ICG’s capabilities to face pollution hazards in the ocean.

“Today, the Indian Coast Guard is capable of handling the highest level of oil spills in this region, which is 700 tonnes and above. Only a few countries in the world have this capability,” Kumar said.

Currently, the ICG has two dedicated vessels for pollution response, while three more will be added to its fleet to enhance its capability, he said.

The Indian Ocean is one of the busiest routes in the world and half of the trade takes place in the region, the senior official said, adding that oil exploration has also increased and accidents can happen anywhere.

Countries are also battling the issue of plastic waste being dumped in the ocean, he said.

“We need to fight this (plastic pollution) collectively. It cannot be done by one country. All the coastal countries in the region need to make efforts,” Kumar said.

The defence secretary lauded the Punit Sagar Mission launched by Prime Minister Narendra Modi to clear plastic from the coastline.

“We should ensure that plastic waste is not washed into the ocean. Every year, 15,000 million tonnes of plastic washes into the Indian Ocean from different countries. If this continues, our marine life, environment, ecology and health will be affected,” he said.

Asked about cooperation from Pakistan and China over the pollution response, Kumar said, “This is an environmental issue and all countries should contribute towards it.” Several treaties have been signed to reduce pollution in the Indian Ocean, and friendly nations will have to collectively ensure that these are observed, he said. (GoaNewsHub)


Science News

Brain Implant Translates Paralyzed Man’s Thoughts Into Text With 94% Accuracy

A man paralyzed from the neck down due to a spinal cord injury he sustained in 2007 has shown he can communicate his thoughts, thanks to a brain implant system that translates his imagined handwriting into actual text.

 

The device – part of a longstanding research collaboration called BrainGate – is a brain-computer interface (BCI) that uses artificial intelligence (AI) to interpret signals of neural activity generated during handwriting.

In this case, the man – called T5 in the study, and who was 65 years of age at the time of the research – wasn’t doing any actual writing, as his hand, along with all his limbs, had been paralyzed for several years.

But during the experiment, reported in Nature earlier in the year, the man concentrated as if he were writing – effectively, thinking about making the letters with an imaginary pen and paper.

As he did this, electrodes implanted in his motor cortex recorded signals of his brain activity, which were then interpreted by algorithms running on an external computer. The algorithms decoded T5’s imaginary pen trajectories as he mentally traced the 26 letters of the alphabet and some basic punctuation marks.

“This new system uses both the rich neural activity recorded by intracortical electrodes and the power of language models that, when applied to the neurally decoded letters, can create rapid and accurate text,” says first author of the study Frank Willett, a neural prosthetics researcher from Stanford University.

 

Similar systems developed as part of BrainGate have been transcribing neural activity into text for several years, but many previous interfaces relied on different cerebral metaphors for selecting characters – such as point-and-click typing with a computer cursor controlled by the mind.

It wasn’t known, however, how well the neural representations of handwriting – a more rapid and dexterous motor skill – might be retained in the brain, nor how well they might be leveraged to communicate through a BCI.

Here, T5 showed just how much promise a virtual handwriting system could offer for people who have lost virtually all independent physical movement.

A diagram of how the system works. (F. Willett et al., Nature, 2021, Erika Woodrum)

In tests, the man was able to achieve writing speeds of 90 characters per minute (about 18 words per minute), with approximately 94 percent accuracy (and up to 99 percent accuracy with autocorrect enabled).

Not only is that rate significantly faster than previous BCI experiments (using things like virtual keyboards), but it’s almost on par with the typing speed of smartphone users in the man’s age group – which is about 115 characters or 23 words per minute, the researchers say.
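The character-to-word conversion behind these figures follows the standard typing-speed convention of five characters per “word”; a trivial sketch (the function name is ours, not from the study):

```python
def chars_to_wpm(chars_per_min, chars_per_word=5):
    """Convert characters per minute to words per minute."""
    return chars_per_min / chars_per_word

assert chars_to_wpm(90) == 18.0    # the participant's rate
assert chars_to_wpm(115) == 23.0   # typical smartphone rate for his age group
```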

 

“We’ve learned that the brain retains its ability to prescribe fine movements a full decade after the body has lost its ability to execute those movements,” Willett says.

“And we’ve learned that complicated intended motions involving changing speeds and curved trajectories, like handwriting, can be interpreted more easily and more rapidly by the artificial-intelligence algorithms we’re using than can simpler intended motions like moving a cursor in a straight path at a steady speed.”

Basically, the researchers say that alphabetical letters are very different from one another in shape, so the AI can decode the user’s intention more rapidly as the characters are drawn, compared to other BCI systems that don’t make use of dozens of different inputs in the same way.

The man’s imagined handwriting, as interpreted by the system. (Frank Willett)

Despite the potential of this first-of-its-kind technology, the researchers emphasize that the current system is only a proof of concept so far, having only been shown to work with one participant, so it’s definitely not a complete, clinically viable product as yet.

The next steps in the research could include training other people to use the interface, expanding the character set to include more symbols (such as capital letters), refining the sensitivity of the system, and adding more sophisticated editing tools for the user.

There’s still plenty of work to be done, but we could be looking at an exciting new development here – one that restores the ability to communicate to people who have lost it.

“Our results open a new approach for BCIs and demonstrate the feasibility of accurately decoding rapid, dexterous movements years after paralysis,” the researchers write.

“We believe that the future of intracortical BCIs is bright.”

The findings are reported in Nature.

 


Science News

Astronomers Detect a ‘Tsunami’ of Gravitational Waves. Here’s Where They’re Coming From

The most recent gravitational wave observing run has netted the biggest haul yet.

In less than five months, from November 2019 to March 2020, the LIGO-Virgo interferometers recorded a massive 35 gravitational wave events. On average, that’s almost 1.7 gravitational wave events every week for the duration of the run.

 

This represents a significant increase on the 1.5-event weekly average of the previous run, and brings the total number of events to 90 since the first history-making gravitational wave detection in September 2015.
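The weekly rate quoted above is easy to verify. A quick check, assuming run dates of 1 November 2019 to 27 March 2020 (the published O3b observing window; treat the exact dates as an assumption here):

```python
from datetime import date

# Duration of the observing run in weeks, then events per week.
run_weeks = (date(2020, 3, 27) - date(2019, 11, 1)).days / 7
rate_per_week = 35 / run_weeks

print(round(run_weeks, 1), round(rate_per_week, 2))  # 21.0 weeks, 1.67 per week
```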

“These discoveries represent a tenfold increase in the number of gravitational waves detected by LIGO and Virgo since they started observing,” said astrophysicist Susan Scott of the Australian National University in Australia.

“We’ve detected 35 events. That’s massive! In contrast, we made three detections in our first observing run, which lasted four months in 2015-16. This really is a new era for gravitational wave detections and the growing population of discoveries is revealing so much information about the life and death of stars throughout the Universe.”

Of the 35 new detections, 32 are most likely the result of mergers between pairs of black holes. This is when pairs of black holes on a close orbit are drawn in by mutual gravity, eventually colliding to form one single, more massive black hole.

That collision sends ripples through space-time, like the ripples generated when you throw a rock in a pond; astronomers can analyze those ripples to determine the properties of the black holes.

mergersAn infographic showing the masses of all black hole mergers announced to date. (LIGO-Virgo/Aaron Geller/Northwestern University)

The data revealed a range of black hole masses, with the most massive clocking in at around 87 times the mass of the Sun. That black hole merged with a companion 61 times the mass of the Sun, resulting in a single black hole of 141 solar masses – the missing seven solar masses were radiated away as gravitational wave energy. That event is named GW200220_061928.

Another merger produced a black hole 104 times the mass of the Sun; both of these are considered intermediate mass black holes, a mass range between 100 and around a million solar masses, in which very few black holes have been detected.

 

GW200220_061928 is also interesting because at least one of the black holes involved in the merger falls into what we call the upper mass gap. According to our models, black holes over about 65 solar masses can’t form from the collapse of a single star, the way ordinary stellar-mass black holes do.

That’s because the precursor stars are so massive that their supernovae – known as pair-instability supernovae – ought to completely obliterate the stellar core, leaving nothing behind to gravitationally collapse into a black hole.

This suggests that the 87 solar mass black hole might be the product of a previous merger. GW200220_061928 isn’t the first detection to involve a black hole in the upper mass gap, but it does suggest that hierarchical black hole mergers are not uncommon.

Another event includes an object in the lower mass gap – the range between about 2.5 and 5 solar masses, where we’ve not conclusively found either a neutron star (above the lower bound) or a black hole (below the upper bound). The event, named GW200210_092254, involved an object clocking in at 2.8 solar masses; astronomers have concluded that it’s probably a very small black hole.

 

“Looking at the masses and spins of the black holes in these binary systems indicates how these systems got together in the first place,” Scott said.

“It also raises some really fascinating questions. For example, did the system originally form with two stars that went through their life cycles together and eventually became black holes? Or were the two black holes thrust together in a very dense dynamical environment such as at the centre of a galaxy?”

The other three events out of the 35 involved a black hole and something else much less massive, likely a neutron star. These events are of great interest to astronomers, since they might reveal the stuff that’s inside a neutron star – if we ever detect a merger that also emits light. By finding more of these mergers, we can start to build a better understanding of how they actually occur.

“Only now are we starting to appreciate the wonderful diversity of black holes and neutron stars,” said astronomer Christopher Berry of the University of Glasgow in the UK.

“Our latest results prove that they come in many sizes and combinations – we have solved some long-standing mysteries, but uncovered some new puzzles too. Using these observations, we are closer to unlocking the mysteries of how stars, the building blocks of our Universe, evolve.”

The team’s paper has been submitted for publication, and can be found on preprint server arXiv.

 
