The notion of time travel has been debunked as often as it has been explored, or more, but scientists in Boulder have brought the concept closer to reality, at least as it applies to the weather.
And in this case, their project should also provide a window into the future.
Through the 20th Century Reanalysis Project, or the less unwieldy 20CRv3, researchers at the National Oceanic and Atmospheric Administration in Boulder have developed the ability to look back in time with much greater clarity, probing new questions about meteorological events far in the rearview mirror with the aim of improving the prediction of weather events to come.
What NOAA scientists have produced is an update to a weather “time machine” in development since 2011. The new, third version of their brainchild is a highly complex, four-dimensional, high-resolution reconstruction of the global climate, estimating what the weather was for every day back to 1836.
The project’s latest iteration yields continuous estimates of the “most likely” state of the global atmosphere’s weather on 75-kilometer grids eight times a day for the past 180 years, according to a news release. It’s the product of an international effort spearheaded by scientists at NOAA’s Physical Sciences Division and the University of Colorado Boulder’s Cooperative Institute for Research in Environmental Sciences, and is supported by the U.S. Department of Energy.
By using NOAA’s Global Forecast System, researchers were able to reconstruct the global atmosphere by leveraging surface pressure readings, along with sea ice and sea temperature observations, drawn from archival records, some of which were transcribed by citizen volunteers.
With that data as the foundation, the model produces estimates of moisture, temperature, winds, pressure, clouds and solar radiation.
“We want to be able to compare the statistics of what has happened to the statistics of our climate models,” said Gil Compo, a CIRES scientist working at NOAA who leads the reanalysis project.
“If we can have confidence that the climate models can represent how extreme weather and storms and weather patterns have changed, then we should be able to have more confidence in how extreme weather and storms and weather patterns will change in the future.”
The latest iteration of the tool, 20CRv3, uses millions more observations than previous versions of the reanalysis, especially from earlier periods. It includes up to 25% more available observations from before 1930. Running the model and crunching so much data called for extraordinary computing resources: the Department of Energy contributed 600 million CPU hours to process 21 million gigabytes of data at the National Energy Research Scientific Computing Center, according to a news release.
“There are three points behind the philosophy of what we do this for,” Compo said. “The first is to be able to describe what has happened. The second is to be able to compare what is happening today to what was happening in the past. Has something changed?”
A good local example of that, he said, is posed by the September 2013 Colorado Front Range flood.
“How different is the meteorology and the weather patterns for the September flood from the other big floods we know about, like May 1894? Was it basically the same thing, or was it affected by some kind of forcing factors, like changing atmospheric composition, CO2, or some sea surface temperature patterns in the Pacific?” Compo said. “… And then the last thing is, is our data set good enough to be able to evaluate changes in storms and weather patterns from climate models, with the goal of being better able to predict them?”
Critical ships’ logs
The meteorological time machine has helped researchers fine-tune their scientific understanding of specific weather events over the course of history, such as the Great Blizzard of 1888, which paralyzed the East Coast, or the epic Midwestern winter of 1880-1881, captured in historical fiction for children by Laura Ingalls Wilder in “The Long Winter.”
“I’ve found reanalysis composites incredibly useful and accessible,” Barbara Mayes Boustead, a meteorologist and instructor with the National Weather Service who has studied that winter, said in a statement. “We introduce them in our course on operational climate services and to weather forecast offices that want to investigate climate events like the El Niño Southern Oscillation.”
The main data point utilized by the scientists is barometric pressure. Compo said barometers were invented in 1644 by Italian physicist and mathematician Evangelista Torricelli, a student of Galileo, and were in consistent use by mariners across the globe by the end of the 18th century. Mariners’ logs are an example of the data feeding this very 21st-century project, and he cited the heroic recovery of the trip logs from the doomed 1879-1881 bid by the Jeannette to reach the North Pole, which ended with the ship’s destruction by the ice and the death of 20 of its 33 crew members.
The fingerprints of as many as 20,000 people are on some aspect of the project, Compo said, and he offered an insight into the myriad contributions that have been made.
Dating back to the beginning of the 1800s, “Almost every ship had a barometer and particularly for Navy and the merchant ships and some whaling ships, they recorded that barometer every hour, or every two hours, or four times a day, and those recordings were in log books,” Compo said.
“And those log books have been archived around the world. And so what my colleagues do in the Atmospheric Circulation Reconstructions Over the Earth, is they work with historians and archivists to find those log books, and then they scan them. And we’ve been very fortunate that the public has responded to the fact that computers can’t read the log books. And so citizen scientists have helped to transcribe the log books” as well as weather stations’ old records.
The next step in the development of the reanalysis tool is to extend its reach all the way back to 1806, because Compo said 1804 is the first year for which scientists can access a barometric pressure reading “from somewhere in the world” at a minimum of every six hours.
The 20CRv3 has not yet been applied to Boulder County’s 2013 flood, but Compo said that’s near the top of his “to-do list,” as he still remembers that deluge for the way the water in his basement in Louisville “kept coming up and up and up.”