In 2007, the Intergovernmental Panel on Climate Change released its fourth assessment report based on a new group of climate models, the value of which earned the IPCC the Nobel Peace Prize. The new models—collectively referred to as CMIP3—revealed information about climate change that sealed the deal on a number of lingering questions: Is global temperature increasing? Yes. Is it caused by humans? Almost certainly. But one major question—perhaps the most important one of all—remained. What should we do about it?

To answer that question, stakeholders would need more locally relevant information on much shorter time scales. So when the next generation of climate models was released in 2012, scientists had high hopes. These models included more physical, chemical, and biological processes, often considered at a much finer grain than the previous models. Surely CMIP5, as the new ensemble of models is called, would provide credible projections at the scales relevant to stakeholders.

But a new study from the lab of Northeastern University associate professor of civil and environmental engineering Auroop Ganguly tells a different story. According to the team's analysis, which will be published in the journal Climate Dynamics, the CMIP5 models don't do a better job. In fact, some projections are even worse than those of the CMIP3 models released in 2007.

"Some of the models say an area will be wetter in the future, others say it'll be drier," said Devashish Kumar, a doctoral candidate in Ganguly's Sustainability and Data Sciences laboratory and lead author on the paper. "So which is it, and which should we plan for?"
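The kind of disagreement Kumar describes is often summarized by checking how many models in an ensemble even agree on the sign of a projected change. A minimal sketch of that check, using entirely made-up projection values for a single grid cell (not data from the study):

```python
import numpy as np

# Hypothetical projected changes in annual precipitation (mm/yr) for one
# grid cell, one value per climate model in an ensemble. Positive means
# wetter, negative means drier. These numbers are illustrative only.
projections = np.array([45.0, -12.0, 30.0, -8.0, 52.0, -20.0, 15.0])

# One common summary of disagreement: what fraction of models agree
# with the sign of the ensemble-mean change?
mean_change = projections.mean()
agreement = np.mean(np.sign(projections) == np.sign(mean_change))

print(f"ensemble mean change: {mean_change:.1f} mm/yr")
print(f"fraction of models agreeing on sign: {agreement:.2f}")
```

Here the ensemble mean says "wetter," yet barely half the models agree on even the direction of change, which is exactly the planning dilemma Kumar points to.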

Professor Auroop Ganguly (center) and his students (from left) Debasish Das, Evan Kodra, Poulomi Ganguli, Devashish Kumar, Rachindra Mawalagedara, Saeed Zabet, David Wang and Babak Fard inside the SDS Lab. Photo by Mariah Tauger.

A handful of earlier studies made similar claims prior to the release of the CMIP5 results. But unlike Northeastern's study, most of that work was anticipatory, since the actual data weren't yet available.

While the results are sobering for a field that has, in recent years, relied on the accuracy of its models to make its most profound statements about climate impacts, Ganguly's team does not think climate modeling has necessarily reached its end. It's just that "these models may not be able to keep pace with the urgency with which the stakeholders require the problem to be solved," he explained.

That's because of longstanding gaps in our understanding of the science in areas such as cloud physics and ocean-land-atmosphere interactions, Ganguly said. While our understanding in these areas may eventually improve, that timeline is too slow for policymakers to wait on. Additionally, climate and earth system models have intrinsic variability, which may never be overcome "but still need to be characterized," said Ganguly.

Instead, he is advocating for the use of Big Data tools to advance the field. Between observed data from various types of sensors stationed around the globe and even in space, and the data coming from the simulation models themselves, climate scientists are treading water in a veritable flood of data.

But climate is a complex field. A change in one variable can cause ripple effects throughout the entire system, a phenomenon that poses challenges to the scientists studying it. For that reason, Ganguly said, the data deluge is only being used in pockets of the field.

For instance, researchers interested in rainfall extremes look at rainfall data itself as well as data they already know may influence these extremes, such as specific atmospheric and sea surface temperatures. However, a vast treasure trove of complex data—which could also yield important insights—lies unexplored. As Ganguly put it, "We need to look at it all as an integrated whole."
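The difference between screening only known drivers and looking at variables jointly can be sketched with synthetic data. In this illustration (not the team's method), annual rainfall extremes depend on a familiar sea surface temperature index and on a second, hypothetical "unexplored" variable; a joint least-squares fit recovers both influences:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 30-year records: an annual-maximum rainfall series (mm)
# plus two candidate indicators -- a sea surface temperature index
# already known to matter, and an unexplored variable with real signal.
n_years = 30
sst_index = rng.normal(0.0, 1.0, n_years)
unexplored = rng.normal(0.0, 1.0, n_years)
noise = rng.normal(0.0, 0.5, n_years)
rain_max = 80.0 + 6.0 * sst_index + 4.0 * unexplored + noise

# Screening each candidate one at a time, as is common practice:
for name, x in [("sst_index", sst_index), ("unexplored", unexplored)]:
    r = np.corrcoef(x, rain_max)[0, 1]
    print(f"correlation of {name} with rainfall extremes: {r:.2f}")

# Treated jointly, a simple least-squares fit recovers both influences:
X = np.column_stack([np.ones(n_years), sst_index, unexplored])
coeffs, *_ = np.linalg.lstsq(X, rain_max, rcond=None)
print(f"fitted coefficients (intercept, sst, unexplored): {coeffs.round(1)}")
```

The point of the sketch is Ganguly's: a variable nobody thought to include can carry as much predictive signal as the usual suspects, and only an integrated look reveals it.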

In another upcoming paper, which will appear in the journal Nonlinear Processes in Geophysics, Ganguly and doctoral candidate Evan Kodra collaborated with a large team of climate scientists, hydrologists, computer scientists, and statisticians to further address this question. This work is part of a multi-institution, five-year, $10 million grant from the National Science Foundation.

While the researchers believe models will remain invaluable to the field, they argue that an optimal blend of data-driven insights and physical understanding—beyond what may be easily captured within the current generation of climate models—may be a path forward. For this to happen, Big Data tools need to be customized for complex climate data and optimized for extremes, to characterize what may ultimately be small data drawn from possibly elusive indicators.
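One toy version of blending data-driven insight with physical understanding is a learned bias correction that is then forced to respect a physical constraint. This sketch is purely illustrative, with invented numbers, and is not the approach from either paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: a climate model systematically overestimates daily
# rainfall at a location, and paired observations let us learn a correction.
model_rain = rng.uniform(0.0, 10.0, 50)  # model output (mm/day)
obs_rain = np.maximum(0.8 * model_rain - 1.0 + rng.normal(0.0, 0.3, 50), 0.0)

# Data-driven step: learn a linear bias correction from the observations.
A = np.column_stack([np.ones_like(model_rain), model_rain])
(b0, b1), *_ = np.linalg.lstsq(A, obs_rain, rcond=None)

# Physics-guided step: rainfall cannot be negative, so corrected
# projections are clipped to honor that constraint.
future_model = np.array([0.2, 1.0, 5.0, 9.0])
corrected = np.clip(b0 + b1 * future_model, 0.0, None)
print(f"corrected projections (mm/day): {corrected.round(2)}")
```

The statistical fit supplies what the model gets wrong, and the physical constraint keeps the correction from producing impossible values; Ganguly's "physics-guided data mining" aims at a far richer version of that same division of labor.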

"Ultimately, future projections will have to be based on models," said Ganguly. "And actionable predictive insights will need to be generated based on such projections. But physics-guided data mining may need to work in tandem to inform stakeholder decisions."