This is a summary of my notes from the Clear Skies, Clear Minds geoengineering talk at UChicago this past Thursday (February 13, 2025). This was an event open to the public. I covered my own travel and accommodations to attend.
The event page is here: https://www.lib.uchicago.edu/conferences/clear-skies-clear-minds-harnessing-open-research-to-find-climate-solutions/.
There were three 1-hour sessions during the event.
- Professor Tiffany Shaw of the Geophysical Sciences department gave a climate change overview talk: “Information versus knowledge in climate change prediction.”
- Vice Provost for Research Erin Adams and Professor David Keith held a fireside chat regarding climate systems geoengineering: the cost of inaction, four dimensions of available action, and some tradeoffs.
- Erin Adams moderated a discussion with David Keith and University Librarian Rachael Kotarski on the state of climate research and open science.
Professor Shaw’s talk included a summary of the past ~200 years of climate research showing increasingly clear evidence that global climate change is a result of human industrial activity, primarily the emission of carbon dioxide by burning fossil fuels. Sources are cited throughout in Prof. Shaw’s slides.
We arrive in the modern era of digital modeling, notably by Manabe & Wetherald, focusing on several of their key papers (1967, 1975, 1989, 2017) leading up to the 2021 Nobel. Breaking down the changes between the papers, Shaw traces the refinements that brought improved predictive power to climate models. Some of the models’ limitations are noted: predicting CO2 concentration is much easier than predicting cloud coverage.
Shaw points out that the development of these models leads to finer-grained predictions, but maintains global consistency: legacy & modern models with CO2 set to pre-industrial levels do not show warming.
The specific axes along which M&W refined their models hint at potential mitigation strategies: stratospheric aerosols, solar radiation modification (SRM), surface albedo modification.
Predictions that were later confirmed observationally include stratospheric cooling, albedo effects, transient cooling of the southern Atlantic, and polar heating. Manabe & Wetherald’s 1967 paper also helps contextualize heavier rainfall intensity: a ~7% increase in the moisture-carrying capacity of air per kelvin of warming disproportionately affects regions with high relative humidity (e.g. Europe or the tropics) vs. deserts with little humidity.
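The ~7% per kelvin figure follows from the Clausius–Clapeyron relation. A minimal sketch, using standard textbook constants (mine, not the talk’s):

```python
# Fractional increase in saturation vapor pressure per kelvin of warming,
# from the Clausius-Clapeyron relation: d(ln e_s)/dT = L_v / (R_v * T^2).
L_v = 2.5e6    # latent heat of vaporization of water, J/kg
R_v = 461.0    # specific gas constant for water vapor, J/(kg K)
T = 288.0      # representative near-surface temperature, K (~15 C)

fractional_increase_per_K = L_v / (R_v * T**2)
print(f"{fractional_increase_per_K:.1%} per kelvin")  # ~6.5%, i.e. the "about 7%" quoted
```

At warmer temperatures the fraction is slightly smaller, which is why it is usually quoted as "roughly 7% per degree."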
Continuing the development, more recent models show the consequences of climate change differ in directionality across the globe. Chicago is already seeing reduced cold weather intensity. The southwestern US is expected to see increased drying & heating, but parts of Europe and the global south are expected to see more intense rainfall & storms.
Shaw sees opportunities for further improvement in the models, with different coupling strategies potentially giving better insight into convection & energy transport. GPU-accelerated codes and machine learning could play a role for climate, as they have in local weather forecasting, but there’s no guarantee they’ll be as helpful at the global level. Other areas for improvement: ocean circulation, cloud distributions, and atmosphere-land and atmosphere-ocean coupling strategies. Scientists should treat failed predictions and model discrepancies as opportunities to improve those models’ reliability.
Prof. Shaw closed the talk by contextualizing information and knowledge gaps vs. action. Scientists are often tasked with understanding nature and tracking down ways to improve knowledge about nature. Action needs to be taken to address climate change, but political will and public support are required to drive those actions. Scientific knowledge itself doesn’t dictate policy: that responsibility falls on politicians and the voting public.
During the Q&A, I asked what could make “private SRM deployments,” e.g. Make Sunsets or Stardust Solutions, useful on a scientific basis. Response: the preference within academia is that private industry should not take direct action. However, if they do so anyway, it’s critical that (a) open data on deployment is published, not kept secret or left undocumented; and (b) efforts are made to measure meaningful signals above the local atmospheric noise floor. It isn’t just companies: countries that have seeded clouds in decades-long programs are reluctant to share data on scale or efficacy. A fundamental obstacle to geoengineering experimentation is that, without a control group, we can’t necessarily detect a statistically significant signal.
After a ten-minute break, the fireside chat (Keith / Adams) began, with extensive Q&A throughout. A wood-burning fireplace was projected in the background.
The cost of inaction was a focal point. Climate change has real consequences; concern about potential side effects of action doesn’t justify not taking action to prevent primary harms.
David Keith brought up four dimensions along which humans can take action to address climate change. First: decarbonization is necessary to prevent further escalation of the climate crisis. Second, carbon removal will be necessary in the long term, as the magnitude of climate change is driven by cumulative emissions. Third, solar geoengineering could be used to break the link between CO2 and climate — temporarily. The last dimension mentioned was local adaptation, e.g. the use of levees to deal with sea level rise.
Axes #1 and #2 differ substantially from #3. Decarbonization is a productive, economically useful activity happening at a large scale now (around $1.7 trillion / year). Direct carbon removal is happening at a much smaller scale, less than 1% of decarbonization.
Prof. Keith’s intuition is that most effort (about 95-98%) should be focused on decarbonization rather than direct carbon removal while we are still above net-zero. Why capture carbon from the atmosphere at great expense while easy emissions-reducing solutions (e.g. solar) have yet to displace the major contributors to emissions (fossil fuels)?
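For scale, the two spending figures above can be put side by side. A back-of-the-envelope sketch; the "<1%" figure is an upper bound quoted in the talk, not a precise number:

```python
# Rough scale comparison of decarbonization vs. direct carbon removal spending,
# using the figures quoted in the talk.
decarbonization_usd_per_year = 1.7e12   # ~$1.7 trillion / year
removal_fraction_upper_bound = 0.01     # "less than 1% of decarbonization"

removal_upper = decarbonization_usd_per_year * removal_fraction_upper_bound
print(f"direct carbon removal: under ${removal_upper / 1e9:.0f}B / year")
# i.e. under ~$17 billion / year, against ~$1.7 trillion for decarbonization
```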
Keith aligns with Shaw: commercial actors should not be taking the lead on solar geoengineering, especially since deployment costs are already low enough that unilateral action is feasible.
Keith, referring to my earlier question on SRM studies, clarified that SCoPEx was a research project to measure the mixing properties of dust particles in the air column, and not a test deployment of SRM or aerosol injection. From natural experiments and climate models, the mechanics and scaling of stratospheric aerosols are well-understood. What SCoPEx aimed to find was specific (nerdy) technical answers to fluid dynamics questions.
By contrast, Prof. Keith finds efforts like Make Sunsets and Stardust Solutions mostly pointless. They don’t share data on their climate modification efforts publicly and they aren’t doing observational tests with a control group. Their efforts are too small to be measurable and not scalable. The value of these companies to future geoengineering is not apparent.
Prof. Keith discussed how his personal beliefs on the value of SRM have changed. In the 1990s, some individuals had proposed stratospheric sulfates as an easy solution to global warming, but it wasn’t clear that the effects would be hemispherically balanced or predictable. Keith worked with Caldeira around 1998 and determined that cooling effects would be balanced as long as deployment of sulfates was also balanced. In Keith’s words, there are no studies that show disproportionate downsides to hemispherically balanced, gradual stratospheric aerosol injection.
Prof. Keith continued on the subject of geoengineering. Environmental activists often protest stratospheric aerosols as a “false solution,” but it is a real one: natural experiments like Pinatubo & simulations show that it would work. Sulfur is “a devil we know,” vs. one we don’t. A figure was mentioned: 1 million tons of sulfur in the stratosphere per year could cause 10,000 sulfur-related deaths, but could prevent approximately 500,000 deaths per year from climate change. Keith indicated that 1 MT/yr is small compared to the roughly 100 million tons of sulfur humanity already emits into the atmosphere each year.
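Taking the quoted mortality figures at face value (they were given verbally, so treat them as order-of-magnitude), the implied tradeoff is roughly:

```python
# Order-of-magnitude tradeoff implied by the mortality figures quoted in the talk.
sulfur_deaths_per_year = 10_000            # attributed to 1 MT/yr of stratospheric sulfur
climate_deaths_averted_per_year = 500_000  # climate deaths potentially prevented per year

ratio = climate_deaths_averted_per_year / sulfur_deaths_per_year
print(f"deaths averted per death caused: ~{ratio:.0f}x")  # ~50x on these figures
```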
If Prof. Keith were forced to choose between banning SAI forever or immediately deploying it at a large scale, he would choose to ban it. Immediately deploying geoengineering technology on a massive scale would be catastrophic. Fortunately, he points out, we don’t have this false choice; we can choose a moderate, careful approach to deployment.
The conversation turned to ethics (procedural justice and distributive justice). These are a mixed bag in the context of SAI. Industrialization, which released the CO2, benefited wealthy countries economically, while the harms of global warming fall disproportionately on poorer countries. CO2 emissions also warm colder countries, potentially increasing their economic output. Stratospheric aerosol injection has the opposite effect: cooling hot countries reduces mortality from heat waves and extreme weather. SAI seems to invert the harm/benefit mapping of CO2 release.
In the last half hour of the panel discussion, Q&A with audience members ranged across a few topics in direct carbon removal and decarbonization.
- Direct air capture may be economically viable in the range of $100/ton of CO2. It is relatively easy to validate: look at the inputs and at the outputs that are buried underground.
- Reforestation-based carbon removal is harder to assess & prone to fraud. 99% of human-released carbon in the atmosphere came from the geosphere (fossil fuels). Storing that in the biosphere (trees) is not a solution because it can return to the atmosphere. It should go back to the geosphere.
- Building technologies like double-pane windows: around $1000/ton. They can still be useful for individuals, but are not globally relevant.
- Capturing CO2 from a fossil fuel plant is cost prohibitive. A figure quoted by the audience member: on an EPA test plant, capturing 1% of CO2 emissions cost as much as capturing all NOx/SO2 emissions.
- Coal power generation should be phased out as soon as possible.
- Concrete manufacturing, ammonia production, steelmaking are opportunities for point-source carbon capture.
- Solar generation is economically viable though batteries are needed to satisfy nighttime power consumption.
- Olivine-based rock weathering risks contaminating soil with chromium.
- Ocean alkalinity enhancement could be productive for carbon capture, but data & planning needs to be open.
- AI datacenters represent around 1% of global emissions and are not likely to play a major role in the future. Some technology companies may have overemphasized the value of scale.
- In general, stratospheric aerosol injection is not taken seriously enough.
Prof. Keith briefly discussed markets & business. Secrecy around inputs and outputs for decarbonization and carbon removal is counterproductive. On the other hand, concealing internal trade secrets is reasonable. For example, the inputs and outputs of photovoltaic production are measurable: sand, chemicals, and energy are converted into cells and panels, then sold at some cost. Whatever technology is used to achieve that output is irrelevant, since the photovoltaic cells’ performance is trivially measurable. Private industry can achieve impressive gains when it can iterate internally.
Regulation, in general, happens when harms are noticed. EPA regulations are specific about restricting the amount of sulfur emitted by the burning of aviation fuel. At present, there’s no obvious legal obstacle to flying a huge aircraft into the stratosphere and releasing sulfur directly, as long as you adhere to rules around fuel emissions.
The final session, a roundtable, focused on open science & communications. The difference between knowledge and action, or science and policy, was central. Rachael Kotarski shared slides.
Recent actions by the Trump administration may change the allowable overhead fraction in federal grants and could cut funding to organizations like the NIH and NSF by 2/3. The panelists covered this in different ways. Rachael Kotarski pointed out how overhead helps support libraries, data archival systems, and outreach events that keep the public engaged. Given that trust in the academic sector is at an all-time low, some panelists shared that it’s frustrating but unsurprising that these proposals haven’t elicited serious outcry from the American public.
A poll from Pew Research showed that most Americans see research scientists as “intelligent,” but less than half see them as “good communicators.” 60% of Americans polled think that universities are for-profit, while in reality most are non-profit. Misunderstanding of this dynamic leads some members of the general public “to want to burn it all down and rebuild from scratch.” The Ivy League is a major recipient of this antipathy. Prof. Keith discussed the corrosive effect of entrenched legacy admissions.
The panelists shared frustration around these results. Promoting research is a full-time job in itself, but university communication departments can’t make research an interesting story without editorializing to some degree. Prof. Shaw shared that communicating science at a kindergarten level is a lot to ask of tenured faculty. Plain language summaries in papers may help, but are ultimately not aimed at policy-makers or the general public.
Prof. Keith shared frustration with the bureaucracy inherent in universities and funding. Federal grants are structured in such a way that researchers have to know what their results will be before they do their experiments. This is an obstacle to real discovery & innovation.
Keith mentioned how the team behind DeepMind’s protein folding tool, AlphaFold (including John Jumper), only took shape & made inroads after it outgrew academia. Joining Google and gaining access to the computing infrastructure and novel algorithms available within the private sector was key to its success. This sort of groundbreaking work rarely seems to happen entirely within academia.
Rachael Kotarski shared how the system differs in the UK; a 4-5 year Research Excellence cycle allows periodic re-evaluation of priorities & effectiveness.
I asked a final question on the impact of changes to government funding. Will privately-funded research have a greater leverage over research institutions given new restrictions on federal grant money, and would this have an impact on open science given the lack of stipulations around data release? The answer was a general yes. Corporate-funded research studies sometimes uncover results that the funders prefer to be left unpublished. This can leave researchers empty-handed.
Many thanks to the speakers and the University of Chicago Library & Department of Geophysical Sciences for sharing their insights & work in a public forum!