
Richard Flagan


Irma and Ross McCollum-William H. Corcoran Professor of Chemical Engineering and Environmental Science and Engineering

By David Zierler, Director of the Caltech Heritage Project

February 11, February 18, February 25, March 7, March 23, April 1, 2022


DAVID ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It is Friday, February 11, 2022. I am delighted to be here with Professor Richard C. Flagan. Rick, it's great to see you. Thank you for joining me today.

RICHARD FLAGAN: Glad to be here.

ZIERLER: To start, would you tell me your title and affiliations here at Caltech?

FLAGAN: I'm the Irma and Ross McCollum-William H. Corcoran Professor of Chemical Engineering and Environmental Science and Engineering.

ZIERLER: There's a lot there in that title. Let's start, first, with the names. What is the circumstance of having it jointly named between the McCollums and Corcoran?

FLAGAN: The McCollums are the people who endowed the professorship. Bill Corcoran was, I think, a mentor and a friend who it seemed appropriate to add to the name. The fund had actually grown to the point where one professorship became two.

ZIERLER: Now, do you have any special connection with your research with either the McCollums or Corcoran? Do you see any connections there?

FLAGAN: With Corcoran, definitely. Not at present, but when I started at Caltech, the first projects I did had to do with pollutant formation in coal combustion. Bill Corcoran was doing a lot of work with coal and refining in general at the time. In fact, when Corcoran passed away, I became the advisor of the students he had at that point.

ZIERLER: In terms of your affiliations, is that a dual appointment, or is one your home division, and you have a courtesy appointment within Caltech?

FLAGAN: When I first came to Caltech, I was hired in what was then environmental engineering science in the Division of Engineering and Applied Science. As my career developed, I found new research areas where I could apply what I had been doing to problems that were not directly environmental. I started out looking at particles as they were formed in combustion systems, and ultimately got involved in looking at particle formation in photochemical smog. But then, I also got involved, through a project with JPL, in the use of aerosols in the refining of silicon, and that was to try to make a very inefficient process more efficient, to make photovoltaics practical on a large scale. Chemical Engineering was a logical home for both, so I actually switched divisions into Chemical Engineering. Later, the Environmental Engineering program shifted its focus from the engineering side, which was very heavily focused on water at the time, to include much more of the science; it changed its name to Environmental Science and Engineering and moved out from EAS to span EAS, CCE, and GPS, looking at scales from local to global. At that point, I re-formalized my affiliation with that program.

ZIERLER: It sounds like the trajectory of your research career really is a microcosm for the way that you can do interdisciplinary research at a place like Caltech, where the administrative walls are really not that high.

FLAGAN: That is one of the beauties of Caltech, no brick walls. And not just for seismic reasons.

ZIERLER: In terms of service work, the committees you sit on, the students you advise, where is your home department or division at Caltech now?

FLAGAN: My primary home is Chemical Engineering.

ZIERLER: I'd like to ask some questions about disciplines and fields, given all of the research that you traverse. There's so much there. At the end of the day, what would be the umbrella title for which all of your research falls? Would it be chemical engineering, chemistry, environmental science, fluid dynamics? Where do you see yourself?

FLAGAN: The common thread through everything is aerosols, particles in gases. That was a field that I had done nothing in before I joined the faculty here. During my interview, I was introduced to that field, and it seemed like a natural new direction to go. When I joined the faculty, I actually changed fields from focusing on combustion, the gas-phase chemistry, the turbulent reacting flows, to focusing on aerosols. But aerosols is a very broad term, and it applies in many different domains, and I've spanned a lot of them.

ZIERLER: Was it Corcoran who was instrumental in this new field that you pursued?

FLAGAN: No, it was two people. One was a professor who left Caltech a few years after I joined, Sheldon Friedlander, who moved over to UCLA, and the other was John Seinfeld, with whom I still collaborate very closely to this day.

ZIERLER: Let's do some nomenclature now. First of all, what is the most commonly accepted definition of aerosol? What are aerosols?

FLAGAN: Aerosols are particles suspended or entrained in a gas.

ZIERLER: Are there size limitations for which things are either too large or small to be an aerosol?

FLAGAN: Aerosols refer to particles that remain in the gas for some extended period of time. Exactly what that time is varies from system to system. We can look at particles as they're being formed directly from gas molecules, and there, we are looking at particles down to one nanometer in size and even below. If we start looking at clouds, some would dispute whether you'd call a cloud an aerosol; it is described by the same physics, though it introduces some physics that's not present for smaller particles because of the large gravitational effects. But we could span that range, from nanometer sizes out to approaching millimeter sizes.

ZIERLER: What are the delineations in your research agenda in terms of studying aerosols that are human-made versus those naturally occurring?

FLAGAN: My work really includes both. It depends upon the problem that we're looking at. I've dealt with designing chemical processes to make interesting and useful materials, examples being the refining of silicon or the making of special kinds of memory devices based upon aerosol nanoparticles. In the work that I did with Harry Atwater, we developed some very interesting processes for that. Those are strictly manmade. When we talk about what's out in the atmosphere, when we talk about urban air pollution, it's a mix. We have the particles that are produced by sources, by diesel trucks, factories, all sorts of industrial processes. But we also have particles that form as a result of gas-phase chemistry. We have chemical reactions going on in the atmosphere in the Los Angeles Basin, photochemical reactions. The sunlight really drives a lot of that chemistry and produces and grows particles. That includes both particles that are formed from materials that are emitted by man, by people in general, to the atmosphere, but it also includes particles that are produced by vapors that are present in the atmosphere naturally. And sometimes, it's a mix of both.

ZIERLER: A question in historical context. How far back, roughly speaking, does aerosol research go as a distinct field of study?

FLAGAN: In terms of the directed research on looking at particles in gases, particularly with regard to particles in ambient air, quantitative studies go back to the very early 19th century, maybe even a little bit before. In terms of observations of particles and their effects in the environment, it goes back much further. You see depictions of aerosols in artwork of all ages.

ZIERLER: That's to say that we don't need modern technologies to perceive aerosols? There was an awareness that there were particulates suspended in gas even before we had the ability to detect them?

FLAGAN: The 19th century is where many of the basic methods were developed. They didn't have the electronics to make instruments that look like what we have today. But there were a bunch of very clever scientists who made really intriguing observations of very small particles in air long before we had the electronics to do it.

ZIERLER: For the observations and experiments that are most relevant for your research, what are the theories, perhaps in fluid dynamics, turbulence, or elsewhere, that provide guidance for you?

FLAGAN: A lot of what we deal with is the microphysics of the particles. There, we're dealing with very small scales, on the order of the size of the particles. Fluid mechanics is very important; turbulence, less so, although when you start thinking about clouds or industrial processes, turbulence becomes very important. But we're often dealing with very low Reynolds number flows, with flows that are laminar, that have simple fluid shear, basic fluid mechanics. The particles are often very, very tiny. If we think about a gas, a gas is not a continuous fluid. At typical ambient conditions, if we go down to a scale on the order of about 60 nanometers, we're in the range where the mean distance molecules travel between collisions is that scale. If particles are smaller than that, they don't see a continuum fluid; they see impacts of individual molecules. And that changes the fluid dynamics to something that's described by the kinetic theory of gases. Thermodynamics also comes into play here because it describes the equilibrium between the vapor phase that may be going into or coming out from the particles and what's in the particle phase.
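[The transition described above is conventionally characterized by the Knudsen number; the following is a standard textbook sketch, with notation that is conventional rather than taken from the recording:

$$\mathrm{Kn} = \frac{2\lambda}{d_p}$$

where $\lambda$ is the mean free path of the gas molecules (roughly 65 nm in air at ambient conditions, the roughly 60-nanometer scale mentioned above) and $d_p$ is the particle diameter. For $\mathrm{Kn} \ll 1$, the particle experiences the gas as a continuum fluid; for $\mathrm{Kn} \gg 1$, the drag must be computed from the kinetic theory of gases, with a transition regime in between.]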

ZIERLER: What are the aspects of your research that are oriented toward basic science, just understanding how aerosols work, and what aspects are really geared toward applications or translational research?

FLAGAN: Well, at the very basic scale, one problem that comes up again and again in the research we do is the formation of a new phase by nucleation of vapor molecules to form a liquid or a solid. This occurs in the atmosphere and in a wide range of different systems, and it has been a common thread through much of what we've done over the years, everything from the refining of silicon, where we had to try to gain control over it, to work that we're doing today, looking at how half of the particles in the global atmosphere are formed.

ZIERLER: Who are some of the key funders of your research, in industry or government?

FLAGAN: In government, the biggest funder has been the National Science Foundation, funding both basic and applied research. We've also had funding from the Department of Energy, from NIH, from NASA, particularly on projects in collaboration with JPL, and from the Environmental Protection Agency. It's been a wide range of government agencies. In industry, I've had some funding from Dow, but that's more using techniques that have come out of aerosol science to address problems in the materials processing that they do, like looking at how bubbles nucleate in liquids. We have an ongoing project on that. I've had collaborations with Lucent, which came out of Bell Labs and ultimately evolved and split in other directions. That was looking at the processing of semiconductor devices. I have also collaborated with industry on instrumentation questions, including specialized companies dealing primarily with aerosols, some large, some small.

ZIERLER: Last question for our initial session. I wonder if you can explain perhaps, in concentric circles going from the micro to the macro, the areas that you study. The close range, perhaps your recent work on COVID transmission, all the way out to however large it goes, planetary or beyond. What are the smallest areas where aerosols exist that you study? Let's go from there.

FLAGAN: The smallest has been looking at this problem of new particle formation. Right now, we're looking at how new particles form on a global scale and how they affect climate. I can describe later the experiments that we are doing there. We've looked at a number of aerosol problems that relate to health. Recently, the COVID pandemic drove a lot of work on how we protect people. That ultimately focused in on the masks. Not because it was the most exciting work, but because we had to solve some problems to know how to protect people, and from a local perspective, looking at how Caltech could get people back in the laboratories safely. There, we made recommendations that changed some of the policies that Caltech has taken. I've also looked at particles that affect diseases like asthma. Specifically, I got involved in the question of how huge particles like pollen, which are so big that if you inhale them, they'll deposit in your nose or throat and won't penetrate to the deeper lung, could trigger the respiratory response that we see in asthma, where people's lungs constrict so much that they can't exhale, and you get the wheezing. That got down to looking at the particles, and we identified ways that pollen can lead to the formation of really tiny particles that contain the allergen. We go beyond that to think about air pollution.

A lot of our experiments have been quite fundamental, looking at, as an example, the formation of what's called secondary organic aerosol in photochemical smog. There, we've used chambers, relatively large experimental apparatus. The current chambers we use are about 20-plus cubic meters of volume. They're big Teflon balloons. They allow us to do experiments with a controlled volume of air and to probe how particles form, examine how different chemical species contribute to that formation. That's work that I've been doing with John Seinfeld, and we've been doing experiments of this kind now for many years. The species, the chemicals that we look at have evolved over time. As one changes emission regulations, the mix of what's going out in the air changes. There was a time when the automobile was the primary culprit by far. Now, we're looking at consumer products. All those things that have nice fragrances that, when they get out in the atmosphere, react really effectively and produce particles. We've got some experiments now looking at the solvents from paints. Those are very applied problems. But they're important from the perspective of understanding what regulatory actions and policy decisions lead to in terms of air quality.

We've also looked a lot at aerosol issues relating to climate change, trying to understand the biggest source of uncertainty in the total radiative budget for the planet. We know the greenhouse gases pretty well. They're stable over long periods of time, unfortunately. We can predict what those effects will be very effectively, and those predictions have been made for a long time. One very early reference goes back to the 1890s, when a very famous name in chemistry, Svante Arrhenius, looked at the effect of CO2 on atmospheric temperature and made predictions of warming that we have since seen. He didn't have all the tools, but he had tremendous insight. To do those kinds of measurements, we have gone out with aircraft to probe the atmosphere at larger scale. We've built instrument packages to go out and measure what's in the atmosphere. Ultimately, the understanding from those is very localized. We may be probing hundreds of miles, but on a global scale, that's a tiny distance. We've got really good data on near-coastline fair-weather conditions, not a lot on the rest of the planet and the rest of the time. That work ultimately has to be coupled to large-scale modeling.

I don't do large-scale modeling, but I do talk to people who do it, and I've started looking at the question of how we can take data from the laboratory experiments and translate it into a parameterized form that can be integrated into large-scale models. Those models come up in many different contexts. One context we're looking at now is how clouds form. These are not the typical clouds that we think of when we look up in the sky here. We're looking at the clouds of Venus. Right now, we're not trying to predict those clouds. The focus is, how can we get real in situ data on those clouds? There are Venus missions being planned, and I'm working with JPL on trying to help enable what is called an aerobot, which would be basically a balloon-borne laboratory that would fly in the cloud layer of Venus, not under the conditions at the surface. There, it's way too hot and the pressure's really high; instruments would not last very long. There have been a few that lasted for hour timescales, but it is very difficult to do.

The clouds on Venus form at altitudes of 50 kilometers or so, where the pressure is not 90 bar or 90 atmospheres as it is on the surface, but more like a tenth of an atmosphere to an atmosphere, and the temperatures are comparable to the temperatures in the Earth atmosphere. But the clouds aren't like the clouds in the Earth atmosphere. In the Earth atmosphere, the most abundant condensible vapor is water, and the clouds are primarily water. Water that is condensed on smaller particles, but still, the clouds are primarily water. Water is scarce in the atmosphere of Venus. The clouds there are primarily sulfuric acid. As we think about sending a mission to Venus to try, through robotic tools, to probe the atmosphere, we first have to figure out how to build the instruments that will survive in that atmosphere. We're working on that right now.

ZIERLER: It's amazing to think just how expansive your research is all the way from particle creation to beyond planet Earth, to Venus and perhaps beyond. That leads me to ask about all of the collaborations that are possible at a place like Caltech. Let's run through that same concentric circle of research interests so I can get a sense of some of the key individuals and research projects at Caltech that have been most relevant for your research. Let's start, first, on the human health side. Who are some of the people or institutes you've worked with over the years?

FLAGAN: On the human health side, actually, a lot of my education on that has come from collaborations outside of Caltech but in the Southern California area. I've interacted with people at USC in their Environmental Medicine Department. When we did the work with pollen, I actually ended up, in my research group, having two physicians, a botanist, and a specialist in computer vision. The latter came from starting discussions with Pietro Perona. The question there being, how do you identify pollen? On the COVID side, that was done largely in isolation. I was definitely involved with Niles Pierce and Barbara Wold through their roles on the committee that was looking at safety in lab-opening from a broad administrative perspective for Caltech as a whole and trying to work with them to focus in on masks that would be relevant. Joe Kirschvink had been doing some preliminary work on masks. We've brought into play the state-of-the-art aerosol measurements to make that more quantitative.

ZIERLER: What about with JPL? In what cases did JPL come to you because they had a specific research question for which you could provide guidance, and in what ways were you interested in aerosols at the atmospheric or even planetary level?

FLAGAN: At JPL, if I go back very early, before I was doing that silicon work, they had a tool that I wanted to use that was more available than what I had access to on campus at that point, or at least what I knew about at that point. I went up there and used a microscope, where I commented on some pictures of silicon particles on the wall. But it was the people at JPL who came to me to say, "Can you help us understand this problem?" that started off work with JPL. At several points along the way, I have had JPL people come to me with different questions that they wanted someone with an aerosol background to look at. At one point, I had someone from JPL come to me who said, "There's going to be a flight where we're going to be trying to look at the formation of clouds in the upper atmosphere. We see that you've been working on some instrumentation for that. Could we put that on the WB-57 and fly it up to 60,000 feet?" Which we did. That was not one of our most successful missions. We learned a lot about putting instruments in aircraft going way outside the bounds we'd ever worked before, but it started out from a query from JPL. On the planetary science side, it has been people from JPL largely coming to me to say, "We have a problem that we're trying to understand," whether it's thinking about the dust devils on Mars, or the clouds on Titan, or the clouds on Venus, or how we deal with planetary protection as we try to bring samples back from Mars, take samplers there and get them there cleanly, and ultimately how to bring samples back to Earth without the risk of contaminating Earth with any form of life that might be picked up from Mars. In each of those cases, it came out of JPL.

ZIERLER: Specifically with regard to your research on aerosols and their relation to climate change, I'm curious how you see your research fitting in more broadly with this institute-wide initiative on sustainability at Caltech.

FLAGAN: I've not been directly involved with the Resnick Institute, but I see the work that we're doing as being directly relevant to it. Sustainability is the big question as we look forward, and the aerosol question on climate is absolutely critical. It comes in several different forms. One is the one I've already mentioned: the biggest source of uncertainty in the energy budget for the planet is the aerosol. That's because the greenhouse gases are distributed relatively uniformly, at least on a hemispheric scale, while the aerosols are short-lived. They will last in the atmosphere for days to weeks. Clouds are even shorter-lived. The uncertainty that introduces into how much sunlight is actually absorbed within the atmosphere or at the surface, versus how much is reflected back to space, is a key part of the uncertainty in how we have perturbed the climate on this planet.

ZIERLER: I heard you use the term energy budget. What does that mean?

FLAGAN: The Earth is warmed by the sun. But only a fraction of the sunlight that reaches the Earth is actually absorbed. Clouds reflect sunlight back to space. When you fly during daytime, you look down at the clouds, they're very bright. That's sunlight being reflected back to space. The haze also reflects some sunlight back to space, but there are also things like black carbon in the atmosphere, soot particles emitted by combustion sources, that absorb sunlight. All that combined determines the net warming that we get from this balance of sunlight coming in and reflected radiation or longer-wavelength emitted radiation going out into space. The energy budget is that balance.
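[For reference, the balance described above can be sketched with the standard zero-dimensional energy budget; the notation is conventional textbook usage, not taken from the recording:

$$(1-\alpha)\,\frac{S_0}{4} = \sigma T_e^4$$

where $S_0 \approx 1361\ \mathrm{W/m^2}$ is the solar constant, the factor of 4 spreads the intercepted sunlight over the whole sphere, $\alpha \approx 0.3$ is the planetary albedo (the fraction reflected back to space by clouds, haze, and the surface), $\sigma$ is the Stefan-Boltzmann constant, and $T_e \approx 255\ \mathrm{K}$ is the effective emission temperature. Aerosols and clouds enter chiefly through $\alpha$, which is why their uncertainty propagates directly into the energy budget.]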

ZIERLER: In the fantastically complex, multidisciplinary endeavor that is climate science, where do you see your field slotting into the basic understanding of how climate change or global warming occurs?

FLAGAN: My work is really focused on the microphysics, understanding the aerosol and ultimately its interaction with clouds. It aims to provide ways to quantify that so that we can integrate in an accurate way the complex effects of aerosols and clouds: the aerosols cause what is called direct radiative forcing, and the clouds cause an indirect forcing, in that they form on the aerosols and then scatter sunlight. We're trying to develop the physical models that could go into those larger-scale models, or at least to provide the data that will enable that.

ZIERLER: That's, of course, coming from a fundamental research perspective. Do you see this work influencing various mitigation strategies in climate change?

FLAGAN: The critical question is geoengineering; it is going to come up, and at some point, people are going to try it. There are natural experiments. A volcano emitting particles into the upper atmosphere is sort of a natural experiment in geoengineering. It emits large amounts of sulfur into the atmosphere, which forms particles relatively high up, leading to cloud droplets or cloud ice particles that affect the radiative budget. There are proposals to do that intentionally, to send aircraft up into higher-altitude regions, releasing gases that would produce particles and affect how much sunlight reaches the surface. When those experiments happen, it's going to be absolutely essential that we have good, objective data to understand exactly what is going on, and to develop the basic physical and chemical understanding to a level that fills in the gaps. There will be unintended consequences of those experiments when they happen.

ZIERLER: You sound fatalistic, that they are going to happen. Is that a certainty, you think?

FLAGAN: I think it's a high probability. We are seeing really drastic effects of climate change, and we're going to see more. At some point, someone will start trying experiments with that. Will it be something that is done with a firm scientific basis? On that, I have my doubts.

ZIERLER: What are the options? What would that even look like, to disperse particles throughout the atmosphere in a way that would reduce climate change or global warming?

FLAGAN: One approach would be to fly aircraft to release sulfur-containing gases into the stratosphere that would undergo reactions and form a sulfate aerosol, probably largely sulfuric acid, perhaps ammonium sulfate. That aerosol would scatter sunlight, scattering some of it back to space. That would be one approach. When you do that, up in the stratosphere, you don't have the vertical mixing that dominates in the troposphere, where we live. The stratosphere is thermally stratified and much more stable, so those particles will stay there for long periods of time, and they will disperse over large areas. That's a long-lived form, and it has many potential implications. Think back decades ago to the discussion of acid rain, which was a result of building tall smokestacks to export the sulfur emissions from burning coal so they wouldn't cause pollution locally; you then saw the pollution effects downwind. Here, we're going to produce sulfuric acid in the upper atmosphere. What are its effects? That's one scenario, and it's one that is discussed repeatedly. I would be surprised not to see it happen at some point.

ZIERLER: What would the unintended consequences look like? Blowback, as it were? Worst-case scenario?

FLAGAN: One is, ultimately, that acid will come down. The acid rain question we've seen before is now put onto at least a hemispheric scale. But also, the weather patterns that we have are driven by the solar fluxes, where you have heating and cooling. That determines the convection we have in the troposphere; it determines the winds, the circulations. Ultimately, the warming of the oceans affects how the flows there will behave, although I am in no way an expert on that and cannot comment on it with any detailed knowledge. If we change the distribution of radiation, are we going to perhaps produce some cooling over large areas but shift where the rainfall occurs, so we take rainfall that's occurring in one place and move it someplace else? Are we going to induce droughts? What are the consequences that will occur when you start changing how the energy is distributed throughout the atmosphere?

ZIERLER: Maybe it's a naive question, but given how important clouds are to blocking and reflecting solar heat, why do we not have the technology simply to release water vapor and have it behave like naturally occurring clouds?

FLAGAN: Water vapor's the biggest greenhouse gas. It's just very, very short-lived in the atmosphere, and it's part of the system. The amount of water in the atmosphere is huge. The amount we would have to introduce as vapor to make a big difference would be immense. But there are proposals to do this. If you fly over the oceans, you will often see a large low-level cloud deck, the marine stratocumulus clouds, that will extend for thousands of miles. There are proposals not to inject water vapor, but to inject the seeds on which water would condense at the top of the marine boundary layer, in order to extend that cloud deck and thereby reflect more sunlight to space. That's a very serious proposal, and it's an alternate method that would be much shorter-lived. The droplets in the cloud are constantly being formed, growing, then evaporating as they reach the tops of the clouds and mix out from the clouds. That would not be something that would circulate on the hemispheric scale. It would require energy to continuously inject material into the atmosphere. There are proposals to use waves and winds over the ocean to do that. That might be an alternate way to do it, and it is a very serious proposal.

ZIERLER: Maybe it's more a philosophical than scientific question, but the idea that there's a sort of fatalism that we are going to do geoengineering, what does that say simply about our inability to halt or reverse the behaviors, the energy production that's causing this need in the first place?

FLAGAN: The technology to replace the fossil fuel usage with renewable resources either exists or is being developed at a rapid pace. What's lacking is the political will. And this gets to politicians who believe more in faith or belief than in fact and science. It's really disappointing that we have such a divided political system in this country that we're not seriously attacking the problem.

ZIERLER: Is your sense that the situation is so bad that the risks of a geoengineering project at a large enough scale to actually make a difference are actually worth it at this point?

FLAGAN: I think the risks of unintended consequences are very, very high, and that the technology already exists to make a huge dent and to move us toward a situation where we can constrain how much warming we see. But it requires investment. It's not going to be done with a focus on making sure every existing business stays in business. It's going to change the complexion of the industrial world. But I think the risks of relying on unproven technologies are very severe.

ZIERLER: You mentioned that one of the connections to JPL was that they had an instrument you were interested in. Have you been involved at all in instrument building or in consulting with companies to create instruments you needed for your work?

FLAGAN: I've been involved a lot in building instruments. There seems to be a mythology in this country that companies build instruments, and scientists buy and use them. That's not the way it works. A lot of scientists do buy and use instruments, and they do it very well, and that leads to all sorts of major advances. I'm not saying anything against that part of science. But when you look at where the dramatic new developments have come from, it's usually a scientist who sees a need and devises a way to do it. I've been involved a lot in developing instruments; for some of those instruments, I've been involved in getting companies to produce them and get them out to the whole world. An example of that goes back to our early work with the atmospheric chambers, looking at the processes involved in photochemical smog. The instrument we had at the time for measuring small particles had many, many problems. I watched my students' frustration in trying to make sense out of the data that those instruments would produce; while we had these experiments going on, one student in particular was very, very frustrated.

After she defended her thesis, the next year, I was teaching a laboratory course, where I would go through and talk about instruments, what works, what doesn't, what's good, what's bad. I was looking at this one instrument that was the basis for our measurements of very fine aerosol particles, and I didn't really enjoy talking about that one, so I started thinking about some things I'd been learning from Jack Beauchamp on mass spectrometry, and asked a very simple question. A basic measurement involves putting a charge on a particle, a single elementary charge, and migrating it in an electric field; how fast the particle migrates tells you the size. The way the measurements would be made is that you would charge the particles, as I said, at a level where you've got an average of, at most, one elementary charge, then you'd set the voltage in a separator device, allow it to reach steady state, and count the particles that are transmitted. Then, you'd change the voltage, sit twiddling your thumbs and waiting, and after it reached a new steady state, you'd make the next measurement. We had a beautiful tool that we could use as a calibration system, but it was too slow to make measurements. I asked a simple question: what would happen if, instead of classifying the particles in a constant electric field, we classified them in a time-varying electric field? I did a very simple analysis of how that would work and realized that I got the same instrument response function from the simplistic model as what I had found with the stepping mode.
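[The measurement physics sketched here can be written compactly; the following is the standard electrical-mobility relation from aerosol textbooks, not a formula quoted in the recording. A particle carrying a single elementary charge $e$ in a field $E$ migrates at

$$v = Z_p E, \qquad Z_p = \frac{e\,C_c(d_p)}{3\pi\mu d_p},$$

where $Z_p$ is the electrical mobility, $\mu$ is the gas viscosity, $d_p$ is the particle diameter, and $C_c$ is the slip correction factor. Measuring the migration velocity thus fixes $d_p$. The question posed above amounts to ramping the field continuously in time rather than stepping it, which preserves the instrument response function while eliminating the wait for steady state at each voltage.]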

The next day, I gave my students a lecture on an instrument that did not exist. Because all it required was doing a little bit of computer interfacing and writing a computer program to drive it, in two weeks, I had that class using it in the laboratory. They were the first ones to see this instrument. We ultimately developed it quite a bit further, and we had a new tool. The grant I had supported the student on when we did the development had a stipulation in it that said all inventions would be given freely to the American people, i.e., "Don't bother to apply for a patent; you can't make money off of it." You can't have the protection that would get a company to commercialize the technology. I went to the company that was in the best position to do that, which had this calibration tool, and said, "I've got a way to take your calibration tool and turn it into a really good, fast measurement system. It would take a measurement down from 20 minutes to half a minute."

I had scheduled a breakfast meeting at a conference with a key guy at the company, and sometime in the mid-afternoon, we finally broke up the meeting. They were very interested, but the key challenge was that they didn't have a way to protect their investment to get this to be commercial. I'd first given it away to people. I'd given the software to people because we really couldn't patent it to build something to be sold. That clause, by the way, is no longer accepted by Caltech. [Laugh] But I discovered that giving the software away was far too expensive because people expected us to provide full service when they didn't know how to do things. It took a few years, but ultimately, that company did find a way to embed the software inside another piece of hardware they would sell that allowed them to go forward with the investment. It's now the most common instrument used for that class of measurements worldwide.

ZIERLER: More broadly, there's so much exciting work with sensors at Caltech right now. Are you involved in that? I wonder, specifically, the new quantum center for research that the Ginsburgs are supporting, if there's anything there that might be relevant to you.

FLAGAN: I have not been involved with that center at all. The measurements we're doing are really not getting down to the quantum level, other than that we're looking at one charge at a time. A key challenge that I have seen for a long time is the need for much more fully integrated networks of measurements of the particles in the air than is possible today. The commercial instruments now sell for on the order of $100,000. At $100,000, you don't have local air pollution agencies buying those instruments. Also, analyzing and understanding the data is much too complex for the level of personnel they're hiring. How we go from laboratory-grade instruments that are great for fundamental research to the integrated networks that will provide the resolution of data needed to answer key environmental questions, so that we can put them out in big networks, is a key question that I am working on, and we're trying to develop instruments that do that. I've worked recently with a small company that is just in the process of taking on an instrument we developed, and they've lowered the cost by a factor of three. We need a factor of 100, so there's a long way to go. But they've been able to lower the cost quite dramatically. We just installed one of those instruments at a site at JPL as part of a network that JPL is developing to look at the relationship between air quality and human health on a global scale, integrating satellite data with relatively low-cost sensors at a number of sites around the world.

ZIERLER: All of these instruments and the data they collect naturally leads me to ask you about the role of computation in your research. Let's start with simulation. Is computer simulation important for what you do?

FLAGAN: Absolutely. And at many levels. At one level, we've got these big experiments, the chamber experiments, and we want to extract fundamental information from them. We want to understand the dynamics of the aerosols so that we can ultimately translate microphysics-based models to simplified models that can be integrated, for example, into global climate models. We do a lot of work simulating our laboratory experiments. That's one level, and that extends not just in the chamber experiments, but even to ambient measurements. But when we design instruments, we design them based on simulation. We build the instrument first on the computer, and we will go through many variants of the instrument before we commit to physically building it. That is far more efficient than building $100,000 or more instruments, then, "Oh, I got this part wrong." We spend a lot of time doing simulations, and some of these involve very intensive computational studies.

ZIERLER: What about modeling? How do you analyze all of the data, and what do you do with it as a result?

FLAGAN: The simulation and modeling are really one and the same. There's one experiment I'm heavily involved with using an ultra-clean atmospheric chamber at CERN. We're using this to do very fundamental studies of the nucleation of particles in the atmosphere. To that experiment, we take our measurements of the very, very small particles. Other teams bring other instruments. This one 26-cubic-meter chamber probably has 15 mass spectrometers on it in any given experiment, plus a lot of aerosol physics measurements, gas-phase measurements, and so on. In modeling that, what we're trying to do is extract from the experiment parameters that describe the physics and chemistry in a way that they can be integrated into bigger models. We have the basic theory of aerosols, which is quite well-developed. We can describe how the distribution of particles with respect to size evolves with time. But there are some rate parameters that we need for that. What we try to do is parameterize those rate processes in ways that are focused on the fundamental physics, but minimize the complexity, so that we develop models that can be integrated into bigger ones. It's data analysis, information extraction, statistical analysis out to machine learning methods to try to extract those parameters, knowing the structure of how the system evolves and trying to develop that into tools that can go into the bigger models.
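[The basic theory referred to here is usually written as the aerosol general dynamic equation; the following is the standard textbook form, with notation that is conventional rather than taken from the interview:

$$\frac{\partial n(v,t)}{\partial t} = J\,\delta(v - v^*) - \frac{\partial}{\partial v}\!\left[G(v)\,n(v,t)\right] + \frac{1}{2}\int_0^{v}\beta(v', v-v')\,n(v',t)\,n(v-v',t)\,dv' - n(v,t)\int_0^{\infty}\beta(v,v')\,n(v',t)\,dv' - L(v)\,n(v,t)$$

where $n(v,t)$ is the size distribution over particle volume $v$, $J$ is the nucleation rate of particles at critical volume $v^*$, $G$ is the condensational growth rate, $\beta$ is the coagulation kernel, and $L$ collects first-order losses such as deposition to chamber walls. The rate parameters $J$, $G$, and $\beta$ are what experiments like the one at CERN aim to constrain for use in larger-scale models.]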

ZIERLER: Given all that you do, it's obvious that your research has significant policy ramifications, whether at the local level, the industrial level, in national politics, or even in international agreements, given climate change. Are you involved directly in policy debates or analysis, or do you provide consulting to various companies and agencies?

FLAGAN: I have been involved in numerous studies trying to guide policy questions, National Research Council studies, things of that sort. Other than a few times, I have not been heavily involved in actually formulating the policies. We do, however, send out a lot of students who have learned the basic science and, having that basic science background, want to get directly involved in policy. In the last year, one student went to the California Air Resources Board. Another was first involved in a program where she was writing, for popular and political consumption, analyses of what is coming out of the science, writing for newspapers, for example. We do have quite a few students who take that route. I also sit, as Caltech's representative, on the board of the California Council on Science and Technology, which runs a post-doc program that embeds post-docs from the sciences into the state legislature and state agencies within California. People who have science backgrounds but want to more directly impact human society get involved in the policy questions through that route.

ZIERLER: What about on the regulatory side? Do you see your research being relevant, for example, to the EPA and the way that they want to control dangerous aerosols in our environment?

FLAGAN: I've certainly been involved with discussions at EPA as to how one might implement regulations. For example, one that I was involved in very early in my career was when the standards for particulate matter in the air were initially based on something called total suspended particles, which was trying to get the total mass of all the particles in the atmosphere. That's something you really can't do without biases, just because of the difficulty of sampling. There were extensive discussions that went on. How do we think about particles that people inhale? Where would they deposit in the lungs? How would they impact health? The first question was, how do you characterize the particles in the air that would penetrate to the deep lungs when you're breathing through your mouth, if you're out running or bicycling, or you're a laborer doing physical labor outside, or a child playing on a sports field? You're breathing through your mouth. What is your dose? Ultimately, the PM-10 standard came out of those discussions: the mass of particulate matter smaller than ten microns. Ten microns wasn't a perfect size at which to take the cut, but it was something that was attainable.

Part of that was dealing with the practical side of how you make something robust enough that it will stand up when lawyers have to take a case to court. Then, we looked at what happens when you breathe through the nose, and the so-called PM-2.5 standard came from that. That's the mass of particles smaller than 2.5 microns. These are statutory requirements. They're designed to be enforceable in a court of law, to give agencies tools where they can specify what you need to measure. The measurements have to be cheap and simple enough that local agencies can do them, but then you can use those as a basis for making policy decisions. I've been involved in discussions of that form. That's one where I think we still have a long way to go. PM-2.5 is a very imperfect measurement. What I'm interested in is, if you inhale particles, where do you deposit them? Do they deposit in the nose and the throat? Do they deposit in the lower airways, or in between? Ultimately, what are the effects? This comes into air pollution as well as questions like the mode of transmission of COVID-19, the SARS-CoV-2 virus. If, as was stated incorrectly early in the COVID-19 pandemic, the primary mode of transmission were the big particles you produce when you talk, cough, or sneeze, particles much larger than ten microns (although they got the sizes wrong), those big inertial particles that are sprayed out, then how you would protect people is very different than if it's the small particles that come from the deep lung that lead to transmission of the disease.

The same kinds of questions come up with the health effects of the pollutants that are emitted into the air. If they deposit in the deep lungs, the way they're going to affect the body can be very different than if they deposit in the nose and throat. Particle size may lead to very different behaviors, even for particles that deposit in the same place. If you go down to really, really tiny particle sizes, they diffuse very fast. If you inhale them, they can deposit in the nose. They can translocate across cell membranes in the olfactory mucosa into the olfactory neurons and, ultimately, get into the brain, in some cases without having to cross the blood-brain barrier. There are many, many different kinds of impacts out there. What's needed from a policy perspective is data that allows you to assess what the dose is for different kinds of exposures, to different regions, for different components of the aerosol. That requires a huge amount of information beyond what is collected now.

A problem in setting regulation is that regulations can only be set once there is epidemiological data to support them. The problem we have in the US, and everybody else around the world has copied the US, is that there's a huge focus on PM-2.5, which is a very, very blunt instrument. We need a lot more information. This is where it gets back to the sensors question. How do you provide the data in a cost-effective way that will allow you to assess those effects? Now, the particles that affect human health will be different than the particles that affect the health of our planet. We need to think about it from both perspectives. We need a lot more detailed information than the standard EPA measurements will get. One of my foci is trying to understand and develop ways to get that data so that, ultimately, rational decisions can be made that will stand up to all of the tests, administrative and legal, that they have to face, in order to start changing how we think about these policy questions.

ZIERLER: Last question for today, just as a snapshot in time circa February 2022, what are you currently working on?

FLAGAN: We have a number of different things. I'm very heavily involved in this big experiment at CERN called the CLOUD experiment, which has one of the most tortured acronyms out there; I won't go through it. [Laugh] There, the focus is really on new particle formation. To support that, we're trying to push the measurements even further than we have. We can do a good job of measuring particles down to about 1.5 nanometers. We can't go below that because, from the measurements that we're making, we cannot distinguish between a particle and some of the gas ions that we produce to charge the particles. How do we correct that problem? How do we advance the instrumentation to the point that we eliminate that interference? That's one very basic question. I think I have a solution, but I won't know until we complete the simulations and then, assuming we find a path that looks promising, build and test the instrument. I'm also involved in some exploratory experiments right now, trying to look at new particle formation in the upper troposphere.

The kinds of instruments we have used have never been used for direct in situ measurements as we approach the tropopause because of a bunch of technical questions. We are risking one of our instruments to go out on some flights to try to do this. We think we have ways to protect it, but we are cognizant of the fact that there are unknowns we can only evaluate once we do the tests. We will find a way to do it. Those two activities are very closely coupled. On the questions that have come out of COVID, there are still many questions with regard to the masks. How do you protect people? I hate to say it, but your beard stops you from being well-protected by any mask. That's why I don't have a beard. But also, can we use the physics to guide thinking about how diseases are transmitted? When we think about disease transmission, the mistake that was made in COVID was to assume that we knew the mechanism, for the WHO and CDC to decide that only big particles were important: "We don't need to worry about small particles, therefore we can recommend cloth masks." That was a huge mistake. Everything should be on the table until you have evidence that it's not, rather than the other way around.

ZIERLER: Right, that's science.

FLAGAN: That's science. We need to apply the scientific method to scientific questions as well as policy questions. I've gotten rather heavily involved in some of the exploratory work for future missions to Venus. And that's intriguing because it's a totally different venue. It's out of this world. [Laugh]

ZIERLER: [Laugh] Literally. Rick, this has been a terrific discussion so far. I look forward to picking up next time. We'll go all the way back to the beginning and learn about your family background.

[End of Recording]

ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It's Friday, February 18, 2022. I'm delighted to be back with Professor Richard C. Flagan. Rick, it's great to be with you again. Thanks for joining me.

FLAGAN: Glad to be here.

ZIERLER: Today, we're going to go all the way back to the beginning and develop your personal narrative. Let's start with your parents. Tell me a little bit about them.

FLAGAN: Well, my father was the second-oldest of eight brothers and grew up in Rathdrum, Idaho, a tiny town in the Idaho panhandle. He dropped out of high school because he was offered a job. At that time, when the economy was not so good, with seven brothers, the income was much needed by his family, and he started driving a truck.

ZIERLER: How many generations back does your dad go in Idaho?

FLAGAN: Not many. His father started out in Indiana in a strongly German community. During World War I, names that were strongly German could actually be very dangerous. He and his siblings all left that area and changed their names. My last name is one that was invented by my grandfather.

ZIERLER: What's the original name?

FLAGAN: I'm not sure of the exact spelling, but it was something like Pfleging.

ZIERLER: And they made it sound more Irish, I assume was the idea.

FLAGAN: This redhead guy made it sound more Irish and got away with it. [Laugh]

ZIERLER: Did your father serve in the military?

FLAGAN: Yes, he volunteered early on. He wanted to be a pilot, but they told him his eyesight wasn't good enough, and they made him a navigator. He started out as a navigator but very quickly showed some unique skills and got shifted off to a top-secret part of the Army Air Corps. He was one of the first radar operators on the B-17s flying over Germany. They had 13 units at that time, 13 flyable radars. He did three tours flying over Germany, usually in the lead aircraft of large missions.

ZIERLER: Did you ever get him to talk about his service?

FLAGAN: Only when he was in his 90s. He never talked about it for decades.

ZIERLER: Did you ever have a sense if he was conflicted, given his heritage?

FLAGAN: I had no sense of that at all. I think he felt he was doing his duty and what was needed at the time.

ZIERLER: What about your mom's side? Where's your mom from?

FLAGAN: She grew up on a family homestead in Stevens County, north of Spokane, on a ranch on the shores of a lake. She was an only child and lived there until her father lost the ranch during the Depression. Ultimately, she moved to Spokane.

ZIERLER: Where did your parents meet?

FLAGAN: They lived next-door to each other.

ZIERLER: In Spokane?

FLAGAN: In Spokane.

ZIERLER: What were your parents' professions after the War?

FLAGAN: My mother was a mother, a housewife. She did not work after the War. When my father came back, he took advantage of the GI Bill and studied engineering at Gonzaga, got a degree in mechanical engineering, and ultimately was hired into the company that one of his professors worked at. He started designing tractor trailers.

ZIERLER: Do you remember what the company was?

FLAGAN: It was called Brown. I don't remember the full name, but they were a small manufacturer of trucks taking advantage of the low-cost aluminum in Eastern Washington because of the large hydroelectric plants.

ZIERLER: Is that where you grew up, Spokane?

FLAGAN: Where I started out.

ZIERLER: How long was your family there?

FLAGAN: I went to 1st grade in Spokane, then my father was transferred to Toledo, Ohio. We moved there and lived there a couple of years, then moved back to Spokane. After a few years there, he was going to be transferred to Illinois. While he was in Illinois, someone who he'd worked with previously called him up and said, "I've got a better opportunity for you." He took a look at a job in the Detroit area working with another company making trucks, Fruehauf, and ultimately took that job, and we moved to Michigan.

ZIERLER: How old were you at that point?

FLAGAN: I was about 11.

ZIERLER: Siblings?

FLAGAN: I have one brother and three sisters. One sister is older, the others are all younger.

ZIERLER: Where in Michigan did your family live?

FLAGAN: We started out in a small town north of Detroit called Mount Clemens, lived there for a few years, then moved to a place closer to where he was working on the East Side of Detroit, a city called Grosse Pointe.

ZIERLER: This must've been the height of the Detroit manufacturing era.

FLAGAN: Detroit was doing well at that time.

ZIERLER: I imagine strong public schools, a solid economic base.

FLAGAN: The place we moved to after Mount Clemens had very strong public schools, which is why we moved there.

ZIERLER: When did you start to get interested in science?

FLAGAN: As long as I can remember, I had interest in science and technology. I remember watching Sputnik fly over. When we lived in Spokane, we were not too far from an Air Force base, and we would always see the aircraft fly low over us, the B-52s flying over close enough that we could see the rivets in the wings. The technology always fascinated me.

ZIERLER: Did your father involve you at all in his work? Did you have a sense of what it meant to be an engineer?

FLAGAN: We quite often would go and see things that he was doing, and we did get involved in it at some level. We saw how the trucks were put together, played around with some of the technologies that they were working with.

ZIERLER: Did you have chemistry sets, toy model airplanes, things like that?

FLAGAN: Always. When we lived in Spokane, there, the geology was really interesting, and I was constantly exploring the rocks. I had a sense that geology was something I might want to do. When I moved to Michigan, where it was absolutely flat, where you saw mostly clay and not many rocks, it became less interesting, and I started focusing on other things.

ZIERLER: Would you say your high school was particularly strong in math and science?

FLAGAN: It was, but I was excluded from that. We had moved from Mount Clemens to Grosse Pointe when I was in junior high school. When I moved into this new school system, they had two tracks in math: one for the people who were really good at it and one for everybody else. I came from this remote suburban school, and their immediate assumption was that I did not have the math preparation, so they put me in a class that was about two years behind where I had been. I suppose I proved them right by falling asleep in the class because it was so boring. The science was pretty good. Again, the advanced tracks were difficult for me to reach into. I eventually succeeded, but it took a lot of battling by me and especially my mother.

ZIERLER: When it was time to think about colleges, what was considered in reach for you, economically, geographically, academically?

FLAGAN: I considered colleges all over. I even applied to Caltech. I was rejected, I think because I was in a school that had a strong calculus program, and I had not been in the track that allowed me to take that calculus. I ultimately went to the University of Michigan, but I could have gone elsewhere. It seemed like a good option at the time, and it was.

ZIERLER: Did you have a clear sense when you got to Ann Arbor what you wanted to study? Were you always going to be on a science and engineering track?

FLAGAN: I started out thinking I would study math because I do enjoy math. But I very quickly realized that what I liked about math was solving practical problems, and I switched into engineering.

ZIERLER: It's such a big school. What is the engineering program like there? Is it subdivided into particular areas?

FLAGAN: Oh, yes, it's a huge school. When I tried to switch in, the school didn't know how to handle it. They knew how to handle students switching out of engineering, but they hadn't had someone come into engineering from the literature, science, and arts college. I went into mechanical engineering. I suppose that was the exposure that I had had while I was growing up.

ZIERLER: Was the idea that you would follow in your dad's footsteps and pursue something in industry, at least from the beginning?

FLAGAN: That was all I knew at the time. I don't think I had a clear idea of where I was going. It was just interesting.

ZIERLER: How much basic science did you take as an undergraduate?

FLAGAN: I started out in LSA, the Literature, Science, and Arts school, with the basic physics and math. When I switched into engineering, the program was much more constrained. The degree in engineering nominally required an extra semester to complete beyond the normal time for a bachelor's degree. There was very little flexibility within that program. Because I came into it out of phase with everybody else, I was actually able to create more flexibility. The basic courses I took were mostly courses taught by the engineering faculty, but I took a lot of advanced courses as I went along: statistical mechanics, advanced topics that most mechanical engineers would never have seen.

ZIERLER: Was the field of chemical engineering available to you at that time? Did you see a possibility to merge these two areas of study?

FLAGAN: It was, but the organic chemistry teaching had a really bad reputation. I ultimately opted for the mechanical engineering side. A lot of what I did got very close to chemical engineering, however, even as an undergrad.

ZIERLER: Were there any professors you remember as being particularly formative in your education?

FLAGAN: There were definitely a few. Probably the best instructor I had there was a fellow by the name of Chuck Vest, who went on to be the President of the National Academy of Engineering after having been president of MIT. Exceptional teacher and, later on, a very good friend. The first fluid mechanics course I took was absolutely terrible. It was handbook engineering, no basic science to it. Then, I had a second course taught by Professor George Springer. He went into the fundamentals, and I just found it fascinating. That became my focus. Ultimately, I did undergraduate research with him for about two years.

ZIERLER: Did you stay on campus during the summers to do research?

FLAGAN: I did one summer.

ZIERLER: On the social side, being in Ann Arbor in the late 1960s, what was it like? Were you politically active at all?

FLAGAN: I was, I would say, more politically active than most engineering students, but not very active, because I had no time. It was a very politically active time on that campus. You couldn't help but be involved, with the many large protests that were held there. One time, coming back from the north campus, where I'd been doing research, I stepped off the bus, and a tear gas canister landed at my feet. You couldn't avoid it.

ZIERLER: Was the draft something you needed to contend with?

FLAGAN: Oh, yes. At the time when I was finishing, they had abolished the draft deferments for graduate students. They called me in for my draft physical before I graduated. I had passed my draft physical; I was 1-A. When I went to graduate school, I had no expectation of being allowed to continue. I was expecting to receive a draft notice very, very quickly.

ZIERLER: To foreshadow to your later interest in the environmental aspects of aerosol research, in the late 1960s was also a time of rising environmental awareness. Did that register with you at all as an undergraduate?

FLAGAN: During the first few summers, I worked in industry. The first one was at Chevrolet, where I worked on a drafting board. Incredibly boring work. The next summer, I got a job with the local electrical utility. They hired a bunch of engineering undergraduates to do stack emission testing on coal-fired power plants. I was immersed in one of the big problems there right away. They were burning very high-sulfur coal, extremely polluting. I was climbing up on the smokestacks and making measurements, sometimes several hundred feet up in the air. When we opened the port in the side of the stack to put the probe in, you'd get a blast of what was coming out of the stack, and it was terrible. I was deeply aware of the problem from that. The research I did as an undergrad did not have anything to do with environmental issues or air pollution; it was very basic fluid mechanics and heat transfer that actually foreshadowed some things I'm doing now, but from a very different perspective.

ZIERLER: You mentioned starting in mechanical engineering and thinking you would go into industry because that was what you knew. What was it that gave you the confidence or compelled you to think that graduate school was a viable option?

FLAGAN: I wasn't really sure it was a viable option. I looked at the alternatives. I wanted to avoid going to Vietnam, which is where I would have gone had I been drafted; with a mechanical engineering degree, I knew the path they would push me into if that happened. The jobs in industry that would have afforded me the opportunity to not take that path were all in the military-support industries and were not attractive to me. Politically, I could not support that. As I said earlier, when I went to graduate school, I had an idea what was coming, but I had no idea how I would respond to it. I'd always lived next to the Canadian border, spent a lot of time in Canada. I seriously thought about moving to Canada. I went to graduate school because I felt it was interesting, I could do something that might be important, and it was more attractive than the other options I had at the time.

ZIERLER: Before we get to the institutions themselves, what kinds of programs were you considering for graduate school?

FLAGAN: I had studied mechanical engineering, so I just looked at mechanical engineering departments. I was following the standard path. The emphasis was really on fluid mechanics. But as I said, I really did not have a clear picture at that time. It was a time of political turmoil.

ZIERLER: Between fluid mechanics and statistical mechanics, did you ever think about applied physics programs?

FLAGAN: Not really, because I had no exposure to it. In a huge university like the University of Michigan, the graduating class of my department was as big as the entire graduating class of Caltech. You're totally immersed in the one discipline. I did not.

ZIERLER: Between your grades and how actively your professors were supporting you, what kinds of graduate programs were within range for you?

FLAGAN: I applied to schools all over, including Caltech, MIT. Basically, anything was open.

ZIERLER: Why ultimately MIT?

FLAGAN: The professor I'd worked with had started his career there and I think probably instilled in me some bias in that direction. I was offered admission at both places. At MIT, I was offered a fellowship; at Caltech, I was offered a teaching assistantship, and the person corresponding with me was not very effective at communicating with prospective graduate students. It just turned me off. This was long before the era when people would routinely go and visit all the graduate schools before deciding, so I was really going in blind.

ZIERLER: Of course, the late 60s follows you right to Cambridge, Massachusetts. What was that like?

FLAGAN: It was a political hotbed. There were many, many protests. The police would often bottle up the protest on the MIT campus before it could get to, depending upon which direction it was going, the Back Bay of Boston or the center of Cambridge. Again, I was immersed in that. I had become much more politically aware and more sensitive to the political activism than I had been before.

ZIERLER: The specific strain of protest at a place like MIT where there was a lot of concern about, for example, the work that Lincoln Labs did for the Department of Defense, was that an issue in Michigan as well, or this was a new concept for you?

FLAGAN: In Michigan, I had not seen that particular kind of protest. There, it was much broader, more focused on the social and political side of the conflict.

ZIERLER: Did you go to MIT specifically to work with a particular professor?

FLAGAN: No.

ZIERLER: What was the process of determining who your graduate advisor would be like?

FLAGAN: One issue I had to deal with was, what was I going to work on that would give me a chance to stay in graduate school? I talked with a lot of faculty. I had a fellowship, so I could pick and choose. I ultimately took on a project that had to do with some high-temperature chemistry, very basic physical chemistry. It was funded by the Department of Defense, sold to them as being essential for understanding why radio communications to vehicles entering the atmosphere were blacked out. I was looking at the chemistry that occurs behind a shockwave, where temperatures are in the range of 6,000 to 15,000 kelvin. It was very fundamental, sold as something that related to how you might deal with anti-ballistic missiles, but really, just basic research.

ZIERLER: What were some of the theories that provided guidance for this work?

FLAGAN: The work was really guided by a puzzle. What is the chemistry that stops effective radio signal transmission between a station on Earth and a vehicle entering the atmosphere? It was an observation, a puzzle that had to do with the plasma that's created at these very high temperatures. The hypothesis was that the chemistry of that plasma was really providing the blockage.

ZIERLER: Was this a field of study that naturally lent itself to a multidisciplinary approach, that this was not something that could be tackled within any one area of research or even academic department?

FLAGAN: In the big picture, certainly. In terms of what I was involved in, it was really based upon some tools that came out of fluid mechanics, the ability to recreate those shockwaves in the laboratory, and basic chemical physics.

ZIERLER: What was the instrumentation that was most relevant for this work? What were you working with?

FLAGAN: I was working with a device called a shock tube. This is a very highly polished stainless steel tube, about ten meters long, divided into two sections. There's a short high-pressure section, where you put in a gas at high pressure, and a lower-pressure section that has the gas you want to study; ultimately, you rupture a diaphragm between them. When you rupture that diaphragm, the pressure wave that propagates down the tube, if it's strong enough, will form a discontinuity, a shockwave, that propagates at velocities much greater than the speed of sound. That produces essentially an instantaneous jump in temperature. Within about three molecular collisions, you go from the ambient temperature to a very high temperature, 5,000 or 10,000 kelvin or more. With that, you then use spectroscopic tools to probe what's going on behind that shockwave. The specific chemistry I was looking at was the nitrogen first positive and first negative emission systems: the dinitrogen molecule going to an electronically excited state for the first one, and the dinitrogen ion for the other. The question was, what is producing these very highly excited states that will cause strong interactions with electromagnetic radiation?
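For a rough sense of the temperature jump Flagan describes, the textbook normal-shock relation for a calorically perfect gas (not stated in the interview; here \(\gamma\) is the ratio of specific heats and \(M\) the shock Mach number) gives

\[
\frac{T_2}{T_1} = \frac{\bigl[2\gamma M^2 - (\gamma - 1)\bigr]\bigl[(\gamma - 1)M^2 + 2\bigr]}{(\gamma + 1)^2 M^2},
\]

so a strong shock at roughly Mach 10 in room-temperature gas already yields several thousand kelvin. At the 6,000 to 15,000 kelvin he quotes, ionization and other real-gas effects make this only a first estimate.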

ZIERLER: A chicken and the egg question. Were you far enough along in this research that that's what got you to your graduate advisor? Or did you develop that relationship first, and that was the origin story of this research?

FLAGAN: This was the problem that he proposed to me.

ZIERLER: Was it related to what he was doing?

FLAGAN: That was what he was funded to do. From a practical point of view, it was a long shot, but something I could try to sell to my draft board to avoid a long trip that I didn't want to take.

ZIERLER: How mathematical was this research? Were you using at least very early variations of computers?

FLAGAN: The acquisition of the data used a fast oscilloscope to capture the emissions from a photomultiplier tube, with some optical filters to isolate the emissions from electronic transitions of specific chemical species. That was captured on Polaroid film by taking a picture of the screen of the oscilloscope. Then, you would trace that and try to develop a model that would reproduce the signatures that were observed. All of the information that we needed involved physical modeling to predict what the temperatures were. We could measure the pressure; we couldn't measure the temperature on a ten-microsecond timescale. We would monitor the pressure wave, and we would monitor the emission spectrum. Ultimately, going back and reading the prior work that had been done on this, there was one paper that had gotten close to the problem, but the experiments they had done were rendered invalid because they were based upon a complex chemical process, and one of the rate constants that they used in the design of the experiment was off by a huge factor. When you put in the right value, you could see that the experimental conditions they had probed could not give you the information you wanted. But in that paper, there was a hint that they had done some experiments that did not require that rate constant. They had done the simple experiment, but the results didn't make sense at the time.

ZIERLER: What was that hint?

FLAGAN: They had these simple experiments they couldn't interpret. They were mentioned in the paper, but no data was shown. They were uninterpretable. I had been looking at what was going on in the shock tube and realized that, in analyzing it, we had to consider the boundary layer development along the wall of the shock tube to quantify the temperature we were getting, that there was a strong bias in the temperature caused by that boundary layer slowing down the shockwave. The paper was from a research firm in the Boston area. I called the fellow who had done the experiments, Kurt Ray, and asked him if there was any chance he still had the data from those experiments. He said it should be in the lab notebooks and that he could try to get them, so we made an appointment. He got them out of the warehouse where they stored all the old lab books, and I went and met with him. He showed me the data, and he had done the experiments I was planning to do. All of them. [Laugh] He gave me a photocopy of those pages from his lab notebook. I went back and started poring through it and realized he had literally done the entire project. He was sitting on all the data.

ZIERLER: To what extent were you deflated that this was already done, and to what extent were you relieved that you were not going to be doing redundant work?

FLAGAN: I was just excited I had the data. I started digging into it. But as I dug in, I got into some questions in the basic physical chemistry that I had not formally studied: the kinetics didn't make sense; the reaction was far too fast for what was an electronically forbidden process. I started reading the physical chemistry and realized that the way we, and everybody else, had been thinking about the problem had to be wrong. Then, it became a question of developing the model to describe what was going on. What we found was that the excitation from a singlet to a triplet state was not a simple collisional excitation; it was an exchange reaction. A nitrogen atom reacted with a dinitrogen molecule to produce an electronically excited dinitrogen molecule, but one nitrogen atom was exchanged in the process. We developed a model for that, and it provided a nice, simple explanation that has stood the test of time. It gave us the basic mechanism to explain why this reaction was occurring so fast. For the calculations, I did some computer programming, but it was very simple: integrating a couple of ordinary differential equations, a couple of rate equations, with the temperature corrected for the boundary layer in the shock tube.
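As a schematic of the exchange mechanism he describes (the atom labels a, b, c are added here to mark the exchanged atom, and identifying the triplet with the upper state of the nitrogen first positive system is the conventional reading, not something spelled out in the interview):

\[
\mathrm{N}_a + \mathrm{N}_b\mathrm{N}_c\,(X\,{}^1\Sigma_g^+) \longrightarrow \mathrm{N}_a\mathrm{N}_b\,(B\,{}^3\Pi_g) + \mathrm{N}_c
\]

The atom exchange provides a spin-allowed route to the triplet state, which is why the reaction can proceed much faster than the electronically forbidden direct collisional excitation would allow.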

ZIERLER: To go back all the way to undergraduate, when you realized for math that you wanted to do something that had more applied value, was this it? Were you in your research zone, doing what you wanted to do?

FLAGAN: I would say no, that I was developing much stronger interests in things with direct societal benefit. But it had satisfied one need. I did get an occupational deferment as a graduate student.

ZIERLER: When the draft ended in 1973, was that a big sigh of relief for you?

FLAGAN: When the lottery came in, I had this occupational deferment. I had a draft number that looked like it was probably safe but was not a clear thing. I was on pins and needles for the year. I did not immediately withdraw my deferment to go into the lottery because I had a sure thing versus a gamble.

ZIERLER: What would you say were the principal conclusions or contributions of your thesis research?

FLAGAN: That was just my master's; that was not my thesis. It was a starter project that took me up to the qualifying exams. After I went through the qualifying exams, I was really torn. I had additional projects. I started on a project that would've taken the kind of chemistry we could do with these shock tube techniques a whole lot further. It was beginning to focus on some of the chemistry related to pollutant formation. It was also a very dangerous experiment.

ZIERLER: In what way?

FLAGAN: With the shock tube, the shockwave travels at many times the speed of sound. It hits the end wall of the shock tube and reflects back. The timescale you have for the experiment is measured in microseconds. The technique that I was asked to develop was to sample from the end wall of the shock tube after the shock had reflected: extract a sample, collect it onto a cold substrate to capture the highly reactive intermediates at cryogenic temperatures, and then evaporate them into a mass spectrometer to get their chemistry. To do that, I had to make a valve that would open with a timed delay after that shockwave had reflected, stay open for a small period of time, about ten microseconds, and then close vacuum-tight. Trying to find a way to put the energy into this valve to move it that fast was the problem; at that point in time, there were no obvious answers. There may have been to others.

We did not have the luxury of internet searches at that time. I spent a lot of time in the library looking for what had been done. I ultimately ended up with basically a shutter that I would shoot past an aperture. I needed a way to put the energy into that shutter so that I could control when it would fire, probably with a 50-microsecond lag, and it would then stay open for this short time and close. The energy source I found was the detonators used for the explosive bolts that separate missile stages. This is a little blasting cap. I had to go through explosives training for that, and ultimately had to get a license to be able to deal with these things because it was on a university campus. I actually had to interview with the state fire marshal of Massachusetts as part of getting that license. I got the license, built the valve, and tested it. The first test failed because the energetics of the valve motion were so high that when the shutter reached the stop, it welded in place. The shockwave in the metal actually welded it.

ZIERLER: What kind of temperatures are you talking about here?

FLAGAN: I don't think I ever calculated the temperature.

ZIERLER: But really, really hot.

FLAGAN: Basically, you're compressing something that doesn't want to compress, doing a lot of work on it. Yes, it was very hot. I started redesigning that valve, but on the other side, I also needed to do the optical spectroscopy. After going through all the training on the explosives, I realized that I was going to have a photomultiplier tube with a two-kilovolt power supply ten centimeters away from a detonator that could take your hand off. That started seeming like a very difficult experiment. Then, I started talking with my advisor and said, "OK, to do this, we're going to have to move the shock tube. There's this one place on campus that has the blast-containment capabilities to deal with this." The cost started going up and up. I ultimately decided that was not a thesis project I wanted to continue. My advisor left for the summer and left me to think about what I was going to do. I became very discouraged. I ultimately left a note on his desk saying I'd turned down my financial aid for the summer, put everything I owned in a friend's basement, and took off to do some thinking.

ZIERLER: Where'd you go?

FLAGAN: I actually drove across the country all the way to Washington, visiting friends along the way, hearing from some of my friends what they were doing in industry, all the different possibilities. Ultimately, I decided that yes, getting a PhD sounded like what I would really like to do, but continuing to work on that particular problem, but also on the defense-related problems was not where I wanted to be. When I got back to MIT, my advisor was willing to take me back, which was not a sure thing. [Laugh]

ZIERLER: He believed in you.

FLAGAN: He was working on a proposal and asked if I would be willing to do some experiments on this combustion problem where they were trying to understand the fate of organically bound nitrogen in fossil fuels. Coal typically contains about 1.5% nitrogen. Fuel oils or heavy oils contain less, but half a percent to a percent is not uncommon. When that burns, what happens to it? Nitrogen oxides were very much an area of concern for air pollution because of Los Angeles and photochemical smog. That had brought the nitrogen oxides out as a key player. Reading the literature, the people who had mentioned the problem at all at that point in time said, "Oh, just treat that nitrogen like you do nitrogen from the air." In other words, take nitrogen that could be oxidized exothermically to form nitrogen oxides and treat it the same as nitrogen from the air, which requires an exceptionally endothermic reaction to produce nitrogen oxides. That made no sense. The question was, what happens when you burn these fuels? There was an apparatus that another professor had in his laboratory for studying the kind of combustion that occurs in things like gas turbine engines.

The thought was to put some nitrogen into some fuel oil and find out what it does to the nitrogen oxide emissions. This was a turbulent combustor. We went in, mixed some pyridine with aviation kerosene, and burned it; I varied the turbulence intensity in the reactor and the fuel-to-air ratio and mapped out what happened to the nitrogen oxide emissions. I saw the effects of mixing, saw the effects of stoichiometry, and got this incredible data set that clearly said the common wisdom was wrong. But it did not tell us anything obvious about how to address the problem. I started digging into that problem, and several months later, my advisor suggested, "Maybe we ought to forget about that earlier project," which I'd long since forgotten about. But that gave me license to ship back the remaining detonators I had not used. Out of that one-month experiment, I got this beautiful data set; then I spent three years trying to make sense out of it.

ZIERLER: What were some of the challenges in trying to make sense out of the data?

FLAGAN: The reactor we were working with was one in which you are mixing the reactants as they are reacting. In the typical chemistry experiment, you would have a well-defined stoichiometry, you would know how much of each reactant is present, and you would follow what happens over time. Here, we're doing mixing, so we have a distribution of composition within the reactor. We started digging into the chemistry using techniques that were not really introducing any new chemistry, but looking at how this would occur in a practical system. It became clear that if just the exothermic oxidation were the key, if it were like burning the carbon in the fuel, then all of the nitrogen would come out as nitrogen oxides. But we found that at a stoichiometric mixture, we would get the peak in the amount of nitrogen oxide produced from that organic nitrogen. As we went out to fuel-rich conditions, the amount of nitrogen oxides coming out was lower.

That said that some other chemistry was going on. You could go through and explore what was happening, but even at the peak, the amount of nitrogen oxides produced, at least the increment above what you would get just from the nitrogen in the air, was much less than the amount of organic nitrogen we put in. That said that the mixing process was very important. Then, the question was, how do you capture this complex chemistry that's occurring simultaneously with turbulent mixing? Consider modeling something like the internal combustion engine. At that time, most gasoline engines were premixed: they had a carburetor that mixed fuel and air before the mixture went into the cylinder and burned, so it was a relatively homogeneous mixture that was burning. Models had been developed to capture the complex chemistry going on there that would treat the fast reactions as equilibrated, with the approach to a final, full equilibrium constrained by the slow chemistry. It's a rate-constrained approach to the equilibrium state. I was thinking about that to describe this nitrogen chemistry. But I had to superimpose it on a mixing process. I started looking at the classical models of mixing, which typically assume some sort of a probability density function and calculate moments. But the chemistry that was involved was second-order chemistry.

It basically just occurs in the tail of the distribution. What's happening at the mean is not interesting; what's happening in the tail is interesting. And if you get that tail wrong, you don't capture what's going on in the real world. The challenge became how to capture that tail. I dug through the literature, trying to figure out what was going on there. I took additional courses in fluid mechanics, especially turbulent fluid mechanics, to learn as much as I could about how people model these complex flows, and came to realize that these models were not capturing the full probability density function. Given the computational tools of the time, they were forced to look at moments. We had something more complex going on. Digging through the literature, I eventually found an interesting approach by a chemical engineer at Northwestern who had been dabbling with something approaching a Monte Carlo simulation of the mixing process, very crude.

But he was doing it on absolutely fictitious chemistry. I was trying to deal with real chemistry. We knew a lot of the kinetics. We were trying to capture the real world. I started from that idea and found other works that gave me the timescales. I developed a very crude model of turbulent mixing with chemical reactions. It did not satisfy the rigorous fluid mechanics, and I was uncomfortable with that aspect. But it was capturing what was going on with the chemistry. I started developing the models. When I finally got the whole model together and ran it for the first time, that one run of the computer program cost more than my annual stipend, at which point my advisor said, "You can continue on this, but you have to find a way to get the cost way down."
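As a minimal sketch of the kind of model he describes, a coalescence-dispersion (Curl-type) Monte Carlo mixing calculation with a second-order reaction can be written in a few lines of modern Python. Everything here, the parameter values, variable names, and the particular mixing rule, is an illustrative reconstruction of the general technique, not Flagan's thesis code:

```python
# A crude Curl-type Monte Carlo model of turbulent mixing with a
# second-order reaction A + B -> P. Illustrative only: all values
# are arbitrary, and this reconstructs the general technique, not
# the original program.
import numpy as np

rng = np.random.default_rng(0)

n_parcels = 10_000   # notional fluid parcels in the reactor
k = 5.0              # second-order rate constant (arbitrary units)
mix_freq = 2.0       # pairwise mixing frequency (1/time)
dt = 0.01            # time step
n_steps = 500

# Unmixed feed: half the parcels carry pure A, half pure B.
A = np.zeros(n_parcels)
B = np.zeros(n_parcels)
A[: n_parcels // 2] = 1.0
B[n_parcels // 2:] = 1.0

for _ in range(n_steps):
    # Mixing: random disjoint pairs of parcels relax to their mean.
    n_pairs = min(rng.poisson(0.5 * mix_freq * n_parcels * dt),
                  n_parcels // 2)
    idx = rng.choice(n_parcels, size=(n_pairs, 2), replace=False)
    i, j = idx[:, 0], idx[:, 1]
    mean_A = 0.5 * (A[i] + A[j])
    mean_B = 0.5 * (B[i] + B[j])
    A[i], A[j] = mean_A, mean_A
    B[i], B[j] = mean_B, mean_B
    # Chemistry within each parcel (explicit Euler on dA/dt = -k*A*B).
    r = k * A * B * dt
    A -= r
    B -= r

# Compare against a perfectly mixed (mean-field) calculation.
Am = Bm = 0.5
for _ in range(n_steps):
    rm = k * Am * Bm * dt
    Am -= rm
    Bm -= rm

print(f"segregated ensemble <A> = {A.mean():.4f}")
print(f"perfectly mixed     A  = {Am:.4f}")
```

Because A and B start in separate parcels, reaction happens only where mixing has brought them together, so the ensemble mean stays well above the perfectly mixed prediction; that gap is the tail-of-the-distribution effect he describes.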

ZIERLER: How close to over-budget were you at this point?

FLAGAN: I was way over-budget. [Laugh] And this was a project that he really didn't have funding for. It was that extreme a challenge. I started looking for ways to do it. I ultimately found a computer I could get access to at very modest cost. It was a wonderful computer. It had 16 kilobytes of memory. I started looking at how to put all of this together into a program I could run on that machine. All of the detailed kinetics and thermodynamics evaluations got translated into lookup tables. I made a lot of approximations in order to simplify it down, but I was ultimately able to capture the essence of the problem and really show why this fuel-rich region dropped the nitrogen oxides, and that that was the key throughout this turbulent mixing process.

ZIERLER: How close did this get you to the defense itself?

FLAGAN: It ultimately got me to the defense. It also brought out a lot of skeptics among the fluid mechanists.

ZIERLER: What were they concerned about?

FLAGAN: That I was not doing rigorous fluid mechanics. I had recognized that to capture this kind of chemistry, I had to approach the real mixing, but that I could make lots of approximations in doing so. I had to get the mixing timescales right, but to capture the essence, I didn't have to get all the details right. What I had to get right was the chemistry. Ultimately, when I defended my thesis, in the style the department at MIT had at the time, the thesis defense and all the questioning were open. I had a packed room for the defense, with all of the big guns in fluid mechanics in there.

ZIERLER: Who was on your committee?

FLAGAN: It was not so much who was on the committee because any faculty who were there, in essence, became part of the committee and were involved in the vote.

ZIERLER: A community effort.

FLAGAN: It was a community effort. Normally, the thesis defense was an hour-long seminar, including questions. It was pretty much like the public seminars we have here, except that here, we then go to a private committee meeting where other questions may be asked.

ZIERLER: I assume this was a rather animated conversation during the defense.

FLAGAN: Yes. Normally, they took about an hour. An hour and a half would be a really long one. Mine lasted three and a half. [Laugh]

ZIERLER: How'd you fare?

FLAGAN: My seminar lasted three and a half hours. Ultimately, I passed. But it would've been very interesting to hear the discussion. [Laugh]

ZIERLER: Were there any criticisms that were productive for you long-term in the way you approached the subject?

FLAGAN: What came out of that was that yes, I had been able to capture the essence of this really complicated problem, and that one needed to push the fluid mechanics to look in much more detail at the probability density function, the distribution of composition, in these reacting systems. That was critical. It was also clear that a way had to be found to develop that formalism rigorously from the fluid mechanics, that the approach I'd taken was a good first step but did not go nearly far enough. Then, the question became, where did I want to invest my effort? Did I want to go into the rigorous fluid mechanics of turbulence, which becomes an end in itself, or did I want to work on problems such as the environmental implications of what I had done?

ZIERLER: Given these questions, what were your prospects after you defended? Where did you want to go next?

FLAGAN: I wanted to get a job. [Laugh] It was not a good time. That was just after the end of the Vietnam War. For all the people who had been doing research in combustion, the primary source of support was the Department of Defense, and suddenly that was drying up. The opportunities were not so great in that arena. I interviewed in industry, I interviewed for some faculty positions, I considered some post-docs. I ultimately stayed on the research faculty at MIT for a couple of years.

ZIERLER: What industrial opportunities were you considering? What might've been relevant for your field of research at this point?

FLAGAN: One company that was very interested in me was a contract research company in the Boston area, a little company called Aerodyne Research, with whom I have collaborated for many years since. But the research they were doing at the time was all defense-related, and that was not the path I wanted to take. I was offered positions in the oil industry; again, combustion is a big thing for them. But I wasn't too thrilled with what they were offering. Thinking about the longer-term implications of the combustion work, they were doing what was necessary to deal with regulators, but solving the problems wasn't their focus. I interviewed for some faculty positions and was offered one, but I saw that the university had some really major problems in how it was structured and how it treated its junior faculty, so I turned down the offer without anything else in hand. I kept working at MIT, developed some new projects, and got my own funding to start looking at soot formation, which was my second not-so-successful attempt at looking at aerosols. When I had done summer work measuring particles going out of smokestacks, that was actually aerosol research, with instrumentation I would never touch these days. [Laugh] Or at any point after I came to Caltech.

ZIERLER: To what extent was your post-doctoral research a continuation of your thesis research, and to what extent was it an opportunity to look at new areas?

FLAGAN: I started looking at new areas of combustion. I did some work with what was then the National Bureau of Standards on questions related to fire safety, spent some time at NBS, and got involved with the outcomes of some really disastrous fires. I also got involved in work related to gas turbine engines and, in particular, started looking at soot formation, developing a system for extracting soot directly from the flame region of large turbulent combustors.

ZIERLER: Just a nomenclature question, does soot automatically mean human-caused? Is there such a thing as naturally occurring soot?

FLAGAN: Have you ever seen a forest fire? [Laugh]

ZIERLER: Fair enough.

FLAGAN: Yes. Soot has been around for a long time. It's also been a technology for a long time. The first nanotechnology was making ink, which is soot. [Laugh]

ZIERLER: When did you make the decision to pursue academic appointments and not go into industry? Or was that always an open question until the opportunity at Caltech presented itself?

FLAGAN: It was an open question until the opportunity at Caltech. What was then the Environmental Engineering Science Department at Caltech had been trying to hire an atmospheric chemist for some time. There was one candidate they had been trying to hire, and it became clear that person was not going to satisfy both the environmental engineering and the chemistry faculty. He was a chemist, and there were questions about his chemistry, as I understand it. They reactivated that search and called two people I know of at MIT. One was someone who'd been involved with me in my work in mechanical engineering, and the other was someone in chemical engineering who had been on my thesis committee, Adel Sarofim. The same week I gave my proposition seminar as I started my thesis, there was another proposition seminar in the MIT catalogue with exactly the same title. I went to that seminar. I didn't like the work the student was proposing, but it was clear I wanted his advisor on my committee. I got him on my committee and ultimately worked with him quite closely.

We'd done a massive review for the Senate Public Works Committee on nitrogen oxide emissions. They called these two people. Both recommended me; they said I wasn't an atmospheric chemist, but that Caltech might want to look at me. That got me an interview. I went out for my one-day interview, which lasted three days. In my interview, there were a couple of people who were working on this concept of aerosols. I'd touched on the fringes of it but really didn't know what an aerosol was. I didn't know any of the aerosol science. But there was this fellow, Sheldon Friedlander, who was on the faculty here for a long time and ultimately went over to UCLA. His primary home was environmental engineering. And there was John Seinfeld. I had very good conversations with them. Sheldon saw what I'd been doing in combustion and started suggesting that someone with my background might be able to answer some big questions about air pollution. It was really in my interview that the collaboration with Sheldon and John got started.

ZIERLER: A bit of a tangential question. In the mid-1970s, there was some concern over the concept of global cooling. Did that register with you at the time? Did you ever think about that?

FLAGAN: Oh, yes. Global cooling and especially nuclear winter.

ZIERLER: Was that relevant to what you were studying?

FLAGAN: It was relevant. I'd have to say at that time, I did not really have the big picture of the global atmosphere. I had not really looked at the structure of the atmosphere and how all these pieces would fit together. But it certainly caught my attention.

ZIERLER: In subsequent conversations, we'll pick up with the beginning of your career at Caltech. But just to clarify in the timeline, when you are considering the opportunity at Caltech, are you more broadly on the job market? Are there other institutions you're applying to at that point?

FLAGAN: At that point, I did not have anything else in the pipeline. I had to start raising a lot more money if I was going to stay in the position I was in.

ZIERLER: Right opportunity at the right time, it sounds like.

FLAGAN: Yes. Los Angeles was not my idea of a good place to live.

ZIERLER: On that note, we'll pick up next time and see what happens next.

[End of Recording]

ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It's Friday, February 25, 2022. I'm delighted to be back with Professor Richard Flagan. Rick, as always, it's good to be with you again.

FLAGAN: Thank you.

ZIERLER: Today, we're going to go back to 1975. Just to set some context, I wonder if you can reflect on just how peripheral aerosol research was relative to what you were doing prior to your arrival at Caltech?

FLAGAN: In my graduate work and my post-doc period at MIT, it was all combustion. As a postdoc, I did some work looking at soot formation. Soot is an important kind of aerosol. I was probing inside the flame and trying to quench the samples so that we could look at the particles as they existed in the flame. We actually collected them into liquid water from a very hot environment. The techniques we used couldn't provide any of the kind of information that we now focus on with aerosols. We were really looking at conditions leading to soot formation. I had touched on aerosols there, but not really with any focus on the aerosol as an aerosol. And I had done a tiny bit of work one summer as an undergrad, when I did emission measurements from coal-fired power plants; there, I was sampling an aerosol. But the techniques we were using were nothing like anything I have used since. [Laugh]

ZIERLER: What were John and Sheldon working on at that point? And was it independent or a collaboration?

FLAGAN: Well, John, at that point, was sort of an applied mathematician. He had been working on a number of rather complex problems of the traditional chemical engineering kind, and at that point, he was beginning to look at aerosols. The first PhD thesis exam I sat on here was for one of his students, on the theoretical solution of one of the key equations in looking at aerosols. But it was very mathematical and very far from the real world. Sheldon was focused very much on aerosols. He had been involved in some studies looking at air pollution. He developed the first version of the atmospheric chamber experiments that we've used at Caltech. The laboratory he had was up on the roof of the Keck Lab building, a small space for instruments and a platform outside where a big 60-cubic-meter, pillow-shaped Teflon balloon was used for studying the reactions. The experiments then were very dirty; they involved drawing in ambient air and adding particular reactants to it, and the ability to look at the aerosol was pretty crude at the time.

ZIERLER: It would've been very late in his life, but I'm wondering if you ever crossed paths with Arie Haagen-Smit.

FLAGAN: A few times, yes. He was a delightful man. Earlier in his career, back in the late 40s and early 50s, he was a synthetic organic chemist working on natural products, and because odor was a significant factor in artificial flavors, which was one thing he was looking at at the time, the air in his lab was cleaned through activated carbon. At the end of the day, he would step outside, and he could smell and taste the smog. And he asked what it was that he could taste. He developed the insights that laid the basic foundation for understanding what photochemical smog is. He was a true pioneer. He ultimately became the first chair of the California Air Resources Board, appointed by Reagan, then governor, when that board was finally created. He was a very important figure in answering the question, "What is this nasty stuff in the air?" He had done that first work back in the late 40s and early 50s. But it took a long time. The smog was still very bad when I met him around 1975.

ZIERLER: Then, the collaboration that developed between you, John, and Sheldon, to what extent was it building on what Arie had already done, and to what extent were you posing new questions that were separate from his research?

FLAGAN: Arie Haagen-Smit had laid out the basic chemical mechanism. But getting to the point where you could actually predict what was going on in the atmosphere was far beyond where he was able to go. At that point, we were starting to look at the Los Angeles Basin as a big chemical reactor, applying chemical engineering principles not to a few liters or a few cubic meters but to something like 1,000 cubic kilometers. Much larger volume, much more heterogeneous. It was, "We know the basic physics, we know some of the chemistry. Can we model this, and can we use those models to find a way out of the mess?" Looking at it from a theoretical and modeling point of view. Sheldon had built a chamber facility that was able to simulate, in a captive parcel of air, some of what was going on in the atmosphere. When I interviewed, Sheldon introduced me to aerosols and started asking questions. I'd done work on coal combustion, and fossil fuel combustion in general. Could what I was doing help explain why, in many cities, the aerosol was enriched with heavy metals? Before I arrived at Caltech, we had written a joint proposal in which I proposed to build a combustion facility, not to work with liquid fuels, which was what I had done primarily at MIT, but to work with coal, a facility to start probing directly what was being formed in the combustion. That was one part of it. Another part was going to be using the atmospheric chamber, and John was going to continue modeling.

ZIERLER: To the extent that this really was a new direction for your career and research, what resonated about this project, both from the science and from the application perspective?

FLAGAN: From the science side, as I started looking at aerosols, I realized that some of the concepts I had found to be very important in thinking about how reactions take place in a turbulent combustor carried over; I had to think about the probability density function for composition. I looked at the mathematical form of the theory for describing aerosol dynamics, the general dynamic equation for aerosols, which was described in a book that Sheldon had published. The formulation was very familiar. The concepts were natural to me as far as how to think about the problem. There was a scientific similarity that made me comfortable with it. And the question of how to solve a big problem that's affecting lots and lots of people was what had driven me into what I did at MIT; this was going much more directly at the question of how you would deal with pollution and hopefully find a way to avoid some of the health impacts that were definitely being seen and felt at the time. You couldn't ignore it then in Pasadena.
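The equation he refers to, the general dynamic equation, is written in the aerosol textbooks (the notation here is the standard textbook one, not taken from the interview) for the size distribution function \(n(v,t)\) of particles of volume \(v\), with growth rate \(I(v)\) and coagulation coefficient \(\beta\). Keeping only the growth and coagulation terms, it reads

\[
\frac{\partial n(v,t)}{\partial t}
= -\frac{\partial}{\partial v}\bigl[I(v)\,n(v,t)\bigr]
+ \frac{1}{2}\int_0^{v} \beta(v', v-v')\,n(v',t)\,n(v-v',t)\,dv'
- n(v,t)\int_0^{\infty} \beta(v,v')\,n(v',t)\,dv'.
\]

The resemblance to a probability density function evolving under competing processes is the scientific similarity he describes.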

ZIERLER: To what extent were the motivations local, this being a problem in the Los Angeles Basin, and to what extent was this a problem worldwide, in any population center that had automobiles, factories, burning of trash, things like that?

FLAGAN: There had been major, well-documented air pollution episodes that had killed large numbers of people years before that. The air pollution episode in London in 1952 killed on the order of 5,000 people over the course of one week. There were a number of other locations; a lot of those episodes were related to coal combustion. Photochemical smog was seen most profoundly in Los Angeles. And it was very, very substantial. You couldn't ignore it. People could be here for months and not know that there were mountains right next to Pasadena.

ZIERLER: Why? What's unique about Los Angeles that it would host this problem at that magnitude?

FLAGAN: First, we have the mountains that seal in the airflow. The clean air coming in from over the ocean gets blocked by the mountains, so what is emitted into that air is given a long time to cook. The geography of the Basin and the mountains, the sunlight, and the huge number of people and their emissions all combine to give a very reactive mixture a lot of time to react.

ZIERLER: Was the catalytic converter at this point identified already as a possible solution?

FLAGAN: That came later. The models that developed said we had to control the organics that were being emitted to the atmosphere; you also had the nitrogen oxides. These were very important parts of the puzzle, and the basic chemistry you could see on a small scale. Putting it all together into models, as John Seinfeld was doing, made it very clear why different regions of the Basin had different levels of air pollution.

ZIERLER: Let's delve a little deeper into exactly what made up the puzzle. Obviously, at this point, it was known to be emissions. But emissions from what? What were the possible candidates, what eventually was confirmed, and what eventually did not turn out to be a factor among all of the emitting elements in Los Angeles?

FLAGAN: Well, the automobile was identified as a key, and really, Haagen-Smit led the way on identifying the automobile as a source. With some support from Arnold Beckman, he used cryogenic traps to collect a small amount of the essence of smog, which he then analyzed chemically, and he found a lot of organic nitrates in it. Organics could come from a lot of sources, the automobile being one very important class. The nitrates pointed him toward nitrogen oxides. Ultimately, with the nitrogen oxides, he identified the photocatalytic cycle that is the underlying core of the problem, and then, along the way, the additional contributions the organics made to enhancing that cycle, increasing the ozone.

ZIERLER: This comes directly after the Arab Oil Embargo in '73 and '74 from the Arab-Israeli War, and that, in turn, was a toehold for the Japanese automobile industry to come to the United States because they produced more fuel-efficient cars. Was that an opportunity right from the beginning, the notion that, at least from an economic perspective, American consumers had renewed interest in fuel efficiency?

FLAGAN: I don't think that was such a major driving force at that point in time, but I could be mistaken. One thing that did happen when regulations finally came into play: those regulations brought with them standards that were based on vehicle miles traveled. For a small, fuel-efficient car that's burning less gasoline, you might have to do less to control the emissions to within that mass per mile traveled than you would for a heavier car with a larger engine.

ZIERLER: Logically, it would just stand to reason that a more fuel-efficient car would emit less pollution.

FLAGAN: Yes. And I have not studied the rationale for how the regulations were first written, but that would seem to be a logical component of it.

ZIERLER: To delve a little deeper into the collaboration itself, as you, Sheldon, and John were conceptualizing this project, what were the strengths that each of you individually brought to it that made it ultimately so successful?

FLAGAN: Well, John brought to it this deep mathematical and ultimately computational approach to looking at a problem that was complex, covering a huge area, a huge number of sources, and many reactive species. He brought the ability to bring all of that together into one coherent picture of what's happening in something like the Los Angeles Basin. Sheldon had been working on aerosols for a long time. He really emphasized the role of the aerosols and developed a lot of relatively simple but very informative understanding, simple models to describe what was going on. He was good at getting different people to collaborate on problems of this complexity, and he brought in a group from the University of Minnesota who had the instruments then available to measure the small particles in the air. In fact, the first instruments to really let you look at the dynamics of the aerosol had just been commercialized in 1974 and '75. I came in with a background in combustion. I could look at the sources. I didn't have any background at all in the atmosphere. I didn't have any background at all in aerosols.

Sheldon suggested strongly, as I was trying to figure out how I was going to contribute to this, that I write a review paper on the problem of particle emissions from coal combustion: "Take a subject where you have some of the fundamental background but none of the detailed knowledge, dig into the literature deep enough to be able to tell the story, and synthesize it." I did that. I started developing some simplistic models that captured the essence of particulate emissions from combustion, and they predicted some things that had not been observed. I predicted that the coal-combustion systems would be emitting smaller particles than what had been observed by the people who had taken the new instruments out to the field and measured what was coming out. Ultimately, I was proven right: the sampling systems had such a long residence time that the particles were coagulating, the numbers were going down, and the sizes were getting bigger.
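His inference about the sampling lines follows from the standard coagulation result. In one common textbook convention (the notation is assumed here, not taken from the interview), a monodisperse aerosol with number concentration \(N\) and coagulation coefficient \(K\) obeys

\[
\frac{dN}{dt} = -\tfrac{1}{2}\,K N^2,
\]

so at the high concentrations leaving a combustor, a long residence time in the sampling line drives the number down and the mean particle size up before the instrument ever sees the aerosol.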

ZIERLER: From the computation to the field instruments, what were the key technologies that made this research possible in the collaboration?

FLAGAN: This was a time when electronics were allowing more sensitive measurements than had been practical previously. A big part of what I was looking at was the formation of very small particles. The techniques that had been developed by this group at Minnesota involved charging the particles, then sorting them with respect to size in an electric field, stepping through the electric field strength to get a distribution of signals that you could invert to find the size distribution. Bigger particles you could do by light-scattering methods: you'd measure pulses of light produced when a single particle passed through a beam, and measuring the peak voltage from individual particles lets you get the distribution of bigger particles. The instrumental tools were available to begin to look at the dynamics. When we went to the chamber, the particles were generally smaller than what we were looking at with the coal combustion. We had particles that went up to 100 microns when we were dealing with the coal, and down to ten nanometers. When we were looking at the atmosphere, we were typically dealing with particles smaller than a few microns. The instruments were a key part of that. The computers were getting bigger and faster, and that's what made it possible to go beyond modeling just what happens in a tiny parcel of air and start thinking about how you would model something like the LA Basin. But that wasn't my story; that's really John Seinfeld's.
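The sorting he describes rests on the electrical mobility of a charged particle. In the standard aerosol notation (assumed here, not given in the interview), a particle of diameter \(D_p\) carrying \(n\) elementary charges \(e\), in a gas of viscosity \(\mu\), has mobility

\[
Z_p = \frac{n\,e\,C_c(D_p)}{3\pi\mu D_p},
\]

where \(C_c\) is the slip correction factor. Stepping the electric field strength selects particles of successive mobilities, and inverting the resulting signals against this relation recovers the size distribution.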

ZIERLER: On the political side, let's start with funding sources. What were the agencies or foundations that supported this research?

FLAGAN: For me, primarily the National Science Foundation in that early time. The California Air Resources Board [CARB] supported a lot of the work looking at the modeling of air quality in the Los Angeles Basin. CARB also supported the chamber studies, where we were trying to figure out, for different mixtures, how much aerosol and particulate matter is being produced. Those were really the keys early on.

ZIERLER: Given, I would assume, some level of concern from industry, did that pose any political problems at Caltech?

FLAGAN: There definitely were problems that came up later on with industry. Some years later, Michael Hoffmann received some funding from the Air Resources Board to look at the chemistry of cloud water, especially fog water, in the Los Angeles Basin. It was a long path to how that funding came about: there had been some people supported by the power company who had proposed to the Air Resources Board, the Board looked at a prior proposal that Jim Morgan had written on the same topic, and ultimately, Michael ended up with some funding. Jim was winding down his research activity at the time, and Michael took it on, trying to sample the fogs in Pasadena and the Los Angeles Basin. Michael and I had lunch together at the Athenaeum, and he told me about this funding and said, "Here's what's been done before to sample fog water. What could we do to do it better?" In classic Athenaeum style, I sketched out three instruments on a place mat. He had one student who was a very energetic French-American graduate student and another student on the project who was a very laid-back Californian. The laid-back Californian took the best of the ideas; the French-American took the worst. Three months later, he had a rotating-arm sampler installed up on the roof of the penthouse on top of Keck, and we were sampling the fog water. Michael turned in his first quarterly progress report to the Air Resources Board with data in it from an instrument that hadn't existed three months earlier. That started basically an attack on the project. It got some press. The Air Resources Board issued a press release, so before anything was published, it was suddenly in the newspapers.

That was the sort of publicity that was not very popular with the power companies. I think it was the BBC that did the first big news story. We had an instrument that we hadn't really characterized at that point. We couldn't say much about exactly what we were collecting, other than that it was the fog water, we were seeing the chemistry, and it was very, very acidic, far more acidic than any of the acid rain that had been in the press. And they basically started an attack on Michael. This included talking to people on the Caltech faculty about this bad work that was being done, "How could this be allowed at Caltech?" and so on. That was a direct assault. Ultimately, the science was good. It produced a lot of very interesting data. I eventually had to retire all of those rotating-arm samplers. My concern had been that, mechanically, it was a risky proposition having a large arm rotating at high speed to sample the fog water. One underwent a fatigue failure, and one end of the arm flew a long distance. I heard about it and told the student, "I want all of the instruments back in the lab within 12 hours." That included bringing some back from Northern California. They did, and that was the last time they were used.

ZIERLER: I know it comes later, but given these political challenges, was Caltech leadership always supportive, the president, provost, and board? Were they behind this research because they knew the science was solid?

FLAGAN: Yes. Most of it, I didn't see. Michael was the visible person on this. They were supportive throughout.

ZIERLER: What was the timescale? You join Caltech in 1975. Given the enormity and complexity of the problem, how long did you envision this research project to take?

FLAGAN: Well, the first proposal was a typical proposal, something to be done in three years. What we're talking about now goes way beyond that; it's many projects. I really did not have a concept of how long it would take at the time. What I saw was a problem where I felt my viewpoint would allow me to gain some insights that maybe other people couldn't. I had the patience to deal with experiments. I was willing to dig into instrumental methods that I had never seen before, figure out how they worked, and then start trying to make them better. But it was a huge problem that had been going on for decades, and it continued for decades. [Laugh] We're not out of the woods yet.

ZIERLER: Do you have a clear memory of first getting data back that was promising that showed you were on the right track?

FLAGAN: For the coal combustion work, the first data didn't come for a long time. I came in as an experimentalist, and I started building the apparatus without having a place to put it. I was not given laboratory space; I was somehow supposed to fit within what else was going on. I did, but it was not by any formal agreement. I had to move someone else out in order to install my apparatus. It took a while to get everything going, but we did certainly have interesting results within that first grant, and they showed that the basic hypothesis was correct, that what we were seeing was vaporization of metals during the combustion of coal. We couldn't say exactly what form they were in as they vaporized, just that the metallic elements were appearing in very small particles that could only be formed by vaporization. We were able to develop models to describe it, and they were consistent with the overall idea that we'd had from the beginning. That was really not a surprise because I was building on a lot of evidence that was out there but had never been pulled together.

ZIERLER: This might be a hard question because it would ask you to get into Haagen-Smit's mind, but just in terms of extrapolating the science that he was doing earlier, were you finding things that existed on a continuum of what he had suspected, or just by virtue of the instrumentation, the collaboration, your background, were you finding new issues that he might not have considered, even from a theoretical perspective?

FLAGAN: He was a chemist, and he looked at what was going on from a chemist's viewpoint. The concept of aerosols and the aerosol dynamics, he knew what was going on. He knew the reason the mountains were not visible so much of the year was due to the chemistry that he was seeing, but he did not have a comprehensive picture to say how you go from these small volatile molecules that are released into the air all the way to producing particles that would survive in the air in the heat of the summer, that their volatility is so low that they won't evaporate. I don't think he had anywhere near a full picture. We still don't have a full picture of that. But we know a whole lot more now than we did way back then.

ZIERLER: When in the timeline did you, Sheldon, and John have enough data, enough of a theoretical basis that you could start to think about solutions, applying the basic science to societal ways to mitigate aerosol pollution?

FLAGAN: This, again, is really a team effort. I came in 1975, and Sheldon left Caltech about 1979. He was here for that first period, then after he left, the roof lab on top of Keck was idle for a time. Finally, John and I got together and started looking at how to revive it and what to do with it. John was developing models, but there were gaps in the understanding of some of the chemistry and the ability to predict–he was developing the models for the LA Basin and trying to describe a broader scale. It would be better to talk to him about exactly what he was thinking at the time. The initial focus, from a modeling point of view, was ozone. Ozone involves the photochemical reactions, with nitrogen oxides playing a really key role. If there's nitrogen dioxide, it can be photolytically decomposed to nitric oxide and a free oxygen atom. The oxygen atom can combine with O2 to produce ozone, and the ozone can oxidize the nitric oxide back to NO2.

There's a catalytic loop going on where photons are keeping the process going, and it builds, in the simplest models, to a steady state. Then, hydrocarbons come in and increase the pace around that cycle. But they also produce semi-volatile organic compounds and low-volatility organic compounds that can condense. The oxidation of nitrogen oxide is part of the story. That was clearly identified as the key component by Haagen-Smit. Not with all the details that we know now. This is a continuously evolving field. There are still things we don't know. But a lot of the chemistry was understood, and the role of the automobile in producing ozone was quite well-understood. The automobiles also emitted a lot of carbon monoxide, and when the inversion was low in winter, concentrations could get quite high. The unburned hydrocarbons are part of that cycle along with the nitrogen oxides. Figuring out how you'd rebuild this complex piece of machinery that is tuned to convert fuel into the power to drive a vehicle, to change it so that you change the mix of what comes out, was a real puzzle. And that puzzle started being addressed before I got to Caltech.
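
[A minimal sketch of the cycle described here, written as its three core reactions, with $h\nu$ a photon and M any background molecule that carries away excess energy:

$$\mathrm{NO_2} + h\nu \rightarrow \mathrm{NO} + \mathrm{O}$$
$$\mathrm{O} + \mathrm{O_2} + \mathrm{M} \rightarrow \mathrm{O_3} + \mathrm{M}$$
$$\mathrm{O_3} + \mathrm{NO} \rightarrow \mathrm{NO_2} + \mathrm{O_2}$$

In the simplest steady state, the photostationary state, ozone settles at $[\mathrm{O_3}] \approx j_{\mathrm{NO_2}}[\mathrm{NO_2}]/(k\,[\mathrm{NO}])$, where $j_{\mathrm{NO_2}}$ is the NO2 photolysis rate and $k$ is the rate constant of the third reaction. Peroxy radicals from hydrocarbon oxidation convert NO back to NO2 without consuming ozone, which is how hydrocarbons "increase the pace around that cycle" and let ozone accumulate.]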

This was known, the role of the hydrocarbons was known. I had looked at it from a combustion perspective previously. The first efforts to control the emissions looked at the hydrocarbons and the carbon monoxide and said, "To get rid of those, increase the temperature and make the combustion more efficient. Making the engine run fuel-lean will cut down the carbon monoxide, and increasing the combustion efficiency will cut down the hydrocarbons." But raise the temperature, and up go the nitrogen oxides. The first emission controls actually exacerbated the nitrogen oxide part of the problem. It was ultimately the catalytic converter, the three-way catalyst that would both oxidize the hydrocarbons and reduce the nitrogen oxides, that enabled controlling both.

ZIERLER: When does the catalytic converter enter the scene? What year would that have been?

FLAGAN: It was around the early 1980s.

ZIERLER: Were you involved at all in the design, proposal, regulatory measures that put this in place?

FLAGAN: Not at all.

ZIERLER: What's the connecting point between the research you're involved in and the realization that this needs to be a standard part of automobiles?

FLAGAN: I never really worked on the automobile side of the equation. I was around it during my graduate school days. There were a lot of people working on engines when I was a student. I was never involved in that part of the story.

ZIERLER: But is your understanding that the regulatory impetus to create catalytic converters in some way is derived from what you and your colleagues were researching?

FLAGAN: What people who preceded me had researched really provided the starting point for that. I was involved in some other aspects that ultimately did get into use. My thesis research, to go back earlier, focused on organically-bound nitrogen in fossil fuels and how it forms nitrogen oxides. Along the way, we discovered that a large fraction of that reduced nitrogen would be converted to N2 during the combustion process. And that started suggesting things with respect to ways to control nitrogen oxide emissions to lower levels. Work was going on on that in industry as well. One thing I got involved in between when I finished my thesis and when I came to Caltech: there was a request from the Senate Public Works Committee to produce a document that would really describe the state of the art for control of nitrogen oxide emissions from stationary combustion sources. Think power plants. That effort was led by Adel Sarofim, who was asked to do that. He agreed to do it if he could bring me on to join him.

We took over a large office at MIT. By the time we got done, we had bookcases on all the walls of that office filled with documents we'd gotten, because we went out to people not just in the academic community but in industry and described what we were doing and who it was for. We had companies sending us patent applications under confidentiality agreements, giving us information that was not even public at the time. We got a huge amount of information. Some of the technologies, using things like urea as a fixed nitrogen source to post-treat combustion products to reduce nitrogen oxide back to N2, came out in that analysis. The ultimate document is a volume about an inch and a half thick. We wrote a review paper on it. It laid out a lot of these technologies, laid out some of the concepts for how to think about the problem, building on what a huge number of people had done. In fact, it formed a significant part of the talk that I gave when I interviewed at Caltech.

ZIERLER: In the late 1970s and early 1980s, there was much research on ozone depletion and the formation of the so-called ozone hole. Had you contributed to that research or drawn on some of that research for what you were doing?

FLAGAN: I got involved in some links to that through one study with Paul Wennberg, shortly after he joined the Caltech faculty, on some measurements of polar stratospheric clouds. They were looking at the north polar vortex, and we were looking at hydrates of nitric acid that played a key role there. I was not directly involved in any work looking at the ozone hole.

ZIERLER: Once you had gathered enough data to understand what was going on at a broad level, what were some of the regulatory processes that might've been available in terms of finding solutions, both at the local, state, and national level?

FLAGAN: In finding solutions, we looked at the local level, the Los Angeles Basin. Huge problem, very complex, many different hydrocarbon species contributing. The automobile was the initial focus. Later on, many other things were considered. The models that John Seinfeld developed were one of the really key tools for advancing regulation. The part I was directly involved in was much more how to get the data. How do we find out the contributions of different hydrocarbons being emitted? Different components of gasoline, how they contributed to the photochemical smog, the aerosol, the particulate matter on one hand, the ozone on the other. The chamber studies that John and I did together provided a huge amount of data. And this is something that's been going on from the late 70s until this day, looking at the different species and how they contribute. The different species being emitted change with time, so there are always new frontiers. Putting that into the models really becomes a key in how you move to solutions. One of the real developments that, I think, was a major factor in cleaning up Los Angeles–we're still not pristine, but we're a whole lot better than we were back then–was that the models John developed were delivered to the Air Resources Board, and they became tools in their toolbox for looking at air quality.

If a question comes up, what are the contributions of a source we haven't thought about? You can put that into the model. Take what you've learned about the chemistry, take what you can find out about the sources, put it into the model, and say, "Is this significant or not?" An example of that came in the mid-80s, when I was on the research committee of the California Air Resources Board. The question came up, "What is the contribution of the solvents from paint to photochemical smog?" The Air Resources Board commissioned a study to have someone gather data on how much paint is consumed in the Los Angeles Basin, assuming that all paint that's purchased is ultimately going to lead to solvent evaporation into the air. What would that contribution be? We gathered that data, they put it into the models that were derivatives of what John had developed, and came back with an answer that this was a significant contributor. It would make a difference if they got rid of that source.

Then, the research side of the Air Resources Board delivered to the actual political leaders the data and conclusions from that study, and they went back and said, "Design a policy and a plan for implementing it." They did that. They decided that, to get the solvents out of the paint, the paint manufacturers were going to need some motivation to help them along. They gave some grants to help get the solvent out of the paint, along with a timeline: they had to actually have something working by this time, or they wouldn't be able to sell their product. The paint we buy in Los Angeles is different than the paint you buy in the rest of the country. In fact, we're right now doing some experiments on the question of what the contributions are from those paints that still have the organic solvents. It turns out to be very, very difficult to get the paints necessary to do the experiments.

ZIERLER: It's an irony. [Laugh] At the national level, to what extent was the EPA an ally or supporter in what you were doing, and how might that have changed over the very different domestic agendas going from the Carter to the Reagan Administrations?

FLAGAN: EPA funded some of my research from time to time, including efforts to develop instruments that would enable measurements that weren't possible when we set out. They supported some of our chamber studies. We've had some very positive collaborations with them over the years. The difference between the Carter and Reagan Administrations, when Reagan came out with his statements that trees cause smog, so the way to clean up the air in California is to cut down the forests–I don't think he went that far, but the implication was there–suddenly, people were not very anxious to start describing the biogenic hydrocarbon sources that would support that statement. Because trees do contribute to smog. But so do automobiles and all of the other sources. The Smoky Mountains were smoky before the automobile existed. If you read descriptions of the Los Angeles Basin dating back to the early 19th century, the descriptions of the haze that hung over the Los Angeles Basin were quite clear.

ZIERLER: Was there anything going on at JPL that was relevant for your research? Was there instrumentation they had that was useful? Or to flip that question around, in light of all the subsequent work that JPL had done with satellites and radar pointed back at Earth, was some of the research you were involved in perhaps a catalyst for JPL's Earth-based research projects?

FLAGAN: Definitely, JPL played a role in my research at many points along the way. One of the earliest ones–this would've been in the early 80s–I was doing work on burning coal and looking at the particles that were produced. I used an electron microscope system at JPL. We were looking at these little agglomerate particles and trying to see what they were composed of. I don't remember the exact reason, but for some reason, that microscope at JPL seemed preferable at the time to what was available in GPS. We did some measurements up there. While we were up there, in the space where the electron microscope was located, there were pictures on the wall that they'd taken. I looked at some of them and started asking questions about one particular set of pictures of agglomerate particles that looked very similar to the things we were doing, except the scales were quite a bit different. I just asked the microscopist some questions, and he could tell me about taking the pictures, but not a lot about the nature of the particles.

But he mentioned it to someone else at JPL who was the source of the particles they had taken the pictures of. A few weeks later, I got a call from JPL saying they were doing a review of this program, a flat solar array project. That early project was trying to make photovoltaic solar power more cost-effective by reducing the cost of refining silicon. They said, "We're having a review of this project. Would you be willing to help us out?" "Sure. I might learn something interesting." I sort of got a sense that I had said something that really caught their attention. I took along a graduate student to the meeting, and we spent a day going through this project, and they showed where those pictures had come from. What they were trying to do was take silane, the silicon analog of methane, and decompose it to produce elemental silicon aerosol particles that they would then separate from the gas. Those could be melted to pull single crystals for making the photovoltaics. This was quite a long time ago. Better ways have been developed for many aspects of the process now. But in the process they were running, they were producing big agglomerate particles that aerodynamically would settle at very, very low velocities. They had huge aerodynamic drag because they were big, very low-mass agglomerates of lots of little, tiny spherical subunits. I was able to explain that in their process, they were nucleating material, producing really tiny particles that were forming these agglomerates, and it seemed unlikely that the process was going to yield the big particles they needed.

At the end of the day, they asked if I'd be willing to do an exploratory look at the theory behind it to see if I could explain it in more detail and perhaps find a way around it. And they gave me funding for six months. I had a new graduate student who was really energetic and good computationally. We started simulating what was going on in this process. And eventually, we could put in words exactly what was going on. As soon as we could put it into words, we said, "A-ha. They're getting greedy. They're trying to run the process fast to make it efficient." We slowed everything down; fewer particles were formed, but they grew much faster and much denser, and it looked like we could create the particles they needed. We got a bottle of silane and set up an experiment in the lab. We built a small reactor. The first time we turned on the reactor, the particles grew so big, they settled down and plugged the reactor. But we had something that worked. We inverted the reactor and were able to produce the particles. When I gave my report after six months, it was one of those really fun presentations because I spent a long time building up all the reasons this wouldn't work, then turned around and said, "If we slow it down, here's what we can do." Then, I started going into the data. And that project continued for quite a few years.
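
[The "slow it down" insight lends itself to a back-of-the-envelope calculation. A minimal sketch, with all numbers assumed for illustration rather than taken from the actual JPL process: with a fixed mass of silicon produced per unit volume of gas, the final particle size is set by how many nucleated particles share that mass.

import math

RHO_SI = 2330.0          # kg/m^3, density of solid silicon
mass_conc = 1.0e-4       # kg of silicon per m^3 of gas (assumed)

def final_diameter(number_conc):
    """Diameter (m) if mass_conc is shared equally among number_conc particles/m^3."""
    volume_per_particle = mass_conc / (RHO_SI * number_conc)
    return (6.0 * volume_per_particle / math.pi) ** (1.0 / 3.0)

# "Greedy" fast decomposition: massive nucleation, ~1e17 particles/m^3,
# giving nanometer-scale primaries that can only agglomerate.
# Slowed decomposition: ~1e10 particles/m^3, giving dense, micron-scale spheres.
for n in (1e17, 1e10):
    print(f"N = {n:.0e} /m^3 -> d = {final_diameter(n) * 1e6:.3f} um")

Seven orders of magnitude fewer particles turn nanometer-scale agglomerate building blocks into dense, micron-scale spheres that can settle out and be separated from the gas.]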

ZIERLER: A term in the early 1980s, acid fog. Is that simply the vaporized version of acid rain that people were so concerned about at that point?

FLAGAN: Acid fog was exactly the same thing as the acid rain, the droplets just hadn't gotten so big that they'd fall out. If you're dealing with low-level fog, typically, the atmosphere's thermodynamically stable. You don't get the turbulent mixing that is ultimately required for getting up to large droplet sizes. But it was the same basic chemistry.

ZIERLER: Did this research broaden out your work? In other words, I know acid rain was a nationwide concern.

FLAGAN: It was a global concern.

ZIERLER: That's right. Did this broaden it out from the initial focus on the Los Angeles Basin for you?

FLAGAN: My focus was more on the fundamentals, trying to understand what's going on in the atmosphere. John was the one who was really modeling the Los Angeles Basin. Most of my work to that time was laboratory work, not going out into the field and making field measurements, although that project with Michael, his students were going out into the field and making measurements, which really provided the data on the acid fog.

ZIERLER: Another nomenclature question. Fog water, which is a term in use in the mid-1980s, what is fog water?

FLAGAN: The water content of the fog. Fog is liquid droplets condensed typically on a small particle of some sort–the particles produced in photochemical smog, for example. But near the coast, it could also be sea spray. You get water condensing on them, so you have a droplet that is primarily water but formed on a small particle of some other composition. The water can later evaporate completely; the fog is formed on a core that won't evaporate completely.

ZIERLER: Ukraine is on my mind right now, and in the year 1986, I'm curious if the Chernobyl radioactive crisis registered for you and if your area of research was at all relevant to understanding this catastrophe.

FLAGAN: I was not directly involved in that, although I do know many people who were. In 1987, I spent a sabbatical in Finland, and there, I collaborated, or at least had many discussions, with people who were directly involved in it. They saw huge pieces of uranium falling out of the sky, seven-micron lumps of radioactive uranium that had clearly been molten, collected on filters. The Finns had set up a network to sample the aerosol on a regular basis during the era of the Soviet nuclear tests. They also had serious instruments to measure atmospheric electrical conductivity. The day the cloud from Chernobyl reached Finland, the Finnish Meteorological Institute, which ran these atmospheric conductivity sensors, started having problems. The sensors would go off-scale.

They'd send technicians out and readjust them, then they'd go off-scale again. This happened on the first spring weekend after the ice in Helsinki Harbor had cleared. The head of the Finnish Meteorological Institute had gone sailing. He came back and found his door covered with notes because he'd been totally out of touch as they tried to figure out what was going on. Of course, the cloud went on over Sweden, and the Swedes figured it out more quickly. But the Finns had the first signals coming out of it–the filter samples they had collected. A fellow with whom I did collaborate took those filters and simply laid a piece of film on them. He got a map of where all the radioactive particles were on the filters, then they would go in with a microscope and look at the particles. That's where they saw these monstrous uranium particles that had been released in that accident.

ZIERLER: Were they too big to be considered aerosols?

FLAGAN: They'd been transported along and been suspended in the air. They were aerosols. They were surprisingly big for particles of such density traveling in the air over such a long distance. They'd been lofted high into the air. They were definitely aerosols, but very unusual ones.

ZIERLER: I wonder if one of the ironies here is that to the extent that civilian nuclear energy might provide an alternative to coal combustion, the Chernobyl disaster threw cold water on that, so to speak.

FLAGAN: There are many reasons to question the extent to which traditional fission-based nuclear power can be considered a cleaner source to address global warming. Many reasons to question that. It might be an interim solution, but there's the waste problem that you have to deal with, which is a whole additional class of problems. The power plant at Chernobyl was a design that invited that accident. It turned out that in Finland, there was at least one plant that was part of Soviet payments to Finland. Finland supplied a lot of technology, and the Soviet Union paid with what they had, their technology. They have a Chernobyl-like nuclear plant. The key thing they don't have on it is controls built by anyone in the Soviet Union. They have the reactor. It has the carbon shielding that was key in the disaster at Chernobyl, the combustible shield. But they have modern controls. There are many reasons that a lot of people are very skeptical of nuclear power as a replacement for fossil fuels. Maybe some of the fusion technology will make that possible at some point in the future.

ZIERLER: For the last part of our talk today–the focus up to this point has been concerns surrounding air pollution–if we look to 1988 and James Hansen's testimony before Congress, which historians generally regard as a watershed moment in public awareness of the connections between carbon dioxide emissions and global warming, at what point in your research career did the carbon dioxide-global warming connection resonate? For example, were you aware of what Roger Revelle was doing decades earlier and how that had contributed to the forming consensus in the 1980s?

FLAGAN: I was aware of the work going on looking at CO2. I was aware this was a problem that had been studied for a century, that people had known about it long, long before it received any serious attention. The CO2 problem is really outside the realm of the kinds of tools and approaches I was taking in my research. Where the link between my work and climate change really developed was with the IPCC reports showing that a truly major part of the uncertainty in the global energy budget was driven not by CO2, which is a fairly stable background, but by the aerosol. The aerosol is not the biggest forcer; the biggest forcer is the CO2. The biggest greenhouse gas is water, but water cycles in response to the background temperature distribution on the planet, and that's really set by the CO2 and the other greenhouse gases. That's a key part. But the aerosol scatters sunlight back to space, and in doing so, it provides a cooling signal that competes with the greenhouse gas warming. That uncertainty created huge questions as to how well we could predict what was going on; the reason we didn't see the warming so clearly in the temperature changes until much later than predicted was that there was this cooling effect. Trying to understand that cooling effect, and why it is so large, was needed to reduce the uncertainty. That's really where I got involved actively in research on that. Before that, it was, "This is an important problem we need to look at, but what are the options for dealing with it?"

ZIERLER: Roughly where in the chronology do you make that intellectual connection? When does that happen for you?

FLAGAN: I was certainly beginning at least by the early 90s. I'd have to look back a little bit to put specific dates on it.

ZIERLER: For example, your work on Fourier transform infrared spectroscopy, you would not categorize that research as yet being connected with the carbon dioxide global warming connection?

FLAGAN: No, that was the question of whether we could take the classical Fletcher-Millikan experiment, levitating a droplet–I give it that name because Fletcher was the student who came up with the oil drop experiment, a very important scientist, first head of NACA, the predecessor of NASA–and being able to levitate a particle and observe directly its composition and any reactions occurring within that bit of matter was a very interesting problem. We had a very creative Israeli student who really pushed that experiment to success, and we were measuring very, very tiny temperature differences within the droplet as a way to do absorption spectroscopy, where the droplet we were observing was our detector. But that was focused on our ability to measure what was going on within the solution phase without any foreign container. The only interface was a liquid-gas interface.

ZIERLER: Well, on that note, we'll pick up next time in the late 1980s and the new research directions you were pursuing at that point.

[End of Recording]

ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It is Monday, March 7, 2022. Once again, I'm delighted to be back with Professor Richard Flagan. Rick, once again, it's great to be with you. Thank you for joining me again.

FLAGAN: Glad to.

ZIERLER: Today, I'd like to start with a very topical question before we go back into the historical narrative. We're all watching the unfolding horror happening in Ukraine right now. As an aerosol scientist, when you look at all of the bombing and the fires that are engulfing buildings and tanks, and when you look at the potential for radiation from nuclear power plants, what are some of the things you're seeing or concerned about?

FLAGAN: Beyond just the general insanity that's going on, you see all the fires, and you see huge amounts of smoke. That poses direct risks that are trivial by comparison to the other risks the people in the area are facing, but if you look at the Russians attacking the nuclear power plants, as far as I know, those nuclear power plants are built on the same technology as Chernobyl was with the graphite shielding. That's combustible. If that burns, that can loft huge volumes of radioactivity into the atmosphere. I know people who were involved in the investigation of Chernobyl, people in Finland who were directly downwind and received some of the first signals from Chernobyl, although they did not recognize them at the time. Going back to the era of the Soviet nuclear tests, they had atmospheric conductivity measurements, which were put in place to detect the radioactive fallout from the nuclear tests as a very simple, yet sensitive detector of that increased activity. When Chernobyl happened, those sensors started going offline. The signals just went out of bounds. This caused panic within the Finnish Meteorological Institute that was responsible for manning these stations. They sent people out to tune them, get them working again, and they went offline again before too long.

When the head of the Finnish Meteorological Institute returned from sailing, the first weekend that the ice in Helsinki Harbor was open, he found his door covered with notes telling him to call the Institute right away. They had seen the first signals from Chernobyl. This continued on, and the Swedes eventually detected it and recognized it for what it was. They found particles of uranium as large as seven microns in diameter collected in filters that had been sampling throughout that period, catching particles that had been lofted up into the sky by those fires. And this is a long way downwind. With the level of bombardment going on, what happens if those facilities are hit again, if they're hit with the kinds of munitions that are in use? Are we looking at another nuclear disaster? I know that there are other plants of that same vintage around the other Eastern Bloc countries and some of the countries the Soviet Union owed money to, and that was the way they paid them, building nuclear power plants, which were built better than the Soviet ones and actually had good controls in them. But they still have that risk of combustion of the shielding materials. There's a potential with what's going on of creating a huge environmental disaster long downwind from those facilities.

ZIERLER: Have you been involved in creating detector technologies that give us a sense of what's happening right now in Ukraine?

FLAGAN: Not directly. I've been involved in creating instruments that measure small particles in the air and have created quite a number of those. And those are used for looking at particulate emissions and particles formed in the atmosphere. But as far as the specifics of what is happening in Ukraine right at the moment, they're not really that specific. The instruments I've created give you the size distribution. Some give you an ability to collect samples for subsequent analysis. I've been involved indirectly, through collaborators, with instruments that do give you real-time chemistry. There's a company that has developed and commercialized an aerosol mass spectrometer we use extensively in our research. We flew the first of those instruments sold for use outside of Aerodyne's own laboratory. That was part of an Office of Naval Research grant we had. While the instrument was being developed under an SBIR grant, I was a liaison for ONR with that company, providing technical support in reviewing their progress as the instrument was developed and in guiding it toward the ultimate capability of being something small and light enough that we could fly it for atmospheric measurements. Those instruments are all over Europe and the world these days.

ZIERLER: It sounds like the bombardment around nuclear reactors in Ukraine right now, this is a massive global concern.

FLAGAN: Oh, yes. It's a very real concern. There's a very real threat posed by putting those facilities at risk of being disrupted in a major way.

ZIERLER: Well, let's return to the historical narrative. We're going to pick up in the late 80s, early 90s. I wonder if you can talk about some of your work generally in photo oxidation at this time, both on the instrument-building side and on the capture and analysis side.

FLAGAN: That was a period where we had been using an atmospheric chamber. At that point, the chamber we were using was a 60-cubic-meter Teflon reactor, basically shaped like a huge pillow. It was installed up on the roof of the Keck Laboratory and used ambient sunlight as the illumination for driving the photochemistry. I think we were still using that at that point. The instruments we were using were commercial instruments. They had some serious problems. The key instrument for measuring the small particles was something called an electrical aerosol analyzer, which would get a measure that was representative of the number of particles larger than a given size. It would step through sizes and take differences between two successive measurements in order to get the concentration within a given interval. It was doing this during an experiment where that entire size distribution is changing continuously with time, and often where the number concentrations were low enough that the readings were quite noisy. That noise would sometimes give you negative numbers. We don't have negative particles. We had a lot of trouble analyzing the data.

My students were doing some really creative work trying to extract best estimates of the size distribution coming out of those experiments. But the data were really inferior. I was teaching a course at that time on aerosol measurements, on environmental atmospheric measurements in general. At one point, I was teaching about the measurement of the particles. Each week, I would give one lecture talking about an instrument, sort of dissecting it, walking through how it worked, good and bad parts, helping the students to understand the tools with which they were working and to get ideas on how to take them in new dimensions. As I was preparing a lecture on an instrument I really didn't like, I started thinking about some of the other instruments we were working with, some of the mass spectrometry we were doing, trying to get the chemistry on the particles, and started to wonder–we had an instrument that was a calibration tool. It would separate particles in a narrow interval of sizes as they flowed through an electrostatic separator, a long cylinder inside a tube. Charged particles would migrate across a particle-free sheath gas as they were carried downstream in this instrument, and we would take them out at a downstream point. We would step the voltage if we wanted to measure a size distribution, and that would take 20 or 30 minutes to do a decent job of capturing the size distribution.

It was way too slow for the experiments we had. Sometimes the action was over in ten minutes. Thinking about that, I started asking the question, "What would happen if we changed its mode of operation? Instead of classifying the particles in a steady flow and steady electric field, and then stepping, what would happen if we classified them in a time-varying field and just continuously counted into time bins?" Rather than work on the lecture I was supposed to be preparing, I did the thought experiment, the zero-order approximation of the instrument, derived a model of it, and looked at what its response function would be. The simple model I'd developed pointed to a new instrument. I got the same answer that we got with the stepping mode. Some different constants, but it was basically the identical result. The next day, I gave the students a lecture on an instrument that didn't exist. I showed them how we could take this classifier, this tool we could only use for calibration because it was so slow, and turn it into an instrument that would be fast enough to make the measurements.

That was an interesting start. All it took was a computer program to translate that calibration tool into the instrument we needed. Two weeks later, we had the first version of that instrument working, and my students used it in the lab course. The student whose frustration had led me to this, and who had already defended her thesis, later asked me, "Why did you wait until after I finished to get this done?" That instrument went through quite a few variations. When I sent the paper out for review, I got a very elegant set of handwritten notes showing me a major improvement to it. The improvement was something we immediately adopted. From the notes and the initials on them, I was able to identify who the reviewer was and asked whether he would want to be a coauthor on the paper, which he declined. [Laugh] It was one of those beautiful reviews.
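
[The stepping-versus-scanning idea can be sketched quantitatively. In an idealized cylindrical electrostatic classifier, the electrical mobility transmitted at voltage V follows the standard relation Z* = Q_sh ln(r2/r1) / (2 pi L V); the geometry and flow numbers below are assumed for illustration, not those of any particular instrument.

import math

Q_SH = 3.0e-3 / 60.0                   # sheath flow: 3 L/min in m^3/s (assumed)
R1, R2, L = 0.00937, 0.01961, 0.4444   # inner/outer radii and column length, m (assumed)

def selected_mobility(volts):
    """Centroid electrical mobility (m^2 V^-1 s^-1) transmitted at voltage V."""
    return Q_SH * math.log(R2 / R1) / (2.0 * math.pi * L * volts)

def scan_voltage(t, v_min=10.0, tau=30.0):
    """Exponential ramp: equal scan time per logarithmic interval of voltage,
    and hence of mobility, instead of dwelling at discrete voltage steps."""
    return v_min * math.exp(t / tau)

# Stepping mode dwells at each voltage; scanning sweeps continuously while
# particle counts are accumulated into time bins.
for t in (0.0, 30.0, 60.0, 90.0):      # seconds into the scan
    v = scan_voltage(t)
    print(f"t = {t:5.1f} s  V = {v:9.1f} V  Z* = {selected_mobility(v):.3e}")
]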

ZIERLER: When we talk about photo oxidation, does this have both an industrial and naturally occurring context?

FLAGAN: It certainly has a major naturally occurring context, the photochemistry that we're looking at in the atmosphere. I have not looked explicitly at the industrial applications of photochemistry. I know they exist, but I don't know the details of them. I've focused much more on what's happening in the atmosphere.

ZIERLER: I'm curious if you became involved at all in the aftermath of the Exxon-Valdez disaster. Was there an atmospheric dimension to this at all?

FLAGAN: That one, I was not involved in at all. I don't recall nearly as much of a discussion of the atmospheric context as in the Gulf of Mexico disaster decades later.

ZIERLER: Deepwater Horizon.

FLAGAN: Deepwater Horizon.

ZIERLER: What might be the difference?

FLAGAN: There, you had a huge fire as well.

ZIERLER: Is that to say that generally, oil spills are contained in the water and land, they don't find their way into the atmosphere?

FLAGAN: No. Depending upon the oil, there are different amounts of volatile components. Methane is one obvious part of it. That was the fire in the Deepwater Horizon, at least the significant initial part of it. It was definitely a concern. The volatile components will evaporate, and once they evaporate into the atmosphere, they can undergo that photochemistry.

ZIERLER: I wonder if you can talk a little bit about Raman scattering techniques and how they were relevant for your research.

FLAGAN: We did some work trying to get at the basics of the chemistry going on inside particles, using a modern version of the Fletcher-Millikan oil drop experiment, which involves levitating a charged particle in an electric field. In Millikan's original experiment, it was just two flat plates. His very creative graduate student, Harvey Fletcher, found a way to electrostatically nudge a particle back to the center of his view volume if it started to drift too far out of that. That was one of the keys to the experiment that gets missed in most discussions. He was also the one who came up with the use of oil rather than water so the drop didn't evaporate. We used that to look at what was going on within a solution droplet. We could levitate a charged droplet and hold it for an extended period of time. And with that droplet levitated, we could then measure the change in mass with time as an indication of the increasing amount of material dissolved within it due to reactions of gaseous species diffusing into it.
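
[For the idealized parallel-plate version of the levitation, the balance condition is a one-line sketch: a droplet of mass $m$ and charge $q$ is stationary when $qE = mg$, with field $E = V/d$ for plate spacing $d$ and applied voltage $V$, so $m = qV/(gd)$. Tracking the balancing voltage therefore tracks the droplet's mass as dissolved material accumulates. Real electrodynamic balances add an AC trapping field, but the DC balance works the same way.]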

But it would be much nicer to be able to probe the chemistry going on within the droplet itself. We actually used two different optical techniques to probe what was happening within the particle. The first that we used was Fourier transform infrared spectroscopy, where the small absorption that was occurring would actually heat up the droplet by a very small amount, thousandths of a degree kelvin. But that would cause enough of a size shift by evaporating water that, looking at optical resonances in the angular scattering pattern, we could detect absorption. That worked, but it was very tedious and slow. We had a later student who tried using Raman spectroscopy, and he was, in fact, able to probe the absorption with that by looking at the shifts in the scattering pattern from that levitated droplet. Data analysis from those experiments was very complicated because we had to deal with all of those optical resonances within the small particle. It was not something we pursued a lot after that initial effort. But it led to a very nice thesis for Chak Chan, and he went on to use these and other methods extensively in his career when he moved back to Hong Kong after finishing with us.

ZIERLER: Have these techniques been relevant for more recent work, or are they sort of bound in a particular chronological time for your work?

FLAGAN: We've used the levitation technique at many points along the way. The first experiments were just looking at the thermodynamics of solutions. What happens if we take a solution droplet and dry it out, if we lower the relative humidity it's exposed to? It will shrink and dry, but it won't necessarily crystallize at the point where thermodynamics tells you it should, because the nucleation of a new phase provides a kinetic constraint against getting to that full equilibrium state. We could actually measure the water activity over the solution droplet going into extremely supersaturated solutions. This happens in the atmosphere, so it's directly relevant. In that drying process, you can go into that supersaturated state, keeping the droplets wet well past the point where equilibrium says they should have dried out. That occurs in the atmosphere. On a morning after a fog, you will see, as the sun and temperature rise, the fog disappears as fog, but it still seems hazy for a long time due to those droplets remaining wet. It is a factor in the haze following foggy days. It's an important part of understanding not just the atmospheric visual range, the visibility, but also, in the context of climate change, how that haze is scattering sunlight back to space. It's part of the large source of uncertainty in radiative scattering in modeling the energy balance for the planet. We've also used the levitation to look at the crystallization that occurs, to probe the kinetics of the crystallization.
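
[A minimal statement of the equilibrium involved, neglecting the curvature (Kelvin) correction for small droplets: a solution droplet is in equilibrium with the surrounding air when its water activity matches the relative humidity, $a_w \approx \mathrm{RH}/100$. Lowering the RH below the point where crystallization is thermodynamically favored leaves a metastable, supersaturated solution droplet until a crystal nucleates, which is why the haze can outlast the fog.]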

That has gone beyond looking at the atmosphere in a project that I did with Julia Kornfield. She got me interested in the question of how polymers crystallize. We looked for homogeneous nucleation in polymer solutions. This was much later; Adam Olsen finished his thesis in about 2005 or 2006. We would levitate a small polymer particle, then cycle it through increasing relative humidity, then decreasing relative humidity, and measure the humidity at which the solid would deliquesce and become a solution droplet–then we would see those optical resonances–then we would dry it. It would eventually crystallize. We were looking for a stochastic, statistical process, but we found systematic variations. What happened in one experiment depended upon what had happened in the previous experiment. We would see systematic changes as we went through a sawtooth in relative humidity. Ultimately, we had to look at the crystallization very differently than we had before and realize that we could explain this in terms of entanglement of the polymer molecules that persisted, even though the polymer was seemingly fully solvated, fully in solution.

ZIERLER: I'm curious, your collaboration with Aaron Rulison, Ablation of Silicate Particles, was that part of a larger endeavor on non-terrestrial research, or was that more of a one-off project?

FLAGAN: That was a key part of his thesis. Tom Ahrens, professor of geophysics, was looking at questions related to the extinction of the dinosaurs: a major impact spewing huge amounts of material into the upper atmosphere that darkened Earth for a long time. And from that came the question not of these huge objects coming into the atmosphere, but of smaller ones. If you have a really small piece of rock entering the atmosphere, what happens to it? That project actually came out of a term project in my aerosol physics course that I posed for a student who I knew had a strong interest in astronomy. He had built his own telescope; he was a very avid amateur astronomer studying chemical engineering. I knew he had been interested in this, and Tom had been asking me questions about it. I had him look at the theory of what happens when a small rock comes into the atmosphere. He dug through the literature and came up with major flaws in the physics of the models that were commonly used to describe how a meteorite behaves in the atmosphere.

Basically, they assumed that the drag coefficient, the proportionality from velocity to drag force for a particle coming into the atmosphere, is constant as it goes through different regimes of the atmosphere, when it actually has to vary a lot: the particle comes in at velocities of many kilometers per second and decelerates down to velocities where simple fluid mechanics describes what's going on very well. The model he came up with said that the particles had to heat up, and ultimately, they'd start to melt. Once they melt, you'd have strong shear forces, and liquid would be pulled off just due to aerodynamic drag. This called into question a number of aspects of what would happen when an object entered the atmosphere. Tom Ahrens was very interested in this. He proposed that we use a large apparatus in which he would shoot a projectile at a surface and measure what happens to the rocks that it hit. We used that as a way to inject small particles into a gas and then watch those small particles. We started out looking at the system here at Caltech. Ultimately, we used a larger facility up at NASA Ames to measure what happens when small particles of known composition enter a stagnant gas and undergo this very rapid deceleration. What temperatures they reach, how long they're visible as hot radiant objects, and the nature of the particles left behind. It led to a very different model for meteorite entry into the atmosphere than what was commonly used at the time.
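
[The deceleration itself can be illustrated with a toy calculation. A minimal sketch, assuming free-molecular drag with a constant C_D of about 2, an isothermal exponential atmosphere, and illustrative particle properties; this is not the model from the thesis work, just the governing balance m dv/dt = -(1/2) rho_air C_D A v^2. (Frictional heating, not computed here, scales with rho_air A v^3, which is why melting happens during this same deceleration.)

import math

CD = 2.0                 # free-molecular drag coefficient (assumed)
RHO_P = 3000.0           # particle density, kg/m^3 (silicate-like, assumed)
H_SCALE = 7000.0         # atmospheric scale height, m (assumed)
RHO0 = 1.2               # sea-level air density, kg/m^3

def rho_air(h):
    """Isothermal exponential atmosphere (assumed)."""
    return RHO0 * math.exp(-h / H_SCALE)

def entry(d=100e-6, v=12e3, h=120e3, angle_deg=45.0, dt=1e-3):
    """Euler integration of deceleration for a particle of diameter d (m)."""
    area = math.pi * (d / 2.0) ** 2
    mass = RHO_P * math.pi * d ** 3 / 6.0
    sin_a = math.sin(math.radians(angle_deg))
    while v > 100.0 and h > 0.0:
        drag = 0.5 * rho_air(h) * CD * area * v ** 2
        v -= (drag / mass) * dt
        h -= v * sin_a * dt
    return v, h

v_end, h_end = entry()
print(f"decelerated to {v_end:.0f} m/s by altitude {h_end / 1000:.1f} km")
]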

ZIERLER: Was JPL a factor in this research at all?

FLAGAN: Not from our involvement. They may have been involved with Tom, but we were looking at one small piece of the puzzle at that time.

ZIERLER: An overall public policy and perceptions question. If today, we can understand that there's widespread public appreciation that coal, even unique among the fossil fuels, is dirty, both in terms of its potency in creating carbon emissions and as an air pollutant, when did this shift in public perception come about, and what was some of your research that might've contributed to this shift in perception?

FLAGAN: The shift was coming about when I arrived here.

ZIERLER: That early?

FLAGAN: It had started much earlier than that. There were some major air pollution episodes, back when I was far too young to be doing any science, that really caught the world's attention. There was an air pollution episode in 1952, I believe, in London that, over the course of about a week, killed about 5,000 more people than would have died during that same period without that episode. It was a statistically significant perturbation on the death rate. That was clearly linked to the air pollution in London, where a lot of coal was burned, not just for power generation but also for space heating, so in small, very inefficient furnaces in houses. That awareness was out there from that time, and long before. The first air pollution laws date back to very early in England's development, where it was recognized that coal was a major problem. That's always been part of coal, that recognition. When I interviewed at Caltech, I was actually introduced to that question specifically with regard to some aspects of the coal combustion by a then-professor of environmental engineering science, as the program was called at the time, and chemical engineering, Sheldon Friedlander. He posed the question that, in urban centers, at that time, the aerosol was enriched with heavy metals. Why? Where does it come from? Why are those particles so abundant in the atmosphere, well beyond the natural abundance in the area? The first project I started when I came to Caltech was looking at what happens to mineral matter in coal when coal is burned. Coal typically contains ten or more percent of mineral matter by mass. We built a laboratory combustor to explore that, where we would burn on the order of a kilogram of coal per hour. From the point of view of coal combustion, very small. From the point of view of typical laboratory experiments, rather large.

But it allowed us to look at the nature of the particle generation. Even before I did that, I'd done some modeling. The first thing I did on that problem was to just read vast amounts of literature, to the point that I wrote a review paper on it. In doing that, I found strong hints that mineral matter was being vaporized during combustion. We built an experiment to test that. With that, we measured the size distribution and got size distributions that were close to what I had predicted with a very simplistic model at the time. But we also used some samplers that I had developed to collect size-fractionated samples of particles down to quite small sizes. We found ways of using the tools of the geologists to look at the distribution of the mineral content among those different size fractions. That was my first venture into the area. The broader recognition of the problem came out of a number of environmental episodes. Those extreme air pollution events led to solutions: reducing the use of raw coal for space heating in favor of somewhat cleaner fuels, and building tall smokestacks to make sure the air pollution isn't released right at ground level. That became a key part of the acid rain problem, exporting the large amounts of sulfur and particulate matter high enough into the atmosphere that they could be transported long distances before coming down in downwind locations.

There's been a continuous evolution recognizing the contributions of coal to air pollution, air quality, and climate change. Experiments we're doing now are showing that sulfur released by coal combustion has drastically altered how particles form in the atmosphere. In the modern world, the clouds we see would form on seeds that are the smaller particles in the atmosphere. Those seeds are different. Clouds are formed in different ways, perturbed by man's action through the burning of coal and other fossil fuels. This is in addition to the effects of the greenhouse gases, which is the driving force behind the general warming of climate. Part of the reason this has been slow to be identified has been that the particles that have been perturbing the atmosphere led to a cooling effect, reflecting sunlight back to space and counteracting some of the global warming that's been building over time.

ZIERLER: Given how far back these problems go, are you surprised just how prevalent burning coal remains in the world economy?

FLAGAN: Given human nature, unfortunately, no. I wish I had grounds to be. [Laugh] If you're trying to build large industrial systems, if you want to process huge amounts of steel, burning coal is a convenient way to do it. Plus, you want to add carbon to the iron to make quality steel. For the operations of that industry, coal is fairly natural. For building power plants, it's a very dense fuel; it has a lot of energy-release capability. It became recognized as an important fuel very early on. Early steam engines used wood, but coal turned out to be a more convenient source where it was available. The ease of use of these resources is very, very tempting, and people succumbed to that temptation. They did not recognize the dangers that were being created in the process.

ZIERLER: Just a nomenclature question, what are aerosol agglomerates?

FLAGAN: When you start with very, very small particles–a good example would be diesel soot. Diesel soot is produced by gas-phase reactions in fuel-rich regions of a flame that start building up bigger hydrocarbons from smaller ones. They form aromatic rings and ultimately form the tiny particles that, as we see, are black. Those little, tiny particles start out as molecules and grow up to a couple of tens of nanometers in size. But they're produced in huge number concentrations, and they undergo Brownian motion in the air and, on occasion, run into one another. They'll stick together, initially probably by van der Waals forces, and as they're forming, additional material is coming out of the gas phase, depositing on their surfaces, and growing them up to larger and larger sizes. But the rate at which these little spheroids come together and stick is so great that the material coming from the gas phase can't completely coat them and make a nice uniform sphere. They form aggregates of particles that get–ramified is a common term. It's not like a cluster of grapes. They're more string-like, fractal-like aggregates that, aerodynamically, will look quite small, but physically, will become quite large.
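
[The string-like structure has a standard quantitative description: the number $N$ of primary spherules of radius $a$ in an aggregate of radius of gyration $R_g$ scales as $N = k_f (R_g/a)^{D_f}$, with a prefactor $k_f$ of order one. For the cluster-cluster aggregation that produces soot-like aggregates, the fractal dimension $D_f$ is roughly 1.8, far below the value of 3 for a dense sphere, which is why such aggregates are physically large but behave aerodynamically like much smaller particles.]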

ZIERLER: I'd like to talk about your work with journals and open access. But as context to that, tell me about your service on the Library Committee at Caltech. How far back does that go?

FLAGAN: My first introduction to the Library Committee came when I arrived at Caltech. At that time in the Caltech library history, Caltech had a whole bunch of small libraries. One large central library, but a large number of satellite libraries. There was one in Environmental Engineering, and I was constantly in there, digging through the literature.

ZIERLER: This is pre-internet, of course, when people actually went to the library to find things.

FLAGAN: Precisely. The first challenge I was given that I took on was this review that I did on particle formation and coal combustion, which had me going to libraries all over campus to find the different kinds of information to pull all the pieces together. I would sometimes visit 10 or 11 libraries in a day. I was asked if I would be willing to serve as Environmental Engineering's representative for libraries, which I agreed to. From that, I was on the Library Committee for many years. Caltech libraries were a very special place. They were scattered, and it wasn't very efficient to go to all of them. We didn't have all the journals. Caltech is a small place, and when you start going into fields that have not been represented at a small institution, you discover gaps in what they have. One of the important things that Caltech had done that was very creative was to set up a document delivery system, where if you found a reference you needed access to, there was a little slip in the library you could fill out. Then, a few days or a week later, you would get the xerox of the paper in the campus mail. That was very convenient, done at so many cents per page. We used it a lot.

We went through a number of transitions in the library. One was that the use of this document delivery system was becoming so large that the university librarian concluded it was not practical to keep counting the number of pages being scanned, and he proposed at one point that we should just have a fixed charge per copy. It was set at a reasonable amount so that people wouldn't object to it too much, and it continued to work well. But following the period of very high inflation in the late 70s, the commercial publishers learned a very important lesson, which they've been profiting from to this day: the demand for scientific journals is very inelastic. If they raise the price, the number of subscriptions does not go down easily. What's going on? It's the university library that pays the price for the subscription, and it's the scientist, who's not paying that price directly, who demands that the library have the journal. Journal publishers were escalating prices at many times the inflation rate, sometimes 20 or 30% per year. Huge escalations in the subscription price.

Because they had a captive audience. The librarians were kept from canceling the journals because the faculty demanded access to the scholarship in their field. The proposal made by the then-university librarian dealt with the numbers of subscriptions. "We're going to have to cancel subscriptions. The cost is going far beyond what we can pay. How do we choose what to cancel?" You look at the journals, you see which journals are not being used very frequently yet cost a lot. Those are the ones you cancel. Then, what do you do with the document delivery? The first proposal from the then-university librarian was, "Well, if we cancel the subscription, we'll have to charge the faculty for the inter-library loan to get access to those articles." That original document delivery was even for journals in the Caltech collection. You could request an article without having to physically go to the library, and you'd just get it in a day or two. As you start canceling, you say, "OK, well, now we've got to go outside to get it, so the researchers should pay for it out of their research grants."

My response to that was, "If we do that, nobody's going to willingly let a journal be canceled. If we want to deal with the budgetary question, we have to think about the faculty and the research that's going on here, and make it practical for people to allow journals to be canceled." Ultimately, that was the driving force behind the switch to a fixed cost per article. It was the same whether we had the article on campus or not. Again, this is pre-internet. If you wanted to physically make your own copy, you could walk to the library and not pay that fee; you'd just pay the per-page cost of copying. If you wanted to have someone else do it and deliver it to you, you paid for it. This extended the system to articles from outside, all at a constant rate. Even so, the library couldn't fully meet its budget because the escalation was still going on. But we were showing faculty the prices; they'd see what a journal cost and how often they used it, and they'd say, "You can cancel this journal," because it was practical for them to do it. I was on the Library Committee for a long time, through this extended period where we were having to cancel large numbers of journals every single year.

ZIERLER: Did those discussions get to the existential level where there may have been discussions about whether Caltech even needed to maintain a library?

FLAGAN: The first question came with books because the journal costs were cutting into the purchase of books. We had one provost who argued, "Caltech faculty don't read books, they write them." That didn't address the question of who actually reads books. It's mostly students, and no, they don't write the books. But that was the opening salvo in the question of whether we need a library. We clearly needed access to the literature. Access to the main journals for the fields where large numbers of people regularly use the same key journals was not going to change easily. For the smaller disciplines, the effect was much more direct when the canceling started. Physical Review, the Journal of the American Chemical Society, Cell: those, you're not going to cancel. Someone's got to bring those on campus and make them available. But the journals used by a small number of research groups were prime targets. As the internet and electronic forms of access to journals developed, there came increasing questions as to how many journals we needed to have access to.

ZIERLER: What was the outcome of that discussion?

FLAGAN: At one point, Steve Koonin, then provost, said, "We should have a true accounting of who uses what journals and how much." He took the approach of pressuring the library to do a survey of every faculty member: "What journals have you published in in the last five years, and what journals have you cited?" That would define the set of journals we'd need to consider at all. First, that's a huge burden. Maybe not for people who work in a relatively well-defined, well-constrained discipline, but for people who cross boundaries, it became a huge exercise. There was pressure to rely on electronic access as opposed to having a physical library; that was one outcome. That survey brought attention to these questions of what journals we needed. There were faculty who listed hundreds of journals they used. The answer was not so easy. This was also a time when there was growing interest in developing the internet into a mode of communication. In physics, which is where our provost at the time came from, there was arXiv, which had been set up at Los Alamos as a preprint server for the physics community. There was a way to get access to the articles even without subscriptions. Why can't we have that in all fields? Why do we need these journals?

The library was looking at modes of using preprint servers, of providing access to material that went beyond the journals, things like digitizing the Caltech theses to make those available. Then, there were a few ventures aimed at making fully electronic open-access journals. There was one headed out of economics that got a lot of attention at the time. In discussing all of this, reading what was in the literature, and holding discussions with publishers who were not responsive to our pleas that their escalation of journal prices was not sustainable, the question of how we should respond to this journal crisis really came to the fore. In one of many meetings that the then-university librarian, Anne Buck, and I had with Steve Koonin, the provost, where he was arguing for huge cuts in the library budget, huge cuts in the subscription budget, the idea came out: "Let's get together people in the community. Not the publishers, but the users. The universities, the libraries, maybe some society publishers that are well-behaved in the pricing game. And let's have a serious discussion about what would be the best way to use the electronic medium." Buck and I organized a conference, which occurred, I believe, in 1997. It was the Conference on Scholarly Communication.

ZIERLER: This was an international conference. This went well beyond Pasadena.

FLAGAN: It went well beyond Pasadena. It was small, but we invited participants from universities in the US and in Europe. It did not cover the whole world. The conference was to be a workshop to come up with some proposals as to how we might better use the internet. At this Conference on Scholarly Communication, we had a total of four speakers. Each of them presented a vantage point both on publishing and the need for publishing. We had quite a number of library people from around the country, some from Europe. We had a number of academics. And we had provosts from seven or eight major research universities in the US. The provosts were a very important group. They're the people who pay the bills. We wanted to try to get the whole spectrum of viewpoints for how publication fits into the academic setting. I'd have to dig back to remember all of the speakers. The most eloquent one on the question of why we need to develop open-access journals was a professor of theology from the UK.

His specialty was Christianity in the second century, a field in which there was one journal, a very glossy sort of journal that wealthy people might subscribe to and have on their coffee table. In that field, it took about seven years from the time an article was accepted until it was published. And he was a very eloquent spokesman for why there was a need for a different mode of publication, one where you weren't constrained in the same way by producing a showy document to satisfy the coffee-table mode of publication. We had one presentation each day. The rest of the time was spent breaking up into groups, discussing what the problem is, and coming up with proposals for solutions. Most of the panels we set up had representatives from all the different groups at the conference, but one was distinct: it was all the provosts, the people who pay the bills. For a week, we had serious discussions. The last day, each of the panels presented their proposal for how we would do things, and we synthesized that into an overall proposal. That proposal is published on the Caltech library website; it's entitled Scholars Forum. Who benefits most from a journal article? The primary beneficiary is the author. The secondary beneficiary is the university.

ZIERLER: Because it gains some of the prestige that's garnered on the author.

FLAGAN: That is correct. The university has a portfolio of contributors who build its reputation for what it does in research, in scholarship in general.

ZIERLER: Maybe it's a naive question, but what about the readers? Wouldn't they stand to be the primary beneficiary of a journal article?

FLAGAN: What is the author buying with the labor they've devoted to the object that goes to the journal?

ZIERLER: The educational betterment of the readers.

FLAGAN: Eyeballs. What the author wants most is readers. They want lots and lots of people to read their work. They select where they publish according to who the readers are, how many there will be, and whether it's getting to the right audience. If you look, then, at who's paying the bills, it's the library. The subscription rates that libraries pay are far higher than the subscription rates an individual pays. Many people don't have access to journals. I look around the country, around the world, and there are lots of small companies that don't have access to the journals unless they find some back door in through a library. A proposal was made that we should have journals, and they should be open-access. Who should own the copyright? The author. What does the journal need? The journal needs a perpetual license to publish that piece of work. When the author submits it, they grant a license to publish. That license should not be revocable; you'd like that material to remain available long term. But the author should also be able to post it on their own website. The question of how you process this is very important. Why do authors publish in a journal in the first place? Because that venue is recognized by the readers as being significant for the author's particular research. But what is the value added by the journal? Is it the glossy presentation? Maybe. The real value is in the vetting of the article. It's in the certification that this piece of work has met the standards of the field.

ZIERLER: You're saying peer review is a unique aspect of journal publication.

FLAGAN: Peer review is the central aspect of scholarly journals, that there is a vetting, a certification function that the journals provide, which is the biggest value of the journal. How is that provided? There's an editor for the journal who, for large journals, will typically be paid. For large journals like Nature or Science, they're paid quite well. Those are professional editors. For many scholarly journals, the editors, if they're being paid, are being paid a pittance compared to what their time is worth and the time they spend on that journal. I've edited a journal and received zero financial compensation for my efforts. I received compensation in terms of recognition from my colleagues, and in terms of things I learned in the process. But I was not paid for that. We have an imbalance. If we create open-access journals, how do we pay for it? How much do we pay for it? The real value is in the reviewing, which is done almost entirely by volunteers. Reviewers are not paid for their efforts.

There have been many surveys over the years showing that scholarly publication is one of the most profitable, highest return-on-investment industries out there. I have seen articles, in one case by the then-president of Elsevier, saying, "Let's do an analysis of what the value of an article is, going through all the different costs: the research, the equipment, the graduate students, the post-docs, the faculty time, all that went into writing it." There's the function of the chief editor, the person who's sending it out to reviewers and then passing judgment on the reviews. There's a technical editing component, which is a real cost but varies greatly from author to author and article to article, depending upon how well the person writes and how much attention they pay to their writing for the particular submission. One might assume that the cost of that technical editing might reasonably be a cost for the author: if they spent more time getting it into the proper format and critically evaluating their writing, there would be less technical editing required. One could think through all of the different processes involved. One of the proposals that came out was that we ought to use things like arXiv, the internet repositories for articles, have authors submit there, and maybe have editors and editorial boards select articles from there that they then want to put through the additional effort of review. But you could have public reviews submitted.

One could come up with very different models than what we have now, which is electronic versions of print journals, many of which are designed to ensure that enough content stays outside open access that libraries still have to buy a subscription to have access to everything. Because once it's available open access, anyone can get to it. There are many different ways we can look at it. A lot of what's done right now is commercial publishers double-dipping. They charge high subscription fees, which the libraries are paying, and they charge fees to authors who want to make their articles available open access so everybody can see them. They get paid for the same article twice. This is a really critical issue for academia, now and in the future. We did try to get this discussed at a higher level. We tried to get it discussed by associations of universities, associations of university presidents. Somehow, it always took a lower standing than it deserves.

ZIERLER: Last question for today. 25 years out, when you started thinking about these things, in light of the ubiquity of the internet, and maybe even specifically social media and the way that online analytics give a much more precise sense of readership or eyeballs, how have these issues changed since the late 1990s, and how have they remained the same despite these technological progressions?

FLAGAN: There were some major ventures in creating open-access journals, PLOS, the Public Library of Science, being a prime one. That started out with a view that they were going to compete head-on with Science and Nature; they were going to have feature articles and other expensive components that require high-level professional labor to make them work. They set a price. The commercial publishers are very savvy, and they took the very high price required to put all of those very expensive contributions into the journal and said, "Well, this is what it costs for a totally open system. No profit. We'll take that as a floor and raise our prices above it." The commercial publishers have monetized open access quite effectively and worked very hard to make the case that they are required. My sense is, we don't need them anymore; societies can handle the challenge of electronic publication. The machinery exists. There are open-access versions of very effective software to run a journal. If you wanted to set one up, you could do a quite credible job in a few days with software that you do not have to purchase. You might find it advisable to get someone involved who's used it before to assist you in getting started, but you don't have to do as PLOS did and hire a big software company to build it, then leave ownership of the software in the hands of that company so that they can hold it over you in the future. That original software didn't work very well, and they replaced it, so I don't know exactly what the status is now. I think we've come a long way, but we haven't come a long way in terms of true open access.

ZIERLER: What would true open access look like in a way that would be economically viable for all interested parties?

FLAGAN: First, you have to define what a fair return on investment is, and you have to constrain the return on investment to a fair level. A 40% return on investment is not a fair level. I think there's a real place for professional societies. A lot of the journals come out of professional societies. Right now, there are many supposedly open-access journals being started that are predatory journals, companies set up to extract money from the community, not to provide a valuable service. Societies could set up journals. There are costs, and there are services required that would have to be paid for. The fees that are charged for open access, author publication charges, as they're called, are needed and have to cover the real cost. But again, within reason. They don't have to cover luxurious office headquarters for a large company in some expensive city. You might see journals that look very much like they do today. But one could also ask the question, can we do something with the preprints? I think COVID has shown us very clearly the value of these preprint archives.

arXiv has received huge attention over the last two years because that's where a lot of work was being published long before it was considered for publication in regular journals. Are there ways to use that to provide the vetting and certification that's needed in a way that guarantees you're getting good reviews? There are journals out there that I think do a very good job of at least one form of that. There's one set of journals I publish in fairly frequently, run out of the European Geosciences Union, that's open-access. Instead of having a few editors, each of whom is handling hundreds of papers, they have a large number of editors, each of whom handles a small number a year. They will ask for a couple of anonymous reviews, where someone takes on the challenge of providing a critical evaluation of a manuscript, but they open up the review process so that others in the community can provide a signed review, and those reviews become part of the archive along with the author's responses to them. And they have a discussions journal, which is where the first submissions go, and they have the regular journal, which is where those that have been accepted reside. But both are continuously available. Things that have gone into discussions are available, as on the preprint servers, even if they're not accepted. The final version is available, but the original submitted versions are as well. There are very different models that can come out. I think we need to be trying more of that.

ZIERLER: Obviously, there's a generational question here because, for better or worse, for graduate students or post-docs looking to make a name for themselves in an academic setting, journal publication is still the gold standard for hiring decisions, for tenure considerations. This is really an issue of ongoing major import.

FLAGAN: That's where having a framework, as the EGU has done with a number of different journals, that can gain credibility for providing that critical certification role is very important. There are also a huge number of predatory journals out there that provide none of that, where they allow you to have a line on your CV but nothing more. The articles are available. Are they any good? How many authors will bother to read something that's published in a place that's known as a very suspect forum? These are issues that I think are going to remain. And this is part of the challenge that the internet age provides us.

ZIERLER: On that note, we'll pick up for next time with more work in the 1990s, then we'll move into the new century.

[End of Recording]

ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It is Wednesday, March 23, 2022. I'm delighted to be back with Professor Richard Flagan. Rick, it's great to be with you. Thanks for joining me again.

FLAGAN: Glad to.

ZIERLER: Today, I want to start back on a technical topic. In the late 1990s and early 2000s, then again more recently, you were involved in research on radial differential mobility analysis. First of all, what is the device that does that analysis?

FLAGAN: The device is called a differential mobility analyzer. This is a device for measuring small particles in a gas. We take the aerosol, the particles in a gas, and we expose them to ionizing radiation that causes the particles to develop a steady-state charge distribution. We get it to a charged state where, ideally, we know what fraction of particles of each size are charged, and that fraction is typically rather small. The particles that we're trying to get to are particles that carry one elementary charge. If we then apply an electric field, we're applying a known force to the particle, and it will migrate at a velocity that's determined by the charge on the particle, the aerodynamic drag on the particle, and the electric field we've imposed. This technique has been used for measuring aerosol particles and gas ions for over a century. I once wrote a history of these techniques, and the earliest instrument that looks like the modern instruments in its basic structure was published in 1918. It's been around for a long time.
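A minimal sketch, in Python, of the mobility relationship described here: a particle carrying charge migrates at velocity v = ZE, where the electrical mobility Z comes from balancing the electric force against Stokes drag with a slip correction. The constants and function names below are standard textbook values chosen for illustration, not anything from an actual instrument's code.

```python
import math

E_CHARGE = 1.602e-19  # elementary charge, C
MU_AIR = 1.81e-5      # air viscosity near 20 C, Pa*s
MFP_AIR = 68e-9       # mean free path of air near 1 atm, m

def slip_correction(d):
    """Cunningham slip correction for particle diameter d (m)."""
    kn = 2.0 * MFP_AIR / d  # Knudsen number
    return 1.0 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def electrical_mobility(d, n_charges=1):
    """Mobility Z (m^2 V^-1 s^-1): Z = n*e*Cc / (3*pi*mu*d)."""
    return (n_charges * E_CHARGE * slip_correction(d)
            / (3.0 * math.pi * MU_AIR * d))

# Example: a singly charged 10 nm particle in a 10 kV/m field.
d = 10e-9
Z = electrical_mobility(d)
print(f"Z = {Z:.3e} m^2/V/s, migration velocity = {Z * 1e4:.3e} m/s")
```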

When I started at Caltech, there was a commercial version of this instrument that had just come onto the market. It was mainly a calibration tool. You could take an aerosol that contained a wide range of sizes and introduce the particles along one wall of a system with two electrodes, migrate the particles across a clean, particle-free gas, and at a downstream point, extract the particles that migrated within a narrow range of mobilities, migration velocities. This could give us a narrow slice of the distribution where we could count the particles, and by counting them and stepping through the voltage, we could get the size distribution. When I started, that was the calibration tool. It was too slow to make practical measurements. We had another, much worse instrument that was the basis for making size distribution measurements, which caused my students no end of frustration in trying to unravel noisy data from a far-less-than-ideal instrument. I'll come to the radial part in a bit, but let me give the background first. We had an older instrument that we were using for all our experiments that could make measurements on a timescale of eight to twenty minutes.

We had this beautiful calibration tool, and it was actually in the late 80s that I was teaching an atmospheric measurements course, and each week, I would give one lecture where I would talk about an instrument, dissect how it worked, what was good about it, what wasn't. I came to the point where I was going to talk about this instrument I really did not like. As I was supposed to be planning my lecture, my mind wandered, and I started asking the question, "What would happen if we took this calibration tool and used it to make size distribution measurements?" Some people had tried it. They'd set the voltage, wait until the particles transited through the entire classification column, then count particles, change the voltage, wait several seconds until it reached a new steady state, count, and step through size space. To do that and capture the full size distribution would take half an hour to an hour if you were trying to do a careful job. It was way too slow for our experiments. Sometimes, in our chamber studies, the action was over in five minutes. I asked the question, "What would happen if we classified the particles in a time-varying electric field?"

ZIERLER: Why is that the question?

FLAGAN: The question was how we could make the measurements fast enough to be useful in our experiments. We were spending 90 to 99% of our time twiddling our thumbs and waiting, and 1 to 10% of our time counting particles. If we could classify the particles in a time-varying electric field, we might be able to count continuously and still scan through the entire voltage range we needed to cover the different particle sizes. The smallest particles don't have much aerodynamic drag and migrate really fast. The largest particles have a lot of drag and migrate very slowly; you have to go to a higher field to get the same velocity. If we could scan the voltage, we could make the measurements much faster, and we might be able to use this higher-quality tool to do the measurements. I wrote down a very, very simple model of the instrument, analyzed it, and found that the response function of the instrument with the scanning voltage took the same basic form as that with the stepping. I just had to use a different way to account for the voltage change: I had to average the voltage over the particle's migration.
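A sketch of why that averaging works out so cleanly, assuming an exponential voltage ramp (the natural choice, since selected mobility varies inversely with voltage); the numbers are illustrative only. For V(t) = V0*exp(t/tau), the average voltage a particle sees over its transit is just the entry voltage times a constant factor, which is why the scanning response function keeps the same form as the stepped one.

```python
import math

def avg_voltage_exponential_ramp(V0, tau, t0, t_f):
    """Time-average of V(t) = V0*exp(t/tau) over [t0, t0 + t_f].

    Integrating the exponential gives:
    V_avg = V(t0) * tau * (exp(t_f/tau) - 1) / t_f,
    i.e., entry voltage times a factor independent of t0.
    """
    V_entry = V0 * math.exp(t0 / tau)
    return V_entry * tau * (math.exp(t_f / tau) - 1.0) / t_f

# Example: 60 s ramp time constant, 0.5 s particle transit time.
# The correction factor is close to 1 whenever t_f << tau.
print(avg_voltage_exponential_ramp(V0=10.0, tau=60.0, t0=30.0, t_f=0.5))
```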

The next day, I gave my class a lecture on an instrument that did not exist. Because all we were doing was changing a mode of operation, and because in the mid-80s we had the ability to control instruments with a computer, two weeks later, we had a zero-order version of that instrument working, and the students used it in the laboratory. That was the development of what we called the Scanning Electrical Mobility Spectrometer. That allowed us to capture the entire size distribution not in half an hour, not in five minutes. The first time we applied it in an experiment, we did scans much too fast, as we later learned. We did it in about 30 seconds. We could get down from half an hour to half a minute. And we could capture dynamics occurring on a timescale of order five to ten minutes, and that opened up whole new avenues for research because now we could look at dynamics in ways that had not been possible before. We got cleaner signals than we had with the older instrument, and we were able to get measurements faster.

Now, that first version used the commercial instrument as the classification column, which was the coaxial cylinder classifier, where the particles were introduced through a slot around the edge at the top of the larger cylinder and extracted through a slot around the perimeter of the inner, smaller cylinder, with the voltage applied between them. We had the central rod at high voltage, and the outer part was grounded because we were going to voltages like ten kilovolts, which was a good way to have things set up. We could sweep the voltage very rapidly, and we could scan through particle size space much faster than had been possible before. The grant that supported the student who did the first implementation had some stipulations in it that said basically any inventions would be given freely to the American people. It wasn't possible to patent it because it wasn't possible to recover the cost of the patent, because we had to give it away. We didn't patent it. It took five years to convince a company to produce it commercially because they had to figure out some way to protect their investment. But it is now the primary mode for measuring small particles worldwide. That was done with the cylindrical column.

As we started using that tool, we needed a lot more of them, and they were very expensive. I designed one myself and had a machinist make it. After it was made, I sat down with the machinist and started going through the costs of all its different components. As I went through that exercise, I realized where I had to simplify the instrument to make it practical, so we could have the number of instruments we were suddenly finding we needed. Once we knew the capability, we had a lot more things we could do with it. Not long after I had done that, I was at a conference, and I heard a talk by someone doing something very different, but during the talk, I realized that he had just given me a clue to a different way to lay out the instrument. We could do the same job with a classifier that consisted of two circular plates. We would introduce the aerosol through a slot in one plate and take the classified sample out through the central port in the other one. We built a radial flow differential mobility analyzer, which is the instrument you started out with on this question.

ZIERLER: What does the term radial convey in this context?

FLAGAN: The flow was coming in at a large radius, flowing radially inward, and going out on the central axis. We had ports at the center of both of the electrodes, where we could take out excess flow and the classified sample flow.
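A hedged sketch of the mobility selected by the two classifier geometries discussed here, in the standard forms found in the DMA literature (the Knutson-Whitby result for the coaxial cylinder; the radial form follows from the same force balance between parallel plates). All dimensions and flows below are made-up example values, not those of any particular instrument.

```python
import math

def z_star_cylindrical(Q_sh, R1, R2, L, V):
    """Selected mobility for a coaxial-cylinder DMA of length L (m),
    inner radius R1, outer radius R2, sheath flow Q_sh (m^3/s),
    classification voltage V."""
    return Q_sh * math.log(R2 / R1) / (2.0 * math.pi * L * V)

def z_star_radial(Q_sh, R1, R2, b, V):
    """Selected mobility for a radial DMA: flow enters at outer radius
    R2 and exits at inner radius R1 while particles cross the gap b
    between the two plates."""
    return Q_sh * b / (math.pi * (R2**2 - R1**2) * V)

Q_sh = 10.0 / 60000.0  # 10 L/min sheath flow, converted to m^3/s
print(z_star_cylindrical(Q_sh, R1=0.0094, R2=0.0196, L=0.44, V=3000.0))
print(z_star_radial(Q_sh, R1=0.0024, R2=0.0504, b=0.01, V=3000.0))
```

Scanning the voltage V sweeps the selected mobility, and hence the selected particle size, which is the operating principle behind the scanning measurement described above.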

ZIERLER: The burst of activity in this area of research 20, 25 years ago that you picked up more recently, what are the connections there, and what might be different about your most recent work in this area?

FLAGAN: As we pushed the limits with these differential mobility analyzers, we were able to do quite a wide range of experiments probing different aspects, primarily of the atmospheric aerosol, although the instrument was used in a number of different venues. We were using the differential mobility analyzers, for example, to measure the aerosol yield of different hydrocarbons in photochemical oxidation, producing secondary organic aerosol, smog, and it became a major tool. The starting point for the next burst of developments really came with a student who was working with Professor Costas Giapis. The student was Nick Brunelli. He was making some very tiny nanoparticles. We had used our radial mobility analyzer to classify very small particles down to a few nanometers in size. But Nick was using a little microplasma reactor that was producing particles smaller than a couple of nanometers. I was on his thesis committee, and he would spend huge amounts of time doing electron microscopy, with long delays between when he would run an experiment and when he would be able to get any data off what he had produced.

In one of his committee meetings, as he was bemoaning the slow progress due to the electron microscopy, I asked him, "Why don't you try more direct measurements? We have this technique of differential mobility analysis that can allow us to measure small particles." And he reminded me that the particles he had were smaller than any of the classifiers we had would be able to measure. He had talked with my students and concluded that the instruments couldn't do it. I said, "OK, this is true. But I could show you how to design one for the particles that you want." He and I worked together, I gave him the guidance, and he designed an instrument to target particles down to about one nanometer. Nick was a very practical student, he'd grown up on a farm, he knew how to build things. He immediately went into the shop, got on a lathe, and turned the parts to make his own instrument. In a matter of a couple months, he had a very tiny radial differential mobility analyzer that could go down to one nanometer, which then allowed him to greatly accelerate his measurements and really became the key component of his thesis.

ZIERLER: Is one nanometer some kind of a benchmark that suggests this is where the field has real value?

FLAGAN: One nanometer was about the limit of what he was seeing with the particles he was generating. He was looking with the electron microscope, he was seeing things down to that size. That was the target.

ZIERLER: What's the future of this device? How do you see it being used, given its long history, going into the future?

FLAGAN: In studying the atmospheric aerosol, one of the huge challenges we face is understanding the particles that are formed in the atmosphere from vapor-phase precursors, a process called homogeneous nucleation. This starts from molecules, with the molecules gradually accumulating enough material that you grow particles up to a sufficient size that they become thermodynamically stable and continue to grow. Half the particles in the global atmosphere are formed by homogeneous nucleation. The data on such particles have been rather limited. You can use mass spectrometers; those will get you up to something on the order of one and a half, maybe approaching two nanometers, but you can't go much beyond that. The radial DMA that we had previously built took us down to about five nanometers in the original version. There were some other instruments that got down into that same range, perhaps down to two to three nanometers. As we got down to one nanometer, we were getting below the thermodynamically stable size, so we were able to start looking at particles as they were forming, providing a bridge between what can be measured with mass spectrometry and what could be measured with the other instruments available at the time. Looking at new particle formation is one of the key things we're focusing on with this.
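As a rough illustration of why sizes near one nanometer matter here, the standard Kelvin relation gives the diameter above which a droplet is thermodynamically stable at a given saturation ratio. This is a hedged sketch; the water properties and saturation ratio are example values only, and real atmospheric nucleation involves multicomponent chemistry this simple form ignores.

```python
import math

def kelvin_diameter(sigma, M, rho, T, S):
    """Kelvin diameter (m): d* = 4*sigma*M / (rho*R*T*ln(S)).

    sigma: surface tension (N/m), M: molar mass (kg/mol),
    rho: liquid density (kg/m^3), T: temperature (K),
    S: saturation ratio (> 1).
    """
    R = 8.314  # gas constant, J mol^-1 K^-1
    return 4.0 * sigma * M / (rho * R * T * math.log(S))

# Water at 293 K and a saturation ratio of 2 gives roughly 3 nm;
# freshly nucleated particles start out below such sizes, which is
# why instruments reaching ~1 nm can watch them form.
print(kelvin_diameter(sigma=0.072, M=0.018, rho=1000.0, T=293.0, S=2.0))
```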

ZIERLER: Back to the administrative side. In 1997, more briefly, then recently for a longer term, you served as executive officer for chemical engineering. The unique way that Caltech organizes itself, where we have division chairs, not deans, executive officers, not department chairs, I still have to translate this to the other systems I'm more familiar with. In your experience, how is the executive officer like and not like a department chair or department head at other schools?

FLAGAN: In terms of handling things like teaching and dealing with a lot of the local problems, the executive officer is the front line, but without the direct resources to effect change. Budgetary authority at the time I was doing this was largely held by the division chair. In some parts of the Institute, some resources have flowed down to the executive officer, but that's still relatively minor. Take issues like hiring: in other universities, a department chair is given authority to handle hiring within that department. It still goes through the college, through the dean, but the action is started there, and salary and many space issues are in the hands of the department chair. The executive officer here deals with a lot of the detail but without the real authority or resources to effect change directly. That has to go through the division.

ZIERLER: It's the Division of Chemistry and Chemical Engineering. What does the fact that there's an executive officer for Chemical Engineering, what does that tell us about the administrative boundaries between chemistry and chemical engineering within the division?

FLAGAN: The executive officer is dealing with the specific academic programs within the division. Chemical engineering is quite distinct from chemistry. There's also BMB, which is another option, as they're called, within the division. We have Chemistry, Biochemistry/Molecular Biology, and Chemical Engineering as the three options. Chemistry and BMB are not so distinct from the overall division. Chemical Engineering is quite distinct. We have a totally separate curriculum. At the time, we had to deal with ABET accreditation of engineering programs, which is a totally different accreditation process, much more rigorous than what Chemistry goes through. The role is organized around the different academic programs. Accreditation has had, and at that point had, a strong impact on what we could teach in Chemical Engineering and how we had to structure our teaching in order to maintain it. That was a huge effort requiring development of massive amounts of documentation every few years to convince outside panels that we were doing a proper job of educating our students. I can go into more of that, and some of the things we've done that I've led the way on, if you want.

ZIERLER: Before we get to the students and the pedagogy side of things, there's a certain symmetry between the Division of Biology and Biological Engineering and Chemistry and Chemical Engineering. But of course, for Chemistry and Chemical Engineering, that divisional merger in the disciplines has a much longer history than BBE. If I understand, one of the things with the more recent merger of Biology and Biological Engineering was that you needed the biotechnology revolution, the kinds of devices that Lee Hood was building, to bring biology into a more technological realm. What does that tell us about the connections between chemistry and chemical engineering, and how we understand basic science and applications, that might explain why the merger of chemistry and chemical engineering goes farther back in Caltech's history?

FLAGAN: If you go back to the turn of the 20th century, a lot of chemistry dealt with applied chemistry, the chemistry of how you do things in industry. But as you start thinking about industrial chemistry, that brings into play all kinds of considerations that are not necessarily fundamental chemistry. In the early 20th century, a split was developing between that applied side and the more basic. As chemistry moved toward the more fundamental, the applied side brought in additional components, like how to design a large reactor that will operate at high pressure safely. That's engineering. This split was occurring as Caltech was developing. By the mid-20s, it was becoming quite pronounced. I have seen some documents that suggest that the name of the division may have been the Division of Chemical Engineering and Chemistry very early on, even though there was not formally a chemical engineering option for decades. In the mid-20s, as part of this national move, industry was very concerned about knowing what kind of person they were hiring. They might be hiring a chemist to do bench chemistry, the basic discovery-based chemistry in the laboratory.

If you're trying to get someone who can put together large reactors and process tons of material, that's quite different. There was a move to establish some form of standardization. The American Institute of Chemical Engineers had been formed, and under its auspices, they started trying to develop standards for teaching. That was actually the birth of ABET, the Accreditation Board for Engineering and Technology. And Caltech was one of the early participants in that. Early in chemical engineering at Caltech, there was a very large project on high-pressure thermodynamics, high-pressure processes, particularly in the refining of oil. That was a key part of the development of chemical engineering here. There was this split that occurred very early in the development of Caltech, which is quite different from the bioengineering development, where biological engineering of various forms started out in two different places.

One, there was a longstanding program in chemical engineering on biomolecular engineering, with people like Jay Bailey, Greg Stephanopoulos, and Frances Arnold, who were really developing within chemical engineering. There were other aspects of biology-related engineering developing in the Engineering and Applied Science Division. There was a falling-out between those engineering components and the EAS Division as that work focused much more on the fundamental biology, and that component moved into what became BBE. It was a very different genesis. The original chemical engineering was pulling together some pieces of mechanical engineering with the chemistry of thermodynamics to try to deal with the need to understand how to carry out chemical processes at a large scale.

ZIERLER: Let's return to your leading work with students and what you saw as some needed changes to the curriculum for their preparation.

FLAGAN: Let me first focus just within chemical engineering; then I can talk separately about some issues related to Caltech as a whole. I moved from the Environmental Engineering Science Program in EAS to chemical engineering in CCE in 1990. That would be the equivalent of moving from one college to another at most universities, something that's virtually unheard of. And it was seamless at Caltech. It went very, very smoothly. In fact, I moved my office right away. My laboratories didn't move until the Schlinger building was built some years later. Chemical engineering, at that time, was constrained through this accreditation process, where there were criteria set out for what a chemical engineering program should be. Initially, they were very prescriptive: "You must have these specific subjects taught." It later became somewhat less prescriptive, but still gave a lot of license to the people who would go to a university and examine its academic programs to see whether they fit the bounds. While there were a lot of nice words saying you could design a program to serve your students, they were not following that fully.

As we went through this accreditation process, we would get demands that we offer particular courses, courses that our students found too low-level, not intellectually interesting. We were told in no uncertain terms we had to put them back into our curriculum. When Mark Davis was executive officer, we had one of these instances where they demanded we put back a particular course, a standard first chemical engineering course that almost all universities have, a mass and energy balances course. No one wanted to teach it, and none of the students wanted to take it. We put it back, and the students complained about it in a way only Caltech students would: "This course is too easy. We feel like we're wasting our time in it. We want to learn more." We shifted from the traditional chemical engineering curriculum to one that established tracks within chemical engineering. The first two years were common; then students would specialize in a biomolecular engineering, environmental, materials, or process track, the last being basically much like the classical chemical engineering, which no one enrolled in for at least five years.

They all wanted something that was more focused. We did that, and we fought the battle with ABET every few years. When I was executive officer, I went through several rounds of this accreditation cycle. In order to meet the accreditation requirements at the time, chemical engineering required more academic units of our students than any other undergraduate degree program at Caltech. Yet, we were constantly criticized that our students were not taking enough engineering courses. When ABET visited mechanical engineering, they would accept the applied math sequence taught by the engineering division as an engineering course. For chemical engineering, they always tried to deny that. One time, when we were criticized for not meeting the ABET requirement that 3/8 of all credits required for a degree be engineering units, my response to ABET was, "I could satisfy your requirements. All I have to do is drop organic chemistry as a requirement for chemical engineering students, and in doing that, I would reduce the denominator enough to meet that criterion and still satisfy the Caltech graduation requirements. We're not going to do it. Our students need organic chemistry."

Then, I pointed out what ABET's rules actually said about the degree requirements. The degree requirement was the minimum number of units that Caltech required for any degree. We satisfied that; therefore, we were satisfying the condition. I had to repeat that several times. Ultimately, we faced up to one more round of accreditation, and at a faculty lunch, we were discussing how to deal with the arguments we knew we were going to get, the same complaints we got every time. We finally came to the conclusion that ABET was forcing us to make our students take far too many courses to satisfy bureaucratic constraints that had nothing to do with education. We decided we would drop the accreditation. We did some very careful searching. Mike Vicic went through a detailed analysis of all of our graduates over the prior 20 years. A key argument for accreditation was that, if you needed to get a professional engineering certification, having an accredited degree shortened the time to get there. We looked at how many people had gotten that certification, and over 20 years, you could count all the students who had become professional engineers on the fingers of one hand. We had far more students who might have been better served by a pre-law program than by a professional engineering program.

We decided to drop it. But my feeling was that the accreditation program was broken. The people who do the accreditation are people for whom that is their primary activity. They are not our peers in most cases. Very few of the visitors come from research universities. The vast majority of them come from schools whose students are not likely to go on to graduate school. What we're preparing our students for is very different from what ABET assumes. I argued that if we were going to drop it, we should not do it quietly; we should do it very publicly. I wrote a press release that we sent to a number of journals. Ultimately, ACS picked up on it and integrated it with a discussion of another chemical engineering program that was dropping accreditation, Stanford's. We publicly announced we were dropping it and suggested that maybe it was time to rethink the entire accreditation process. Not much has happened nationally, but there are a few other schools that have since considered dropping accreditation or have dropped parts of it along the way.

ZIERLER: What's been the legacy of this decision?

FLAGAN: We've reduced the number of units we require of our students to be the same as the rest of the Institute, something the dean's office had been after us to do for a long, long time before that. We've been able to give our students increased flexibility. If a student comes in and says, "I want to take this course that's not an engineering course, but it fits where I'm trying to go for this reason," we're able to accommodate that. We listen to our students and work with them to try to ensure that we're giving them the education they need, not one that someone outside is dictating. I think the net benefit has been a real plus. I have not heard any complaints from students or parents about this. Back when we were accredited, one of the examiners was at one point rather abusive to our students and suggested strongly that we were going to lose our accreditation. I received a phone call the next day from a rather angry parent. This time, we communicated carefully with the students that the change was coming, early enough that they could talk with their parents and argue the case. We communicated with our alumni. We have not seen a downside to it yet.

ZIERLER: Back to the research. All of your work on ship track formation. What are ship tracks?

FLAGAN: If you look at a satellite picture of the ocean, the easiest case is to look at the Pacific. There are times where you will see a uniform cloud deck. If you look at it in the infrared, longer wavelength, the clouds are often not quite so opaque as normal. Ships traversing the ocean emit pollutants into the air. They're generally burning the dirtiest oil out there, or at least have been. The sulfur dioxide that's emitted, the soot, the particles of other kinds, those mix up to the top of the marine boundary layer, which is where the marine stratocumulus clouds form. And they act as what we call cloud condensation nuclei on which water will condense to form cloud droplets. The ship tracks are a local perturbation in cloud formation at the top of the marine boundary layer. And they'll extend for thousands of miles sometimes. Ship tracks are interesting because what we can do is look at perturbed clouds in a very confined space. Outside of the ship track, we have the background stratocumulus clouds formed in relatively clean air, then we have a local perturbation. We could study the effect of those additional emissions on the cloud development just by flying through those plumes.

ZIERLER: Is this something you've been involved with for a long time, or was this a shorter, more focused period of research?

FLAGAN: For a long time, I was not involved in field work at all. When we developed the scanning electrical mobility spectrometer, the scanning differential mobility analyzer system that I described earlier, John Seinfeld was approached by some funders from the Office of Naval Research about getting involved in a field project that was going to look at aerosol cloud development over the Atlantic. They had the cruise planned, a lot of instruments on the ship, and were going to go out for an extended mission. They wanted John to join to bring modeling in so they could interpret the results in light of the micro-scale processes as you would capture them in a basic air quality model. When I learned of that, I suggested maybe this was the time to do something neither he nor I had done before: go out into the field with measurements as well. We now had an instrument that could follow the dynamics of the aerosol in a way no one else could. We convinced ONR to support us to do that and recruited a very good student to take the second copy of that new instrument onto the ship, go out, and make measurements. That started us doing field work. It was quickly followed by taking our instruments onto an aircraft that was supported by ONR, and ultimately, by our being asked to join with the Naval Postgraduate School in an Office of Naval Research effort to develop an atmospheric facility initially focused on using remotely piloted aircraft, UAVs, to probe the atmospheric aerosol.

ZIERLER: What are some obvious challenges in doing field work not on land, when you have to be on a boat?

FLAGAN: For that first project on the ship, the instruments had to be packed up, put in a shipping container, and shipped to the Azores, where eventually, some months later, they were loaded onto a ship. That had to be done months in advance. We had to have everything we needed to continue the experiments for a two-month deployment, including all spare parts, all tools, all materials we might need. That all had to go in advance, because once you're on the ship, you only have what you or somebody else has brought along. It turned out we recruited a very good student for the project. She was very thorough and paranoid enough to think of all kinds of things that could go wrong, so the spare parts list was long. We had to pack the instrument so that it could be installed on the bow of the ship and had to think, "What happens when we have a storm, and waves break over the bow?" We sealed the instrument in a hermetically sealed box, we had to provide ventilation to keep the instrument cool enough to operate, and we had to armor-plate it so that, if a wave broke over it, we wouldn't lose everything. That was a one-inch marine plywood box enclosing the entire system, and it came back with a hole in it roughly one foot in diameter. We have no idea what hit it, but it happened during some rough weather.

The instrument was damaged in shipping. It had to be repaired with what we had sent over. We then discovered some strange behavior on this second copy of a brand-new instrument, which the student, Lynn Russell, described in a fax she sent me, this was about 1994, while I was at a conference in England. I had to figure out what was likely going on, communicate back to Pasadena to try some experiments with the other copy of that instrument, rewrite some code to control the instrument, test the code, and fax a listing of the code that Lynn would then have to type in to reprogram the instrument and keep it running. Fortunately, we had picked right with the person who went on that ship. She was able to keep her instrument running and get data more continuously than any of the other instruments, even though all the others were standard commercial instruments that had been used for years, and she had the spare parts that kept other people going. But it's a different kind of challenge when you have to be totally self-contained.

When we took instruments onto aircraft, the first aircraft we went on was a relatively large one, and Lynn was on that aircraft during the flights, able to deal with the instrument herself. The second aircraft we used was supposed to be an optionally piloted aircraft. Even though it could, in theory, be remotely piloted, we always used it as a piloted aircraft, but the instruments were all sealed in the long nose of what had been a pusher/puller type of airplane. The front engine had been taken off, a beefier engine put on the rear, and the long nose extended in front of what was nicknamed the Pelican, because that was what it looked like as it flew over the ocean. Before that aircraft took off, you'd get everything going, close the nose of the aircraft, then say goodbye to it. It would come back in 12 hours, and you had data or you didn't, based upon how well your instrument worked. The instruments had to be fully automated. That really took us into the domain of fully automating all of our measurements because we found we got better data with the fully automated instruments than we did with the ones people could adjust as needed.

ZIERLER: On a serious note, on the morning of September 11, as you were watching the smoldering ruins in Lower Manhattan, I wonder what jumped out at you as you were seeing first responders and the obvious risks they were taking.

FLAGAN: I was in Sweden at the time. I'd gone to Finland to be on a thesis defense committee and gone on to Sweden to give a seminar that day. I was just checking my email when another American who happened to be there said, "Go to CNN now," and I watched the second plane go in. Watching those huge clouds as the buildings collapsed, knowing the health impacts of silica in the air, it was clear that beyond the immediate catastrophe, beyond all the people who were directly killed or injured in the original events, with all of the people subjected to those huge clouds of dust, there were going to be huge health impacts. What they would be wasn't clear, but it was clear that this was a disaster from an aerosol perspective. It was a huge disaster, and we continue to see its effects to this day.

ZIERLER: Do you see that essentially as parallel with all of the suffering that American veterans have suffered as a result of exposure to burn pits in places such as Afghanistan and Iraq?

FLAGAN: The concept of the burn pits, I had been introduced to long before we got involved in Afghanistan and Iraq. I was on a series of National Research Council committees over a period of about seven or eight years dealing with the question of how we could safely dispose of our stockpile of chemical weapons. As part of that, I got exposed to how they had previously disposed of really nasty munitions. The whole burn pit concept was terrifying. Not just the nasty weapons, but the way they were handling it, the fact that they had these huge open pits, dug-out areas of ground, often with scaffolding over them, where they would bring and dump material, burn it, and try to keep the stuff burning to totally destroy it, the munitions, propellant, explosives outside the projectile. And there was the very low degree of protection that people had in that. I'd seen data on the health effects of that. Yes, this was a different kind of source, but it was, again, an exposure to really nasty stuff in the air without real thought, in that case, about the effects on people. That kind of thing has always bothered me.

ZIERLER: Obviously, you're not a medical doctor, but with your area of expertise, what jumps out at you in the perennial debate about correlation and causation when we're trying to find justice and compensation for military veterans and first responders who are exposed to all of these noxious chemicals?

FLAGAN: When you have such a clear exposure, the correlation is a statistical inference, but there's an exposure to something with strong potential health impacts. One thing I've looked at over the years is how and where particles deposit in the lungs. I keep asking the questions, "What are the health effects, and how do they relate to the specific details of exposure?" It's difficult to imagine exposures at high levels not having severe health impacts. You look at firefighters who are constantly exposed to smoke, who suffer various long-term health impacts. Cigarette smokers, the worst personal pollution. I'm not quite sure how to deal with the distinction between correlation and causation. To say that a health impact is definitively caused by a particular exposure is a very difficult case to make when there's a long lead time. If, however, you have a definite exposure, where you have little doubt that there is a serious burden of a pollutant deposited in the lungs, is that what caused lung cancer, what caused emphysema? It's very hard to prove that point. You can show that exposures have a high likelihood of doing that, and I think that's where you have to go. But I'm not a medical person, and I'm not a statistician.

ZIERLER: Going deeper back in history, I wonder if you were ever involved, from a policy or advice perspective, on the way that, for example, the Department of Veterans' Affairs assumes a presumptive connection between certain illnesses and Agent Orange exposure in Vietnam.

FLAGAN: I've not been involved directly in that.

ZIERLER: Have you studied the issue in the way that spray droplets can affect individuals at a certain level?

FLAGAN: I have certainly read a lot of the literature on inhalation of different materials and the effects that have been detected. Have I done it with regard to Agent Orange? No. Have I tried to look at the specific physiological responses? No. What I have done, in thinking about air pollution and how particles affect the lungs, is collaborate with people who do have that expertise. The group I've collaborated with most has been in the Department of Environmental Medicine at USC, where John Peters, Frank Gilliland, and others have been studying the health effects of air pollution for a long time. I've collaborated formally with them, participated in many informal discussions with them, and participated in some of their research center retreats where they discuss all aspects of what they're looking at. And I've done that not because I see myself as the person doing the direct human health effects studies, but rather because I understand how to do the measurements, I understand the aerosol, and I can look at the questions they're asking and do as I've done with other kinds of science questions and say, "How can we get the data necessary to address the health questions?" Ultimately, that means getting the data and delivering it to the people who are actually doing the hands-on health research.

ZIERLER: Somewhat relatedly, how far back does your interest in looking at allergens and aerosols go?

FLAGAN: I started developing a personal interest in a particular puzzle. The medical profession has long, and with good intuition, linked asthma with pollen. But if you think about it, for the wind-pollinating plants, pollen grains are 20 microns and larger. They're huge. If you inhale them, they'll deposit in your nose or your throat. They're not going to penetrate deeper into the lungs, where the inflammation they trigger can cause constriction of the airways sufficient to increase the pressure drop so that a person cannot exhale. Yet, there is this putative link between pollen and asthma. A local allergist, Mike Glovsky, came to me one day. I'd met him several times previously. He had a paper from some Australian botanists who thought they had an explanation, which was that if you take rye grass pollen, for example, rye grass being a major allergen that's been linked with some very severe outbreaks of asthma and related respiratory responses, put it on a microscope slide, and put a drop of water on it, the pollen will rupture. You can watch it explode. It'll spill out all the cytoplasmic contents, and in that are lots of little starch granules and other particles that they suggested got out into the air, and it was inhalation of those fragments of pollen that was responsible.

And this was a neat hypothesis, but I didn't see the physics behind it. I could understand the rupture, but I couldn't understand how these things got out into the air. These were botanists, and they weren't thinking about physics and the forces involved. It just didn't make sense. Mike asked whether, if he got some funding, I'd be willing to have an undergrad look at this project over the summer. I said, "It'd be a good experience for an undergrad. I don't know what we're going to learn, but we might learn something. Sure, why not?" He went out and tried to raise the money. Ultimately, he raised some money, but it came too late. All the students had already gone home for the summer or were already in research projects. That option didn't work. He asked if I'd be willing to work with the fellow who'd written the paper. I thought if we could get him over here, we might be able to do something together and figure out what was really going on. He called the lab of Bruce Knox, who had passed away, and Knox's last graduate student, who had been taken on to clean up and shut down the lab, answered the phone. Ultimately, we brought Phil Taylor over from Melbourne for six weeks. He got a visitor appointment, and we found ways to deal with all the bureaucracy of bringing someone in on short notice. We had about six weeks, and we set out to try to figure out how the particles could get out into the air. Phil would go out on campus or in local parks early in the morning and pick rye grass flowers. You don't normally notice them; they're only a millimeter or two long.

They don't have any color to them other than green. We set up a system where he could store them during the day, then we set up a chamber where he could expose them to varying levels of humidity and have some air motion to see if we could entrain fragments. Ultimately, we discovered that the pollen could rupture if the flowers opened up when the humidity was high enough, or if you got some liquid water on it. It would undergo osmotic shock and spill its contents, but the inner surfaces of these catkins [the flowers] were textured with little waxy nodules on the order of 50 nanometers in diameter. And that makes a superhydrophobic surface. Any liquid would just ball up, not really wetting the substrate. Then, it would dry out when the sun came up later, and you've got these little dried particles containing the starch granules and such that carry the allergen. If there was a gentle breeze, those could be entrained off. In a short six-week stay, we were able to find that yes, there was a mechanism. It required that you actually look at the surface it was occurring on. You couldn't blow them off a microscope slide. That wouldn't work because they would adhere too strongly to it. But these superhydrophobic surfaces were just right for doing it.

The flowers released the stuff into the air, and we proved that we could see these small, allergen-laden fragments in the air, and that these particles were small enough to get down to the sixth, eighth, maybe tenth generation of branching into the lungs, where the tubes are small enough and the flows are still high enough that if you got swelling in the muscles of the walls of the bronchi, it would constrict the air flow and increase the pressure drop so that someone trying to exhale could not. That's the wheezing associated with asthma. We went from, "Here's a plausible mechanism that doesn't really capture the physics," to, "Here's a way that you could actually get a response that might be the trigger of an asthmatic attack." That initial exercise led to several years of work, including doing some computer vision work, trying to automate pollen counting, since I discovered how inefficient and biased that process was. We were able to make some interesting progress. Ultimately, Phil Taylor went back to Australia, and the initial funding I had for that ran out. That project stopped, but we made some very good progress on it, and people are still following that lead today.

ZIERLER: Last question for today. Did any of this research lead to translational or even clinical applications?

FLAGAN: One direct clinical application that came out of that was, we had set up some pollen samplers on campus, and we discovered a fungal spore that no one thought was present in the air in the Los Angeles Basin. The allergist working with me, Mike Glovsky, saw that immediately, added that fungus to his allergy tests when he had people coming in with allergic diseases, and he found that about 40% of his patients were allergic to that fungus. It was an allergen that no one thought was here in LA, so no one looked for it. But we saw it here. It was something that caught the eye of the Australian botanist in my group because it originated on the flowers of the eucalyptus trees. In Australia, the dead flowers would normally fall off the tree, but here, with the gentle weather we have, they tend to stay on the tree year-round and accumulate the fungus. That was the local source.

ZIERLER: For next time, from the Amazon to semiconductors, much more to talk about.

[End of Recording]

ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It is Friday, April 1, 2022. I'm delighted to be back with Professor Rick Flagan. Rick, once again, it's great to be with you. Thanks for joining me. Today, I'd like to start with a question hot off the press. I know shortly, you're going to be engaged in a meeting at JPL to simulate the clouds on other worlds with a focus on Venus. Tell me about that project and how you might contribute to it.

FLAGAN: The atmosphere of Venus has been very interesting for a long time. There were some probes that actually measured the atmosphere of Venus in situ in 1978. A package was dropped down and measured the sizes of big particles as it fell. When it got to the surface, temperatures were too high for it to continue to function. The temperature at the surface is about 500 degrees Celsius, at a pressure of about 90 atmospheres. But at about 50 to 60 kilometers, the temperatures and pressures are quite Earth-like. At that altitude, there are clouds. The clouds obscure the surface. It's only been relatively recently that they've been able to find ways to get pictures of the surface. The clouds were measured by those probes back in the 70s, which gave some insights into their nature. They start at temperatures right around ambient temperature here, 300 kelvin, at a pressure around one atmosphere, and extend up to where the pressure's about a tenth of an atmosphere and the temperature is colder, like in the upper troposphere on Earth. The existence of the clouds is interesting, but their chemistry is also very interesting. The clouds on Earth are primarily water. The clouds on Venus have a little bit of water in them, but they're primarily sulfuric acid. It's a quite different chemistry, and we don't know very much about them. What's being planned is a mission that would place a laboratory, suspended by a balloon, into that cloud layer on Venus. This planning has been in the works for some time. The actual mission would occur in the next decade, so it's not coming up immediately. But as they plan for this, they will be flying instruments to characterize the cloud droplets and the interstitial aerosol in that layer.

But a key challenge that comes up is whether the instruments will survive to make the measurements, because of the very corrosive atmosphere they're dealing with. I got asked to help them try to find a way to characterize the viability of the instruments and of the materials that would be used for this balloon-borne laboratory, this aerobot, that is being planned. Just recently, there was a call from NASA for facilities to support planetary science research. As we've been thinking about the question of probing the clouds on Venus, and as, at the same time, I've been working on a project to enlarge a very creatively designed facility for simulating clouds on Earth, I've suggested to the group at JPL that it would be natural for JPL to develop a cloud simulation laboratory that could simulate the clouds and the aerosols in that region of the Venus atmosphere. In the discussions, this has gotten broadened out from just Venus to looking at clouds on other worlds as well, so that could include Titan, a moon of Saturn; Jupiter; Uranus; and exoplanets, too.

The initial target is the atmosphere of Venus because there are missions being planned for that, and it would be very nice to have some basis for modeling some of those clouds already developed at the time when those missions happen. I'm bringing in the aerosol background and learning a lot about how people think about missions to probe other worlds. It's a very different kind of science than most of what I've done. The time horizons are very long. But it could provide us with new insights, not just for Venus, but as we look at other worlds, we will also ask questions that may help us to understand aspects of what we see here on Earth.

ZIERLER: Where do you see this going long term for JPL?

FLAGAN: The proposal we're preparing is to develop a facility that would enable experiments to probe aerosols and clouds on other planets in general. JPL is very heavily involved in planetary science writ large. This would give them the in-house capability for laying the groundwork for the basic science of how these clouds develop, their properties, their dynamics. This would become a long-term facility. Since the purpose of this program is to develop facilities that would be usable by the planetary science community nationally for a long time into the future, that's something that would serve JPL better than building a user facility at Caltech.

ZIERLER: A broader question. I'm not sure where to orient it in the historical narrative, but your work generally on cloud condensation, has it always been undertaken within the broader framework of understanding the role of clouds in climate change?

FLAGAN: That has been a very major part of it. As we look at climate change, the greenhouse gases are quite stable in the atmosphere. Carbon dioxide stays in the atmosphere for a very long time. Water is a huge greenhouse gas, but it rains out and recycles again and again. What we call the greenhouse gases really refers to the stable gases giving us the long-term perturbations to the energy balance for the entire planet. As we look at aerosols, the particles that are formed in the atmosphere due to pollution and natural events produce haze that reflects some sunlight back to space. But when those particles serve as the seeds for cloud droplets, so-called cloud condensation nuclei, they greatly increase the amount of sunlight that's reflected back to space, so they have a cooling effect. But in contrast to something like the carbon dioxide in the atmosphere, the clouds and aerosols are relatively short-lived. In clouds, the water droplets are continually cycling through. Water condenses as particles are entrained into the clouds and undergo cooling; as droplets penetrate out through the top, they evaporate when they reach drier air, or they may rain out. But the lifetime of the cloud droplets is relatively short. Clouds are not homogeneous over large areas. The uncertainty in the cloud cover is the biggest uncertainty in the energy balance for the planet. That has really been the driving force for looking at the clouds.

ZIERLER: When you got involved with aerosols from a climate change perspective, did that change both the kinds of people you were collaborating with and your funding sources, just given how important climate change is and what a terrifically complex subject area it is to tackle?

FLAGAN: Yes, it changed both. The real starting point for looking at that was when we developed something I've described earlier, the scanning electrical mobility spectrometer, the instrument that made it possible to measure size distributions rapidly. We had a new instrument, and we took it to the field, initially on a ship-borne mission funded by the Office of Naval Research. I had not had funding from ONR previous to that. It got us working with people who were doing a whole variety of measurements related to climate. We later got involved with the Naval Postgraduate School and developed CIRPAS, the Center for Interdisciplinary Remotely Piloted Aircraft Studies, where we made some attempts at using remotely piloted aircraft, unmanned aerial vehicles, UAVs as they're referred to today. There, we would fly missions where we'd be bringing together people from universities all over the country to fly on our aircraft. I've gotten involved with NASA, flying instruments up to much higher altitudes. We've done measurements up in the stratosphere as well. It changed the funding, the people, and the sizes of the projects. Because these projects are huge. To bring together all of the tools you need for these intensive studies, we will often have experiments that involve a hundred or more people from institutions all over the country and sometimes all over the world.

ZIERLER: What are some of the big takeaways we've learned so far? Climate change means in some places, wet areas will become wetter, dry areas will become drier. What are some of the big takeaways about cloud formation in a rapidly changing global planetary system?

FLAGAN: In the field studies, we've looked a lot at the direct anthropogenic effects on clouds, mankind's effects on the formation, dynamics, and structures of clouds. The effects are profound. If we combine these field studies with the chamber studies, particularly the chamber studies we've been doing at CERN, the so-called CLOUD experiment, it becomes clear that clouds today are different than the clouds of the pre-industrial era. We've perturbed the clouds quite dramatically over much if not all of the globe. You have to go to very remote places before you get to situations where sulfuric acid nucleation is not the dominant source of the cloud condensation nuclei. We're still learning about that chemistry. The range of chemistry, the range of sources that are affecting the clouds is immense. We're constantly probing new parts of that. Typically, when we do these airborne missions, we'll go to a location where we can probe a small subset of the mechanisms that are involved. We've probed a limited range ourselves. Other people using similar techniques are probing in places we haven't gone to yet.

ZIERLER: It's a topic we touched on briefly, but if I can refine the question, given all of the pros and cons of sulfur dioxide and cloud seeding as a last-ditch way to mitigate climate change, and given the fact that you can't simply seed natural clouds because they won't stay around long enough to be useful, is there any happy medium between the dangers that come with sulfur dioxide and how ephemeral natural clouds are? Can we split the difference in any way?

FLAGAN: There are definitely proposals to release huge amounts of sulfur dioxide, some have proposed hydrogen sulfide, into the upper atmosphere to try to produce stable high-altitude clouds, preferably getting up into the stratosphere, where you don't have the mixing that causes clouds to dissipate so quickly. There are proposals to do that, and there are natural experiments that have done it. Volcanoes, in really large eruptions, will inject large amounts of sulfur into the upper troposphere and lower stratosphere. When they do that, they alter climate. And there have been notable examples. Pinatubo was an example that had quite remarkable effects on weather patterns and temperatures over a huge part of the globe. And there have been others. There have been years essentially without summer, where it never got warm enough in northern New England to raise crops effectively. There's a lot of discussion of doing that.

My personal view is that it's the extreme of hubris to assume that we understand climate well enough to do that without having unintended consequences. Yes, we will cool the weather over some large region if we're able to put those thousands and thousands of tons of sulfuric acid into the air to make that aerosol. But what is that going to do to the hydrologic cycle in different regions? Are some places going to get flooded? Are some places going to go into drought? Are some places going to suffer huge crop failures? These are huge questions that we really don't know the answers to. We can develop models, but the models all have uncertainties. Part of the reason is that we don't have the computational tools to make models that are completely first-principles, fundamental-science-driven; there are parameterizations required for these large-scale models, approximations that are necessarily built into them because of the limitations of the computational tools we have today.

ZIERLER: For all of the excitement about quantum computers and what they can do in simulation, would they possibly be an asset for this research at some point in the future?

FLAGAN: Very likely. Will they eliminate the uncertainties? Probably not. They may reduce them. I'm not a modeler, and I'm not dealing with models at that scale. But when you look at the models, they get essential features right with a lot of tweaking, a lot of work that's been done to try to capture what we know has happened, and in that way, to fit the models to make them describe the world as we have seen it, and thereby, hopefully, predict better what we will see. But I would be very reluctant to rely on that for the future of this planet. I would far prefer to see serious action taken on attacking the root cause of the problem, which is the greenhouse gas emissions. But that requires concerted effort on a global scale to control those emissions.

ZIERLER: One connection you'll have to elucidate for me. In the fabrication for semiconductor devices, what role does aerosolizing silicon nanoparticles play?

FLAGAN: Less than I would like. [Laugh] Perhaps a little background on how I got involved in that question might be helpful. At one point, at JPL, I saw some pictures on a wall of some agglomerated silicon nanoparticles, very small ones, that I made some comments about. The microscopist relayed those comments to the people who were responsible for the samples he had taken those pictures of. I got involved in a project that was a flat-panel solar array project that JPL was doing. One key part of that was trying to reduce the cost, particularly the energy cost, of making silicon for photovoltaics. I got involved in an aerosol process they had started, aimed at a process carried out strictly in the gas phase: nucleating new particles of elemental silicon as a way to go from purified gases, where you could get the purity you need for long-life solar cells, to particles, then separating those particles from the gas so you could melt them and pull the single crystals that, in the 1980s when this project got started, were the basis for the solar cells.

I got a limited amount of funding from them to try to help them explain why they could never get the particles to grow big enough to separate them from the gas. I had a very good graduate student, and together, we worked on that problem. Eventually, we could put in words why they had failed. They were carrying out a gas-phase reaction, decomposing silane gas to produce silicon nuclei, then trying to grow them to large size. But they were greedy; they carried out the reaction fast. Because they were doing so, they produced huge numbers of nuclei. They could grow those nuclei somewhat, but the nuclei would also agglomerate and form fluffy agglomerates that you could not easily separate from the gas. Once we could put it in those words, the answer to how to speed up the conversion from gas to big particles was simple: slow down. We had to slow down the reaction to grow the particles without agglomeration, and we had to limit the number of nuclei we were producing, slowing the nucleation and carrying out slow growth so the existing particles were always scavenging the vapor rapidly enough to suppress additional new particle formation. And we did that.
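[Editor's note: the trade-off Flagan describes, a fast reaction making many small, agglomerating nuclei while a slow reaction lets a few nuclei scavenge the vapor and grow large, can be illustrated with a toy model. This is a minimal sketch, not the actual process model; the threshold nucleation law and every constant below are illustrative assumptions only.]

import math

def simulate(vapor_source, t_end=10.0, dt=1e-4):
    """Toy monodisperse aerosol model. A precursor reaction releases
    condensable vapor at rate vapor_source (m^-3 s^-1); vapor above a
    critical saturation nucleates new particles, and existing particles
    scavenge vapor by condensation. Returns (number conc., mean diameter)."""
    n_sat = 1e13     # assumed saturation vapor concentration, m^-3
    s_crit = 10.0    # assumed critical saturation ratio for nucleation
    k_nuc = 1e10     # assumed nucleation-rate prefactor, m^-3 s^-1
    g_nuc = 100.0    # assumed monomers consumed per fresh nucleus
    k_cond = 1e-16   # assumed condensation coefficient, m^3 s^-1
    v1 = 2e-29       # monomer volume, m^3 (roughly a silicon atom)
    n = N = V = 0.0  # vapor, particle number, and particle volume concentrations
    for _ in range(int(t_end / dt)):
        s = n / n_sat
        j = k_nuc * (s - s_crit) if s > s_crit else 0.0  # nucleation rate
        cond = k_cond * N * n                            # condensation sink
        n = max(n + (vapor_source - cond - g_nuc * j) * dt, 0.0)
        N += j * dt
        V += (cond + g_nuc * j) * v1 * dt
    d = (6.0 * V / (math.pi * N)) ** (1.0 / 3.0) if N else 0.0
    return N, d

# Fast vs. slow precursor decomposition: the slower reaction nucleates far
# fewer particles, so the same vapor grows each of them much larger.
for source in (1e18, 1e16):
    N, d = simulate(source)
    print(f"source {source:.0e} /m^3/s -> N = {N:.2e} /m^3, d = {d * 1e9:.1f} nm")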

Some time later, I was sitting on a thesis committee for one of Harry Atwater's or Kerry Vahala's students, and they were talking about certain kinds of defects in some of the semiconductor devices they were fabricating. I started asking, "Where do those defects come from? What is the source of the defects?" I finally came to the conclusion that they were nucleating semiconductor particles in the gas, those nuclei were depositing on the wafer as they were trying to grow their devices, and that became the source of the defects. I suggested to Kerry and Harry that there might be a very interesting project in looking at that nucleation and controlling it so that we could control those defects. I forget which one it was, but one came back and said, "No, if you can do that, we've got something much better." We got started on a project that was aimed at producing floating-gate memory devices, which consist of a piece of silicon, a very, very thin dielectric layer on top of it, on the order of a couple nanometers of silica, then a thin semiconductor layer across that, another dielectric, and a gate electrode. What they would do is tunnel charge into this thin suspended layer of silicon, the floating gate, to store that charge. That produces a static memory. But if there's a defect that allows charge to leak out anywhere, reducing that tunnel barrier or directly communicating with the substrate below, you would lose all the information stored in that device, so this is a failure mode for floating-gate memories.

Their thought was, if you could make that floating gate discontinuous, you would have storage elements that are electrically isolated from one another, so that if you had a defect, you would have some probability of losing a tiny bit of charge, but most of the charge would stay stored. We first wrote a proposal for an engineering research center, thinking this could be a rather revolutionary technology, and we could do something very interesting with it. We talked about the transport processes in the device, and the most critical review we got was, "Transport? I don't see anything about trucks in here." One of those absolutely ridiculous reviews that was used to not fund us. But we wrote a separate proposal as well, just to do that little piece of the larger center proposal we'd been putting together, and that was funded. We started working towards building these devices. We ultimately developed a collaboration with what was then Lucent, where we would process wafers in our laboratory. I had to put a small clean room into my laboratory so we could do the processing cleanly enough to make the devices.

We would prepare the wafers and put down this layer with huge numbers of particles on the surface. Then, we'd ship those to Lucent, and they would process them through their research fabrication facility to make devices, and they worked. When Lucent spun off Agere, that was one of the key technologies they highlighted in the spinoff. The company they spun off did not do well, so the process did not go anywhere with them. Ultimately, we started a collaboration with Intel, and for that, we had to shift from 20-centimeter, eight-inch wafers up to 30-centimeter wafers, with more cleanliness, more challenging issues, and new tools to build for doing the deposition. But in the end, Intel got cold feet. The thought of taking a wafer that has such huge numbers of particles into their fab, where they've spent decades trying to get rid of all the particles in their process, just made some of the engineers at the company very nervous about allowing one of our wafers into their tools, so we were not able to continue the collaboration.

ZIERLER: As you say, not as much as you would hope in terms of aerosolizing the silicon nanoparticles. If there were more, what technological applications would that yield?

FLAGAN: One of the places this could be quite important would be spaceflight missions, where you have a lot of energetic particles that can create defects in semiconductor devices. You've got cosmic rays that will go through and create defects that could cause static memory devices to gradually lose their memory. This would give you longer-term memory that would be more radiation-hard. That would be one. That's not a common application, but it's one that could be very significant. Another one that came out as we started testing prototype devices we made was that because you've got a discontinuous floating gate, you actually have the ability to vary the amount of charge you're storing in interesting ways. It looked like we could conceivably get more than one bit per transistor, so we might get two or four bits, multiple voltage readout levels you could have with the device. That's one where I don't know the details on the ultimate application side very well, but it seemed to excite the solid-state physics people who were involved in the project quite a bit.

ZIERLER: When did you get involved in studying aerosols as they relate to the Amazon climate and environment?

FLAGAN: That really came about through students. There were a couple different aspects to it. One, there was a graduate student who got involved in a field project using a classical type of cloud condensation nucleus counter that involved some field work in the Amazon. He did some work with that. That ultimately triggered some work here in trying to come up with better ways to make measurements of cloud condensation nuclei. There, I have to say that our attempt did not succeed directly, but students in our group, when they left, found ways to take what we had done and turn it around to make something that is now the primary method for making those measurements. That included the student who had gone to the Amazon and another student who had done primarily theoretical work. Some other work came about with a fellow who had visited Caltech many times, Andi Andreae from the Max Planck Institute, who was heavily involved in Amazon research and had had this one student go to his facilities in the Amazon to make measurements.

There, it was just discussions with him about some strange things they were finding. They were finding some particles that were quite unusual in their structure, and they couldn't explain them. They thought they had discovered something really new and radical. I had a botanist in my group at the time because we were studying pollen. I got him involved in the discussions, showed him those pictures, and we had a meeting with Andi. Phil Taylor explained that these were particles from the digestive tract of a leaf cutter that the creature would rub on its wings; because they were superhydrophobic, they would protect it from the rainfall. These were little particles that looked sort of like buckyballs. It was really a matter of getting involved in trying to diagnose what they were seeing and being able to make the connections across fields where people didn't normally talk.

ZIERLER: Given the import of the Amazon to the global climate, in what ways was this research important not just locally, but on a planetary scale?

FLAGAN: That little piece was a curiosity more than key science. The chemistry and the particle formation of how clouds form, the aerosols that act as CCN, the dynamics of those clouds in the Amazon is where the real global importance comes in. The work we've done in the chamber facilities here at Caltech, the laboratory work we've done in the chamber facility at CERN has all produced data, mechanisms, understanding that goes into the modeling of these larger-scale phenomena. It's putting all of those pieces together. Ultimately, these global models have to touch on every part of the Earth's atmosphere. It really has to span the globe. There are so many different processes that are happening in different regions that become important that you have to take into account. Getting all of those pieces is the critical part.

ZIERLER: Last question for our morning session today. We've already discussed some of your collaborations with John Seinfeld. On what projects have there been collaborations more broadly between the Flagan research group and the Seinfeld research group?

FLAGAN: Our groups are, to a large extent, indistinguishable. [Laugh] Our groups overlap; we share space and are involved across the board. John Seinfeld is a superb modeler. He started out as a chemical engineering applied mathematician and went from there to developing large-scale models, initially of urban air quality, then extending those to parts of the global climate. He's not been one of the primary developers of global climate models, but he has been a key person in putting concepts and tools into those models. My work has been largely experimental. With students who are going through our groups, when it comes to working on the atmosphere, I've always got John involved in the projects. It's just so natural because he brings so much to the discussions. And he's got people, including modelers, involved. I often get involved in trying to put things into more practical physical terms, and often, in looking at questions that come out of the models, they predict something that we haven't seen.

Why haven't we seen it? Most often, it's because we haven't looked for it; we haven't had the right tools. A key part of the collaboration has been the back-and-forth of pushing the bounds on the modeling by getting measurements that were not available before. Suddenly, you see gaps in the model where there had been no way to know whether it was right or wrong; suddenly, you see it's wrong, something's missing. At the same time, the model might say, "We need better resolution of this to answer which of these pathways is important." The combination of modeling and experiment, especially measurement techniques, has really been critical. When Sheldon Friedlander, who had been on the Engineering Science faculty at Caltech, left in about 1980, the chamber facility on the rooftop of the Keck Laboratory sat idle for some time. Finally, they decided to bring it back to life. I was very heavily involved in that because I was comfortable diving into the instruments and getting all the pieces working. John is very comfortable with many aspects of the instruments now. He certainly understands the data very, very well. But when it comes to diving in, opening up an instrument to find a problem, I'm more likely to be involved than he is.

ZIERLER: So this is a mutually beneficial relationship among individuals that spreads to the research groups at large.

FLAGAN: Yes. And the students take full advantage of it.

ZIERLER: Well, we'll leave it there and pick up later today.

[End of Recording]

ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. I'm back with Professor Rick Flagan on Friday, April 1, 2022. Rick, new topic to talk about, your work on iodine emissions and their connection to marine aerosols. First of all, what are iodine emissions as they relate to marine aerosols?

FLAGAN: What are the specific molecules? We're typically dealing with iodine oxides that are formed from the parent compounds that occur in the decay of seaweed, for example. They're often quite abundant in coastal regions.

ZIERLER: What are some of the possible applications of measuring this?

FLAGAN: One thing that's been noticed is that in coastal regions, there's often quite pronounced new particle formation occurring. That new particle formation is yet another piece in the puzzle of sources of particles that ultimately become cloud condensation nuclei and affect global cloud cover and ultimately, the energy balance for the planet.

ZIERLER: Is there a portion of your research that you would say is most closely connected to issues of human health that stands out from all the other aspects of your research?

FLAGAN: Work we've been doing looking at size distributions in the atmospheric aerosol is part of the puzzle of understanding how particulate air pollution affects human health. In terms of direct work that's affected human health, where we've actually been looking directly at the health effects, a key component of that was when we were looking at pollen and trying to understand the link between pollen and asthma. That got me started in looking in detail at the question of where particles deposit in the respiratory tract when they're inhaled. As we went through the entire COVID pandemic, as we continue to go through it, the questions of how people emit particles that contain the SARS-CoV-2 virus and where particles that are released by people deposit when others inhale them become key.

While I've not had funded research looking at that particular problem, it's a question that is still very much in my mind, trying to understand the link, why there are differences in the infective nature of the virus with the different variants, for example. You do have people who are showing strong symptoms, who are coughing and sneezing. That's what the medical profession homed in on early in the discussion of the virus spread. But that was really focused on huge particles. Smaller particles are the ones that can get deep in the airways, where the damage is done. When we look at inhalation of pollution particles, where the particles deposit very likely has a strong impact on the health effects they have.

ZIERLER: Because the deeper they are in the body, the harder they are to expel?

FLAGAN: Yeah, in the upper airways, the bronchi and trachea, there are cilia that will move deposited particles up and out. When you get deeper into the lungs, there are no cilia carrying out that action, so material can accumulate until it's expelled by something like coughing.

ZIERLER: Your work with Stanley Kaufman and others, when you were looking at particle surface treatment for promoting condensation, were you trying to promote condensation?

FLAGAN: That was a small invention that really came out of discussions with them. Stan Kaufman was at a company that makes a lot of aerosol instruments, and just in talking with him about instruments, some questions came up that rose out of work we had done in trying to synthesize the silicon nanoparticles for the memory devices we talked about earlier. In that project, we made silicon nanoparticles, little, tiny particles about two nanometers in diameter of pure silicon. We wanted to overcoat them with oxide. We did that using tetraethyl orthosilicate, a volatile precursor for producing oxide. But we had trouble getting a uniform coating. And we discovered that if we exposed the silicon nanoparticles first to an appropriate vapor–in the case of the TEOS, that vapor was ethanol vapor–then we could get a uniform coating and fully wet the particle. We would then produce nice nanoparticles with a core of silicon and deposit about a two-nanometer-thick shell of silica, so that when we laid down the particles, they wouldn't be touching each other in an electrically conductive way.

The question comes up when you use condensation to grow really small particles for detection. We condense a vapor on particles too small to see, then grow them up to a size where they will scatter light so we can count them. But not all particles wet equally, so you will find that the composition of the particle affects how the vapor condenses. The recognition that finding the appropriate coating agent, one that modified the surface slightly so that it would wet better, would reduce that sensitivity was something that really came up in a discussion of things we were doing. It led to a patent which, because it came out of discussions between people from this company and me at Caltech, is jointly owned by Caltech and that company. That company has not done anything with it, to my knowledge.

ZIERLER: How come?

FLAGAN: They have not seen that enough people are really concerned about small differences in what size particle you can detect to create the magnitude of market that they want. We do run into that detection problem ourselves, but we've taken a different approach. What we've done is improve our separation so we know the particle size, and then we can calibrate. We can now measure down to sizes where that composition effect becomes very important, and we can address it directly in the calibration of the instrument, so we haven't gone back to that additional complication, because we might have to find a different surface treatment for every kind of particle we get.
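[Editor's note: the composition sensitivity Flagan describes follows from the Kelvin effect, which sets the smallest droplet that can grow at a given supersaturation; a particle whose surface the working fluid wets poorly behaves as if it were smaller, raising that barrier. A minimal sketch of the Kelvin diameter is below, using illustrative property values for n-butanol, a common condensation-counter working fluid; the numbers are assumptions for illustration, not the calibration of any particular instrument.]

import math

def kelvin_diameter(S, sigma, M, rho, T=300.0):
    """Smallest droplet diameter (m) in equilibrium at saturation ratio S > 1.
    sigma: surface tension (N/m), M: molar mass (kg/mol), rho: density (kg/m^3)."""
    R = 8.314  # gas constant, J/(mol K)
    return 4.0 * sigma * M / (rho * R * T * math.log(S))

# assumed properties of n-butanol near room temperature
sigma, M, rho = 0.0246, 0.0741, 810.0
for S in (1.5, 2.0, 3.0, 4.0):
    print(f"S = {S}: Kelvin diameter ~ {kelvin_diameter(S, sigma, M, rho) * 1e9:.1f} nm")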

ZIERLER: An overall question. In the past 10 or 15 years, have there been advances in mass spectrometry that have allowed you to ask kinds of questions or pursue research that might not have been possible earlier?

FLAGAN: There certainly have been. Really, over the past couple decades, maybe a little bit more, one of the key developments was the development of techniques for taking an aerosol sample into vacuum in a way that you could size-separate those particles, evaporate them, ionize the vapors, and analyze them with the mass spectrometer. You can do direct analysis of the aerosol particles with mass spectrometry. The first work on such instruments was done long ago, but there was one company that developed an instrument that has really revolutionized aerosol science. It's given access to direct, near-real-time chemistry of the atmospheric aerosol particles. When that was first being developed, that company, Aerodyne Research, sought SBIR funding from the Office of Naval Research. This was in response to a call that went out for instruments supporting the aircraft that we were using with the Naval Postgraduate School for atmospheric science projects. I was charged with being one of the technical monitors of that program, so I was visiting the company regularly throughout the project and learning a lot about the instrument from the very beginning. At the end of that project, we ended up with the first of those aerosol mass spectrometers they produced that was taken outside the company. That one was built for flight. They built it to meet the requirements of putting it into a small airplane, and that gave us a tool to get the chemistry, not detailed molecular chemistry, but at least to identify key components of the atmospheric aerosol as we were flying. We could then get a level of data we had not been able to get before. That instrument is now used around the world, and we still use it extensively here.

ZIERLER: Another general question about advances in instrumentation. How small can we detect nanoparticles? In other words, where particle physicists are not sure if quarks are made of constituent parts, what is the analog in the world of aerosols? How small do nanoparticles get, and what are some of the theoretical bases for thinking that there might even be smaller particles out there?

FLAGAN: With the mobility techniques we use for a lot of our small-particle measurements, we can go down comfortably to one nanometer. But we really have a problem if we go below about one and a half nanometers. The reason is that, at those very small sizes, what we're doing is putting a charge on a particle, migrating it in an electric field, taking out a narrow range of migration velocities from a polydisperse aerosol, then going through a condensation process to grow the particles up so that we can detect them. In order to put the charge on the particles, we generate gas ions. When we get down to about one and a half nanometers, the condensation process, when we drive it aggressively enough to activate aerosol particles in that size range, will also activate gas ions with mobilities corresponding to that size range. Our particle measurements start overlapping with the ion signal down at that very small size.
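[Editor's note: the connection between migration velocity and size that Flagan describes is the electrical mobility. Below is a minimal sketch using the Stokes-Millikan relation with a commonly used form of the Cunningham slip correction (constants from Kim et al., 2005); treat the specific constants as assumptions rather than the exact calibration his instruments use. At one nanometer, the computed mobility of roughly 2 cm^2/(V s) falls in the same range as common gas ions, which is why the two signals overlap there.]

import math

E = 1.602e-19    # elementary charge, C
MU = 1.81e-5     # viscosity of air near 300 K, Pa s
MFP = 67.3e-9    # mean free path of air at 1 atm, m

def slip_correction(d):
    """Cunningham slip correction for a particle of diameter d (m)."""
    kn = 2.0 * MFP / d  # Knudsen number
    return 1.0 + kn * (1.165 + 0.483 * math.exp(-0.997 / kn))

def electrical_mobility(d, charges=1):
    """Electrical mobility, m^2/(V s), of a particle of diameter d (m)."""
    return charges * E * slip_correction(d) / (3.0 * math.pi * MU * d)

for d_nm in (1.0, 1.5, 3.0, 10.0):
    z = electrical_mobility(d_nm * 1e-9)
    print(f"d = {d_nm:4.1f} nm -> Z = {z:.2e} m^2/(V s)")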

ZIERLER: What has been some of your engagement in the world of biosensing?

FLAGAN: This came out of the work we were doing on allergens. A key question that kept coming up was, after we detected respirable-sized allergen particles from pollen, how could we measure those particles in the ambient air? This becomes a little different from the typical biosensing application because what we're really interested in is getting large numbers of measurements to map an area. If we're looking at pollen, we're looking at particles that are relatively large and fall relatively fast; for the tree pollens, you'll see a lot of yellow on the ground this time of year from the falling pollen. Respirable particles may disperse farther. We were trying to find ways to detect that allergen with a sensor rather than staring through a microscope and trying to identify the morphology of the pollen grains, which is the state of the art today. Kerry Vahala in Applied Physics had developed some really beautiful optical resonators. He had a student who had done some work on adsorbing molecules onto these sensors, trying to detect very small amounts of different biological molecules, so these looked promising.

That work is and remains controversial. The student who did the work produced some results that suggested detection at the single-molecule level. Measuring the very small change in the resonant frequency of the little glass microtoroid onto which these biomolecules were adsorbed suggested an extreme degree of sensitivity. But there were some problems with her data computer, and ultimately, a lot of the raw data were lost. The paper was published. It's been fairly highly cited, but the results have not been fully replicated. We did some work looking at that sensor, not trying for the single-molecule detection, but looking at having a sensitive detector. As for the physics of single-molecule detection, we later showed that the model published in that original paper and thesis didn't work as described; it required some nonlinear optics to come into play to give the sensitivity needed. I was not interested in the cute result of detecting single molecules; I was much more interested in seeing if we could push it to where we could make it a practical sensor that would be cost-effective for atmospheric measurements. Ultimately, I concluded that while this was promising, it was people with a lot more experience on the biological and biochemical side than I have who really needed to be involved to carry it forward.

ZIERLER: Looking at your contributions to the 2010 California Research Report at the nexus of air quality and climate change, it raises the broader question, for you, when is it useful to combine air pollution and climate change as things to study in tandem, and when is it more useful to keep them separate?

FLAGAN: For understanding things like the atmospheric aerosol, the two are intimately linked. There sometimes is an overuse of the terms air pollution or pollutants in ways that confuse the issue of what one is dealing with. How to deal with the extreme exposures that occur in some urban settings, which have direct near-term health effects on individuals, is a different issue from how to deal with gases that are being released that will have an atmospheric lifetime of centuries to millennia. There's a question of time scales that differentiates the two in terms of their effects. The emission of CO2 is a huge factor in climate change. It's not directly a huge factor in local human health, though the climate effects may profoundly affect human health.

ZIERLER: Just by judging the amount of news coverage these topics get, we don't hear so much about acid rain and ozone depletion anymore, but for both of them, how much of this has been because these issues are largely mitigated, and how much of it is because it's media saturation, and we can't be bombarded with all of the world's problems all the time?

FLAGAN: Media saturation is clearly a big part. With respect to acid rain, I think it's likely that much of the problem has been addressed. Certainly, the very egregious examples that were being reported have been addressed. Early on, there was an adage, "The solution to pollution is dilution." One way to deal with the emissions from burning coal and the sulfur oxides that are emitted was to build tall smokestacks. When that was done extensively in Britain, the Nordic countries started seeing acidification of their lakes. That still continues, and it's continuing around the world. But a lot of the developed world has invested in technologies to reduce the sulfur oxide emissions that were the primary culprit. You still see it going on. In China, on the rise to the Himalayan Plateau, there's a national park that's very, very popular because of the beautiful, colorful waters, the very elegant-looking pools they used to have. But with the coal combustion in China, when I last visited, that was diminishing quite a bit. The pools were losing a lot of the color that had made them one of the most popular tourist attractions for the Chinese within China. Part of the reason I was visiting Chengdu at that time, along with Michael Hoffmann, was to talk about air pollution and water pollution and how the two were linked.

ZIERLER: Have you found generally that collaborating with Chinese scientists is easy? Are they able to operate freely? Specifically because the Chinese government wants to, literally sometimes, sweep these problems under the rug.

FLAGAN: In that case, I gave a short course to a bunch of Chinese students, helped introduce some concepts at the university. I did not establish any formal research collaborations. The place where I did have some collaborations was with the COVID-19 pandemic. I had visited Peking University in August of 2019, and one of my hosts there was a professor who was dealing very closely with biological aerosols. He had developed an interesting technique for looking at particles in exhaled breath. As the pandemic developed, I got an email from him asking if I would help him in trying to analyze some of his data, which was very interesting data looking at the transmission of COVID-19. The first dataset was from hospitals in Wuhan, which was the epicenter of the first outbreak of COVID-19. These were data obtained, if I remember right, in February and March of 2020, so right at the beginning of this severe pandemic.

They were looking at patients who had had the disease and had recovered and were about to be discharged, measuring the SARS-CoV-2 viral RNA in exhaled breath. They found that about 20% of them were emitting substantial amounts of viral RNA. They didn't have direct measurements of the live virus because the laboratory was not equipped to handle such a dangerous virus, which ended up costing over a year in getting the paper published. The second set was when they had the next wave in China, which was people who had been abroad from China, returning home, and bringing the disease back with them. That was data from Beijing. There, they found that the viral RNA in exhaled breath was higher among people who had just started to show symptoms. They didn't have pre-symptomatic people, but they had people sometimes with samples taken the same day they first showed any symptoms, and the concentrations were highest then and decayed over the course of the disease.

ZIERLER: Thinking about initiatives on campus, like the Caltech Nano rDNA Project and the Caltech Photooxidation Flow Tube Reactor, I wonder if one of the larger stories is that in recent history, Caltech has remained a center for cutting-edge instrumentation research in aerosol science.

FLAGAN: Helping to maintain that certainly is a big part of what I think my group has done.

ZIERLER: Who have been some of the key supporters of this work? Obviously, this requires significant funding.

FLAGAN: NSF has been the biggest supporter. The Office of Naval Research has supported some. But really, the primary support has come from NSF. There was an organization called the Coordinating Research Council that supported the work we did when we first demonstrated the scanning electrical mobility spectrometer. That was developed using their support. That was a foundation. But really, NSF has been a major supporter. EPA has supported some. A lot of it has come through ongoing projects whose focus is not instrumentation development. Getting grants to support instrumentation development is extremely difficult. Funding for what are called major equipment purchases, you can sometimes get. But to write a proposal to build a new instrument is hard, even though, for the instruments I've built, the real cost is not that outrageously high. When I write such proposals, I'll often get mixed reviews. For example, when we developed a new kind of classifier that's allowed us to go down comfortably to the one-nanometer size, I'd get reviews that would say, "This is fantastic. This is exactly what we need in the field."

Then, later on, the review would state, "But he's only going to get data in the third year. He'll spend two years developing the tool. What good is it? Don't fund." I've had proposals that have had very, very strong reviews, but one negative comment will kill it. The agencies really don't seem to support the development of the tools of science very willingly. They're after the discovery. They want to support the reason that you're developing the tool, but there seems to be a mentality that says academics buy instruments and use them to do science, while companies develop the instruments, which is not true in most cases. Most of the time, the companies are taking something that someone has developed on a fundamental research project, but usually as a bootleg part of it, something that was not specified in the up-front objectives. Maybe there are other parts of NSF that are more open to that, but the reviewers of my proposals have not seemed to take that perspective.

ZIERLER: As the predictive accuracy in meteorology seems to get better year over year, what credit can aerosol science and the kinds of things you and your colleagues do take for contributing to that accuracy?

FLAGAN: I have to say, I really don't know. I don't think that the cloud models that are in the weather predictions really build on that particular aspect of the science. It really comes more from better observations of the basic meteorological parameters, and improved computational tools and vastly improved computers that make those predictions possible. I could be wrong, I have not really looked at that link.

ZIERLER: What has been some of your recent work establishing connections between biomass burning and cloud formation?

FLAGAN: We've done a lot of work in which we have used our instruments aboard aircraft to probe clouds. One laboratory we use for doing that is the marine stratocumulus cloud deck that we often see off the California coast, particularly in the early summer. Often, we'll look at local perturbations of the cloud deck due to things like ships. Virtually every year when we've gone to make such measurements, our aircraft has been based in Marina, right next to Monterey, where the Naval Postgraduate School is, so we're flying out over Monterey Bay, going out and probing the cloud deck over the Pacific. And virtually every year that we do that, there will be incidents of smoke plumes extending out off the coast. We have probed many of those in the process of looking at the cloud perturbations that result from them. You get very interesting contrasts between the unperturbed and the perturbed.

ZIERLER: Bringing our conversation ever closer to the present, what have been some of the interesting developments recently in the subfield of aerosol nucleation?

FLAGAN: The key recent developments have come out of this CLOUD experiment at CERN, which I've mentioned several times. In that experiment, we have an extremely clean chamber. The chamber is a 26-cubic-meter, electropolished stainless steel cylinder. It's manufactured to the cleanliness standards of the best that you can find in pharmaceutical or microelectronics manufacturing. It's the cleanest large vessel that one can conveniently make today. That reactor is temperature-controlled over a very wide range. We can go from a little bit above room temperature down to temperatures of -50, even -75 degrees Celsius. Typically, we don't go much over 25 degrees Celsius, but we could go to higher temperatures, above room temperature, as well. The air that goes into that chamber is made from liquid nitrogen and liquid oxygen, mixed to produce a synthetic air. The liquefaction of the nitrogen and oxygen removes most of the contaminants, so it's the cleanest air that one can get. With that chamber, we can actually run experiments that are clean enough to get down to the molecular level in looking at new particle formation. You take that beautiful chamber, and it's surrounded by instruments. In a typical experiment, and there's one starting up next week, which I will not be participating in this time due to the uncertainties, we will have many high-resolution mass spectrometers around the chamber to probe different classes of molecules.

They'll use different kinds of atmospheric-pressure chemical ionization to probe different kinds of molecules in the system. They'll do it with a resolution sufficient to identify all the atoms that are in the molecules. They don't get molecular structure, but they get the elemental composition of the individual ions that are detected. With that, we can begin to watch the formation of molecular clusters. In our first experiments, a little over a decade ago, we looked at the role of sulfuric acid in the nucleation. We would see the sulfuric acid monomer, the dimer, the trimer, the tetramer. Beyond that, we'd start seeing other things in our reaction mix contributing to the growth. The sulfuric acid is not abundant enough to grow particles out to large size, but it's the key to nucleating the new particles. Then, we see highly oxidized organics. In a typical experiment, we might see a wide range of individual molecules, but with something on the order of ten carbons and ten oxygens, plus hydrogens, as the monomer of the organic that's adding to the cluster. Then, we see a dimer, a trimer, and we see those continuing to grow out to large size. And by large size, with mass spectrometry, one can get up to a size approaching two nanometers. That's primarily a European experiment, but there are now three American groups that are actively involved in it. We're involved; more recently, Carnegie Mellon has gotten involved, and they bring some very good mass spectrometry to the table; then there's a group at the University of Colorado, Boulder. But most of the groups involved are European.

We bring our mobility measurements, which take us down to one-nanometer size. The mobility measurements give us better resolution of the size than the techniques they had before we got involved, which were based upon that condensational activation, though we bring less sensitivity to the composition as we get the size distribution. What we can now follow is how the particles grow from molecular clusters out to large, thermodynamically stable particles. We can see that the growth occurs first by extremely low-volatility oxidized hydrocarbons depositing and growing the particles. That reduces the surface free-energy barrier for slightly more volatile molecules to condense, then slightly more, and slightly more. We can now watch the clusters march through composition space as the particles nucleate. The classical theory of nucleation looked at a single kind or a few kinds of molecules contributing to the growth, but what we're forming under these atmospheric conditions is sort of a soup with a lot of different molecules that contribute.
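The surface free-energy barrier he describes is the Kelvin effect: the smaller the cluster, the higher the saturation ratio a vapor needs before it can condense on it. A back-of-the-envelope sketch, assuming generic placeholder values for surface tension and molecular volume, illustrates why the first growth steps demand extremely low-volatility vapors.

```python
# Back-of-the-envelope Kelvin effect: the saturation ratio S a condensing
# vapor must exceed to grow a particle of diameter d is
#   S = exp(4 * sigma * v_m / (k_B * T * d))
# where sigma is surface tension and v_m the molecular volume of the
# condensate. The property values below are generic placeholders, chosen
# only for illustration.
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
sigma = 0.03         # surface tension, N/m (illustrative organic value)
v_m = 3.0e-28        # molecular volume, m^3 (roughly a 200 g/mol organic)
T = 278.0            # temperature, K

for d_nm in (2, 5, 10, 50):
    d = d_nm * 1e-9
    S = math.exp(4 * sigma * v_m / (K_B * T * d))
    print(f"d = {d_nm:3d} nm  ->  required saturation ratio S = {S:7.2f}")
```

With these placeholder values, a 2 nm cluster requires a saturation ratio near 100, while a 50 nm particle needs only about 1.2, which is why successively more volatile species can join in as the particle grows.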

In one very recent work, which earned a student from my lab and a student from that CMU lab lead authorship on a very nice Nature paper, we found a mechanism that explains how nucleation can occur in the highly polluted atmosphere, in places like Beijing or New Delhi. That was the initial formation of tiny clusters by sulfuric acid, some growth by oxidized organics, but then, under the right conditions, you could have nitric acid and ammonia come together and lead to really, really rapid growth, explosive growth of the very small clusters. They grow so fast that species you wouldn't normally expect to find in those small particles are really playing a key role.
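One way to caricature that mechanism: ammonium nitrate can condense once the product of the nitric acid and ammonia vapor concentrations exceeds its dissociation equilibrium, and the Kelvin effect then sets an activation diameter above which the explosive growth takes off. The sketch below is hedged accordingly; the surface tension and molecular volume are placeholder values, not the parameterization used in the paper.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def activation_diameter(S, sigma=0.06, v_m=8.0e-29, T=278.0):
    """Kelvin activation diameter for a condensing salt at saturation
    ratio S: clusters larger than this can grow rapidly, smaller ones
    cannot. sigma (N/m) and v_m (m^3 per NH4NO3 formula unit) are
    illustrative placeholders, not fitted constants."""
    return 4 * sigma * v_m / (K_B * T * math.log(S))

# If [HNO3]*[NH3] exceeds the dissociation product fivefold (S = 5),
# only clusters above roughly this diameter take off:
print(activation_diameter(5.0))  # ~3e-9 m, i.e., a few nanometers
```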

ZIERLER: To bring our conversation right up to the present, and to circle back to our initial conversation, where we spent so much time talking about COVID, looking forward, unfortunately, to the next pandemic, because there will be a next pandemic, do you think the contributions of experts in aerosol science who understand fluid dynamics to understanding COVID-19 mitigation and transmission have been sufficiently formalized so that when this happens again, people like you and your colleagues will be part of the equation at the beginning?

FLAGAN: I would hope so, but I think there's more we have to do. There are some aerosol scientists who have really played a key role in being out in front, speaking to the press all the time. I've done a fair amount, but there are others who have done many, many times more. The medical profession, the people who are looking for the signs of new diseases occurring, is where we start thinking of how infectious diseases propagate. There were some standard assumptions in the field of disease propagation that were used at the beginning of this pandemic that were wrong. Once they started to accept that coughs and sneezes might play a role, that became the entire issue. All the protocols said, "Don't wear N95 masks, wear cloth masks." The real message was, "Don't go out and buy N95 masks because we don't have enough to protect the people who are going to get directly exposed and who we need to be able to do their magic to save people's lives.

"We need the first responders, the medical professionals, to have the protection." That was part of the message in saying, "Don't go out and buy the best masks." First, we should have had preparation. We should have had more masks stockpiled. We had had masks and respirators stockpiled, but those stockpiles were dismantled and thrown away. The masks were old but had not totally lost their effectiveness; they were not maintained because maintaining them takes warehouse space and money. Assuming, when a new disease comes in, that we've identified what the cause isn't, declaring that aerosols were not part of the equation when nobody yet knew the cause or the mechanism of transmission, is the wrong message. Nothing should be taken off the table until you know what the cause is. We don't know that the next one will be a respirable pandemic. The chance that something spreads as wildly as this one by water, in the developed world, is probably a whole lot smaller than the probability of respirable transmission, just because the vast majority of the water we get is directly treated. That's something you can do.

We don't have a comparable way to clean the air. And we don't know what the next mode of transmission will be. Could we be seeing the propagation of antibiotic-resistant pathogens through groundwater that might get out into the air, that might come with our food? Could we see something totally different? We don't know. I think the first thing is, we need to make certain that the people who are thinking about these diseases, about epidemics and worse, keep their minds open, and that we have a viable body of people who are looking at the diseases that are out there in the world today. When it comes to respirable transmission, I think we need a much deeper look into how the disease is transmitted. I don't think that's going to come easily out of epidemiology. Just looking at the statistics is going to tell us what the end result is; I'm not sure it's going to give us the mechanism. When I read papers describing epidemiological effects, I'm often left saying, "Yes, you've shown there's an association, but association does not mean causation." I think we need to get more physics into medicine.

On the question of airborne transmission, a lesson I take from this is: look where the disease has its effects. If you have something that's having its largest effects in the deep lung, why look at the nose and throat as the primary source? Mechanisms exist for particles to be generated in the deep lung. We really ought to be looking at those. But we need to convey that story in a way that communicates it to the people who are on the front lines, who are dealing with the disease, the people who are looking at the statistics. We need to somehow get the message out that there are other sciences that can come into play and tell us a lot about what's going on. We've had a bunch of pandemics in the last few decades. This is by far the worst.

ZIERLER: But not the last.

FLAGAN: It's definitely not the last, and the next is probably not that far ahead.

ZIERLER: On that somber note, now that we've worked up to the present, I'd like to ask one broadly retrospective question to bring it all together, then we'll end looking to the future. If I can, I'll ask you to survey your career, all of your collaborations, all of your research endeavors. Where have you had the most satisfaction in your contributions to fundamental research, to just understanding how nature works, and where have you had the most satisfaction in contributing to applications, to actually improving human health and the environment?

FLAGAN: In the fundamental work, the places where I think I've made the biggest contributions have been in looking at problems and trying to find ways to get the information that's missing. In a lot of cases, that's led to thinking about how we make the measurements, how to make new instruments, how to get at the parameters we have not been able to probe before. Sometimes, it's looking at it more theoretically. How do we analyze the data? Can we develop good first-principles models that give us better understanding of what's going on? It's been a combination of both. On health effects, I've been more on the measurement side; the tools we've developed have helped to get better data on what's out in the air. I haven't developed new models of lung deposition. I can see that they are seriously needed. I have ideas on that. It will probably be my students who carry out some of those ideas. I'm constantly trying to plant seeds for things that people may want to think about in the future as the tools become practical.

In terms of the applications, there are the efforts to understand air pollution, to understand the uncertainties in the energy balance for climate-change considerations, and the laboratory studies we've done looking at yields of secondary organic aerosol from many, many different compounds, from mixtures, from simple and complex systems. Each of those is a detailed piece of a really complex puzzle. But that quantification of what's going on in the atmosphere from laboratory experiments has gotten integrated into the models that are used in designing strategies for controlling air quality. I look outside, and I can see the mountains right now. Not very well, there are trees blocking them. [Laugh]

ZIERLER: That's not the problem. [Laugh]

FLAGAN: There was a time when, from here, we'd go months without seeing that there were mountains there at all. I might've seen the lights from the radio towers up on Mount Wilson, but you wouldn't see the mountains during the daytime at all. I think my work in concert with John Seinfeld has played an important role in cleaning that up. That's something I'm happy with.
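The laboratory SOA yields he describes above are commonly condensed into the Odum two-product partitioning model before being integrated into air-quality models. Below is a minimal sketch; the alpha and K parameters are illustrative placeholders, not fits for any particular precursor.

```python
# Odum two-product model for secondary organic aerosol yield:
#   Y = M_o * sum_i( alpha_i * K_i / (1 + K_i * M_o) )
# where M_o is the absorbing organic aerosol mass concentration (ug/m^3),
# alpha_i are stoichiometric mass yields, and K_i are gas-particle
# partitioning coefficients (m^3/ug). Parameter values are placeholders.
def soa_yield(M_o, products=((0.04, 0.05), (0.25, 0.002))):
    return M_o * sum(a * K / (1 + K * M_o) for a, K in products)

for M_o in (1, 5, 10, 20, 50, 100):
    print(f"M_o = {M_o:4.0f} ug/m^3  ->  yield Y = {soa_yield(M_o):.3f}")
```

The characteristic behavior, yield increasing with the amount of absorbing aerosol already present, is what made such fits useful for carrying chamber data into regional models.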

ZIERLER: Finally, looking to the future, a fun question, then one specific to you. As you well know, as our telescopes and instrumentation get better, there's more and more excitement that we might be able to detect either bio-signatures or techno-signatures on an exoplanet that might finally resolve the question of whether we're alone in the universe. For you, or someone like you with your areas of expertise in atmospheric science, how might you contribute to answering that all-important question if we happen upon something pretty interesting?

FLAGAN: I had a meeting earlier today dealing with aspects of that. Planets that have life as we know it very likely have things like clouds. It might not be the same chemistry. Life as we know it doesn't necessarily mean carbon-based life as we're looking at it. But to understand other worlds, you will ultimately have to be able to see the nature of the atmosphere in which life might develop. I think the lessons we've learned from increasingly detailed studies of the Earth's atmosphere can be used to look at other worlds. We've gone a long way in looking at Mars. Digging into the surface is the key to looking for life there. Over the coming decade, there will be missions to Venus. While the surface is unlikely to have life (certainly nothing like what we know could exist at the surface; temperatures are way too hot, and pressures are way too high), there's a cloud layer in the atmosphere of Venus. It's at an altitude of 50 to 60 kilometers, where the pressures range from an atmosphere or a little more down to a tenth of an atmosphere.

The clouds aren't water. They have some water in them, but they're primarily sulfuric acid. Life there would not look like anything we have. It would probably be microbial in nature. But that is one area where there's a lot of interest in looking for it, especially since phosphine has been reported there. That's attracted a lot of attention. Other worlds could also have signs. That's the big issue in planetary science. I'm not a planetary scientist and don't have the detailed knowledge there, but I can see that the people who are looking at those other worlds and trying to understand them are increasingly asking deeper and deeper questions about the chemistry of the atmosphere: not just what it's made from, but what's going on chemically, the reactions in the atmosphere, and also the aerosols, the hazes, and the clouds. I think there will be a role for aerosol science in that.

ZIERLER: Finally, for you, for however long you want to remain active in the research, what's most important for you to accomplish that you haven't yet?

FLAGAN: I can think of a bunch of different directions, but one issue I keep coming back to, having lived through the smog in Los Angeles and having seen the health effects occurring around the world, is air pollution. By World Health Organization numbers, something like several million people die annually from air pollution, and particulate pollution is a key part of that. To understand that, to motivate continued efforts to clean it up, and to document that it is actually being cleaned up, one thing that's really needed is networks of sensors that provide scientifically valid data on air quality. And it needs to be done on a global scale. Every continent except probably Antarctica has air pollution problems, although I would not be surprised if Antarctica had indoor air pollution problems. We need sensors that are cost-effective, cheap enough that they can be deployed in large numbers, and we need ways to gather the data from them, to analyze it, and to use it to look at what people are being exposed to on a very broad scale. I think some of the tools I've developed have the potential to play a key role in that.

They can't play that role with current manufacturing technology, at least not with the mindset of the companies selling scientific instruments today. You've got to think about making the instruments more the way you make a toy. It's got to be cheap, but it's got to give scientifically valid data, not just data that the lawyers and regulators decide is important. For particulate matter, PM2.5 is a number that's good for lawyers, not good for science. I think there are ways to make measurements that give high-quality data that will inform people about what their exposures are. Then, if you had that exposure data, you could give it to the epidemiologists and say, "We can give you PM2.5 numbers, but we can also give you numbers that will allow you to determine what the dose to different regions of the lungs will be, and you can start thinking about the real processes that are causing the health effects." Can we develop and propagate the scientific tools to a level where we can really, on this grand scale, begin to address longstanding health questions?
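The PM2.5-versus-dose distinction he draws can be made concrete: given a size-resolved measurement, weighting each size bin by a regional deposition fraction estimates what actually lands in, say, the alveolar region. A rough sketch follows, in which the deposition curve and size bins are invented placeholders standing in for a real lung-deposition model such as the ICRP framework.

```python
# Sketch of the PM2.5-vs-dose distinction: weight a measured mass size
# distribution by a regional deposition fraction. The deposition curve is
# a crude placeholder, not a validated lung model, and the bins are made up.
import math

def dep_fraction_alveolar(d_um):
    # Toy curve: deposition is efficient for ultrafine particles (diffusion)
    # and for few-micron particles (sedimentation/impaction), with a minimum
    # near 0.3-0.5 um. Placeholder shape only.
    return 0.3 * math.exp(-((math.log10(d_um) + 1.3) / 0.6) ** 2) + \
           0.2 * math.exp(-((math.log10(d_um) - 0.5) / 0.6) ** 2)

# Hypothetical measured distribution: (diameter um, mass ug/m^3 in bin)
bins = [(0.05, 1.0), (0.2, 4.0), (0.5, 6.0), (1.0, 3.0), (2.5, 2.0)]

pm25 = sum(m for d, m in bins if d <= 2.5)
alveolar = sum(m * dep_fraction_alveolar(d) for d, m in bins)
print(f"PM2.5 mass concentration: {pm25:.1f} ug/m^3")
print(f"Alveolar deposition-weighted mass: {alveolar:.2f} ug/m^3")
```

Two air masses with identical PM2.5 can differ substantially in such a deposition-weighted number, which is the point of giving epidemiologists dose rather than a regulatory mass cutoff.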

ZIERLER: On that note, it's been a great pleasure spending this time with you. I'm so happy we were able to do this and capture your recollections and perspective over the course of your career. I'd like to thank you so much.

FLAGAN: You're very welcome.

[END]