Section Head, Deep Earth Processes Section, Earth Sciences Division, National Science Foundation (Ret.)
By David Zierler, Director of the Caltech Heritage Project
July 27, 2022
DAVID ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It is Wednesday, July 27, 2022. I'm very happy to be here with James Whitcomb. Jim, it's very nice to be with you. Thank you so much for joining me.
JAMES WHITCOMB: Thank you.
ZIERLER: To start, would you please tell me your current or most recent title and institutional affiliation?
WHITCOMB: My last title was Section Head in the Division of Earth Sciences at the National Science Foundation.
ZIERLER: How long had you been at NSF? When did you start?
WHITCOMB: Started in 1989, and I retired in 2017.
ZIERLER: Was that your first job out of school?
WHITCOMB: Depends on which school you're talking about. [Laugh] If you're talking about when I graduated from Caltech, I stayed on at Caltech at the Seismo Lab on the research faculty. I graduated in '73, then I was on the research faculty until 1980, at which point I went to work at the University of Colorado at an institute called CIRES, the Cooperative Institute for Research in Environmental Sciences. I was there for about five years, then I came back to Pasadena to join a friend of mine that I'd worked with at Caltech and JPL. He had started a company called ISTAC, and the goal of that company was to utilize the new GPS system, the satellites the military had been putting up. One of the things about them was that many of the generals wanted to deny civilians and people outside the US military any access to them. This company basically had a way to circumvent that and, in a non-threatening way, utilize them for measuring high-precision positions on the ground. Of course, one of my goals was to be able to use precise positioning to measure strain that ultimately would likely be relieved as earthquakes. But ISTAC was a startup, and the Challenger disaster happened, so we didn't have enough satellites up, and the company ran out of money. At that point, back in '89, I came to NSF.
ZIERLER: Tell me about the circumstances of you transferring over to the NSF. What was the opportunity, and what was interesting to you about that new work?
WHITCOMB: Basically, NSF is the government organization that funds academic research. It's an attractive job in that it gives you an overview of research in the whole world, really, but mostly at institutions in America. You can have a large influence on where the science goes. While I was a grad student at Caltech, I looked at many different techniques for earthquake signals: seismology, electrical resistivity in the crust, strain (which we were not able to measure at the time), and many others. Coming to NSF gave me an opportunity to carry on with this multidirectional approach and support those kinds of research. For example, some of the things I was involved in were the development, building, and funding of the Southern California Earthquake Center. I basically started a project called EarthScope, which utilized GPS techniques, seismology, satellite radar or InSAR, and other techniques to cover the whole United States and Alaska with instrumentation. That was a very successful project. It gives you an opportunity to have a big influence on science.
ZIERLER: When you joined the NSF, was SCEC already operational?
WHITCOMB: No, in 1990, there was an organizational meeting for doing something about Southern California earthquakes, because basically, the USGS was mostly concentrated in Northern California, and there was a lot of pressure, both scientific and otherwise, to pay more attention to Southern California. And I attended that organizational meeting. I was only about a year into being at NSF. I attended that meeting, participated in it, then we backed away as NSF representatives when it became clear that NSF would be approached for funding. They wrote a proposal to a special program that NSF called Science and Technology Centers, and I became the program officer in charge of the review of SCEC. I put together a panel to review it, and it was successful. Then, I was the program officer in charge of that center. That's how SCEC got started.
ZIERLER: What were the overall missions for SCEC? How regional was it conceptualized? In what ways would you be able to extrapolate what would be learned from SCEC in Southern California across the United States or even globally?
WHITCOMB: Almost from the beginning, SCEC became an attractive place to study earthquake science for people all over the world. Its initial focus was on Southern California, mainly because if you look at different ways of computing hazard, Southern California probably has the greatest earthquake hazard in the United States. There was a need there. But the basic premise behind SCEC to start out was to try to pull in as many different kinds of data being gathered by different people at different universities and broaden the tools and people working on earthquake hazard and prediction. Basically, the US Geological Survey and about six or seven universities started out as the members of SCEC, but now it's got something like 1,000 scientists from 100 institutions participating in one way or another. It's really using Southern California as a laboratory, but the applications of the research are worldwide.
ZIERLER: Tell me specifically the ways that NSF supported SCEC. What was it doing to actually get it off the ground?
WHITCOMB: As I said, there's a special program called the Science and Technology Center program, so there was an opportunity–it's always difficult when you have something of in-between size, bigger than a normal grant but not huge. It's really hard to fund things that size, so you look for special programs to do so. This special program was the most attractive one at NSF. Also, the USGS wanted to bring more attention to Southern California, wanted to participate, and was willing to help fund it. It was a joint funding agreement between NSF and the USGS, and it was a competition with other Center proposals. They competed against something on the order of 10 other proposals. The competition was very tough, but SCEC won out; it was one of three proposals funded that year.
ZIERLER: You were new to NSF, but I wonder if you had an appreciation for the history of NSF support for seismology before SCEC. Was your sense this was really the first big, funded project, or did the NSF have a longer track record in this field?
WHITCOMB: Seismology is one of the sciences that was always funded by NSF, but SCEC was probably the first big effort that was funded by the National Science Foundation. Before that, the big arrays were funded by the Air Force, NOAA, and USGS, so government arrays were basically the main effort. This included the Global Seismic Network that was funded by NOAA, which ultimately became a USGS network.
ZIERLER: What was your interface with SCEC on a research level? How closely involved were you with some of the research happening at SCEC?
WHITCOMB: Being at NSF, we don't do the research, so my role was to monitor and review the research on a regular basis. We had annual meetings, reports I reviewed, and then periodically we put together review panels from the community to see how well they were doing and whether we were getting the best science for the money.
ZIERLER: What were some of the objectives in the proposal that resonated, both from the fundamental science perspective and the applications perspective, the things that would make earthquakes less hazardous or mitigate the damage?
WHITCOMB: That's changed a lot over the years because SCEC has been around a long time. There have been big advancements in an integrated approach, for example, EarthScope. Many of the lessons we've learned in SCEC, I think of as being an application to larger things like EarthScope. The first thing SCEC did was bring together the broad community to enable community models, and that was really the first time, I think, that happened. There were several different people from different fields, and they developed velocity models, earthquake fault models, and many things like that, which–it's not that people weren't able to do it, it just never seemed to happen and come together prior to SCEC. Then, the new technology of GPS came along that enabled us to actually monitor, for the first time, the motions the plates were undergoing, and thereby, if we saw the motions, we could calculate the strain. Any strain that happens in the crust is a direct measure of the potential for earthquakes. That's the first time we were able to do that. And SCEC was one of the first places to, in cooperation with NASA and the USGS, enable a large array of GPS sites to be implemented. But that was just in Southern California. This led the way, I think, to the broader efforts that followed.
ZIERLER: Around the late 80s, early 90s, what were people saying with regard to earthquake prediction? Was there optimism that SCEC might get the field at least closer to earthquake prediction?
WHITCOMB: Earthquake prediction has always been a difficult thing. There are some people who theorize that short-term earthquake prediction is impossible because of the complexity of the physics of fracturing materials. It's never been that clear. But one area where science in general, including SCEC, has been very successful is long-term earthquake prediction. For example, as I said, when you measure the strain in the crust of the earth using GPS, it's a direct measure of the potential for earthquakes. We've really made great advances in long-term earthquake prediction that will save lives and dollars, which has been a big advancement since the birth of SCEC.
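The chain Whitcomb describes (GPS velocities, then strain, then earthquake potential) can be sketched numerically. This is a minimal illustration with made-up station positions and velocities, not SCEC's actual processing: fit a uniform velocity-gradient tensor to the station velocities by least squares, and take its symmetric part as the strain rate.

```python
import numpy as np

# Hypothetical GPS station positions (km east, north) and horizontal
# velocities (mm/yr east, north) -- illustrative numbers, not real data.
positions = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
velocities = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 0.0], [2.0, 0.0]])

# Fit a uniform velocity-gradient tensor L (v = v0 + L @ x) by least squares.
n = len(positions)
G = np.zeros((2 * n, 6))            # unknowns: v0x, v0y, Lxx, Lxy, Lyx, Lyy
for i, (x, y) in enumerate(positions):
    G[2 * i] = [1, 0, x, y, 0, 0]
    G[2 * i + 1] = [0, 1, 0, 0, x, y]
m, *_ = np.linalg.lstsq(G, velocities.ravel(), rcond=None)
L = m[2:].reshape(2, 2)

# The symmetric part of L is the strain-rate tensor. With these units
# (mm/yr per km), 1.0 corresponds to 1e-6 strain per year.
strain_rate = 0.5 * (L + L.T)
print(strain_rate)
```

Here the eastern stations move 2 mm/yr relative to the western ones over 50 km, so the fit recovers a pure east-west extension rate of 0.04 mm/yr per km.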
ZIERLER: What about earthquake early warning? How important was that for SCEC's early development?
WHITCOMB: That was really not a high priority. In fact, I don't even remember it being seriously discussed as a practical effort at the time. There was a lot of discussion of how valuable earthquake early warning would be. It depends on the size of the earthquake because you don't really get a very long warning, and it wasn't clear at the time how useful that would be. Now, people have decided that it is worthwhile to do, so there are earthquake early warning systems being developed and in place. We'll see how useful they'll be when the next big earthquake happens.
ZIERLER: Was there a component of SCEC focused on earthquake engineering, just thinking about buildings and mitigating property loss?
WHITCOMB: Oh, yes, earthquake engineering was a big part of early SCEC. The first director of SCEC, Keiiti Aki, was thought to be half in earthquake engineering and half in seismology. There was a lot of effort along those lines.
ZIERLER: Given that SCEC is made up of scholars at many regional universities and beyond, were you involved in the discussions about SCEC's siting?
WHITCOMB: That was decided early on in the organizational meeting. The choice of the first director was put to a vote by everybody there, except for the NSF people, of course. The location where SCEC would be housed was determined partly by who the director would be, and partly by the fact that USC was very generous in waiving much of the overhead that would be charged on the contracts funding other universities through SCEC; that was very attractive.
ZIERLER: In what ways did SCEC encourage collaboration between Caltech, USC, UCLA that might not have existed otherwise?
WHITCOMB: I think it was huge among all the universities that were involved. There has always been sort of friendly scientific competition among universities, and really, SCEC broke down many of those barriers of competition between different universities within Southern California, and between Southern California and Northern California, so there was a lot more sense of cooperation. The individual grants that came through SCEC were not a lot of money. What you were getting from participation in SCEC was the joint effort of working with other people, and more leverage in terms of the science and data involved because more people were in the game.
ZIERLER: For those first 10 years from roughly '90 to 2000, was SCEC sort of the major portion of your portfolio, or did you have other major projects at that time?
WHITCOMB: No, I had other duties. I was program officer in geophysics, so I ran the geophysics competition along with another person. We ran competitions twice a year, people would send in proposals to be funded, and we'd send those out for review, then we'd convene panels to look at the reviews and make decisions in terms of funding priority.
ZIERLER: During this time period, given your vantage point, what sticks out in your memory as some of the really interesting research that was deserving of funding?
WHITCOMB: To me, some of the most exciting things are the technology. The high-precision positioning of points on the ground with GPS was revolutionary. Then, some of the things that came out of the seismology arrays, both in SCEC and elsewhere: because we had so much spatial data coverage, we were actually able to use seismic noise to invert for the structure of the crust and upper mantle, without needing artificial sources to do it. That was exciting. There are so many things. It's hard to pin down one. [Laugh] But we certainly have a lot more knowledge about the structure of the crust, how it's deforming, and earthquake hazard. We're still turning up some real mysteries, like earthquakes in the Central United States, for example. We still don't understand those very well. There's still a lot more to do.
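The seismic-noise idea mentioned here can be illustrated with synthetic data. This is a toy sketch, not any real processing pipeline: two hypothetical stations record the same diffuse noise field with a known offset, and cross-correlating the records recovers the inter-station travel time, which is the core trick behind ambient-noise imaging.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                          # sampling rate, Hz
noise = rng.standard_normal(20000)  # a long stretch of "ambient noise"

# Station B records the same noise field as station A, delayed by
# 50 samples (0.5 s of travel time A -> B), plus local noise.
delay = 50
rec_a = noise[delay:]
rec_b = noise[:-delay] + 0.2 * rng.standard_normal(len(noise) - delay)

# The lag of the cross-correlation peak recovers the travel time,
# with no artificial source needed.
xc = np.correlate(rec_b, rec_a, mode="full")
lag = int(np.argmax(xc)) - (len(rec_a) - 1)
travel_time = lag / fs
print(travel_time)
```

With long enough records, the correlation peak stands far above the background, which is why stacking months of real noise between station pairs yields usable travel times.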
ZIERLER: In what ways, if at all, does NSF partner with industry in its interests in geology, geophysics, and seismology?
WHITCOMB: Industry can write proposals and be funded at NSF, and we do see proposals that are successful. But basically, I think industry leads the way in many respects. We've been riding the coattails of industry on computers and technology for many years, and that has enabled us to make huge advances we never had to pay for. The industrial competition from the internet and things like that has really helped drive the science. There was also a lot of cooperation with industry in the part of EarthScope that actually drilled down into the San Andreas Fault. We utilized the cutting edge of industrial drilling developments, which enabled us to do directional drilling and use many of the down-hole instruments developed by the oil industry. Basically, we're taking advantage of industry but not funding a lot of those developments ourselves. Although, you can think of a time well before mine when science funded by NSF played a key role in shale oil recovery, for example. That was a scientific breakthrough that basically enabled the US to become independent in terms of its own oil resources.
ZIERLER: You mentioned it in passing, but to discuss in more detail, how did EarthScope begin?
WHITCOMB: The ideas originated way before EarthScope itself started. There was a period of about 10 years when there were ideas and proposals floating around for drilling into the San Andreas Fault, putting a seismic array moving across America, having a GPS array installed in America, and developing InSAR for geoscience. These were all proposals that were more than 10 years old before EarthScope started.
The beginning of EarthScope is somewhat related to SCEC. There was a policy of the Science and Technology Center program that once centers graduated after 11 years, NSF funding would be cut. SCEC was worried about that, and rightfully so. Therefore, I set up a meeting with NSF and SCEC management to discuss the future of SCEC. At the same time, I worked with upper NSF management to develop a cooperation between the USGS and NSF whereby we could leave the Science and Technology Center umbrella and continue SCEC under separate funding. SCEC was so important in terms of the earthquake hazard in Southern California that I and others felt it should be continued. This separate funding plan was successful.
Although the continuation of SCEC upon graduation from the Centers program was resolved, everyone agreed that the meeting with NSF upper managers should be maintained with an expanded agenda. Basically, we kept the meeting because it was with Bob Corell, who was then the Assistant Director for Geosciences, and our new Division Director in Earth Sciences, Herman Zimmerman. It was a good opportunity to bring all of these unfunded proposals that had been floating around for 10 years to their attention. We had a large group of scientists come in and make presentations, along with two-page proposals that had been solicited informally. NSF had a special account for very large projects that was at the time called Major Research Equipment (MRE; now called MREFC). But the four projects discussed were all intermediate in size, bigger than normal grants but not big enough to be MRE. They were sort of in the valley of death, as I call it: money that was hard to get, because it was assumed that if a Directorate really wanted to do a project of that size, it could do so within its current budget, yet the projects were not big enough to qualify as MRE.
The presentations were made, and upper management was excited, but there was still no way to fund these individually. I sat down with the division director and a couple of program officers, and it was just a matter of expanding the vision. What I've done my whole career since Caltech was to integrate multiple-technique measurements of crustal dynamics and structure, so that was the idea here, to combine all of these things. You had GPS measuring the dynamics and seismic measuring the structure. There was also drilling in the San Andreas Fault. If you were going to look at the dynamics of North America, you needed to see what was happening in the Fault. Drilling into the Fault to see the physics was important. Also, NASA had this incredible technique called InSAR, interferometric synthetic aperture radar, which had been around for a long time. But the first time they used it, they saw these V shapes in the ocean surface, and it turned out these were actually the ripples coming from submarines underwater, which were not normally seen. NASA had this great technique, but there was, I think, a lot of reluctance from the DOD to allow this into civilian hands. We were trying to figure out how to propose InSAR in a safe way acceptable to the DOD, so that was the fourth part of this presentation.
We combined these four things into a single package that basically had the vision of combined techniques to look at the dynamics and structure of the North American continent. The combined vision was an appropriate size for a Major Research Equipment proposal. I coined the name EarthScope, which everybody liked.
An MRE proposal is unusual in that it is submitted from inside NSF. I wrote the proposal with input from the community, organized the community, made the presentations to upper management and to the National Science Board, and was successful in getting EarthScope as a top priority in the NSF budget proposal to Congress. This was about a two-year process. That's how it got started.
ZIERLER: From its inception, what was the envisioned timeframe of EarthScope? Was it envisioned to be a bounded, discrete project, or would it be ongoing indefinitely?
WHITCOMB: There are two answers to that. NSF wanted these things to be bounded. They didn't want them to go on indefinitely because they wanted to use the money for something else eventually. It was decided it would be appropriate as a 15-year project. The seismic array would move across the United States and Alaska over the 15-year period. The GPS instrumentation would be installed in a semi-permanent way, but we'd have to figure out ways to support that ultimately, after 15 years. The InSAR never got off the ground until just recently because either NASA or DOD stopped it. Now, NASA has a project that will be InSAR for earth science called NISAR. It is a cooperation with India, who will jointly support it, and is scheduled for a 2023 launch date. The initial idea for EarthScope was that it would take 15 years to basically complete the plan, and that 15 years would be a natural breaking or pausing point.
What has happened, and the 15 years was up fairly recently, was that much of the permanent installations were either taken over by the USGS or were continuing to be supported by the division of earth sciences, either through a university consortium called UNAVCO or through IRIS. IRIS is seismic, and UNAVCO is GPS and InSAR. Relatively recently, the NSF basically told IRIS and UNAVCO that they should combine forces, which might be more efficient for management reasons, and maybe scientific reasons, it's not clear. IRIS and UNAVCO decided for their next proposal that they would combine, and they chose the name EarthScope for the name of the combined entity. EarthScope will continue basically in name and somewhat in function. That will happen, I think, in the next year or two.
ZIERLER: Given how broadly conceived EarthScope was in terms of all of the fields, geodesy, hydrology, vulcanology, what do you see as some of its legacies in promoting cross-pollination across these fields?
WHITCOMB: It's certainly a huge umbrella. You tend to get these serendipitous discoveries that cross the fields. I'll just name one example I know of, which is the GPS systems. They, of course, receive signals coming down from satellites. But when the satellite gets down low on the horizon, you get multiple signals from reflections off the ground, and those reflected signals are a function of moisture or snow on the ground, so a scientist at the University of Colorado used the reflected signals. They inverted the reflection data to calculate the amount of moisture in the local soil or the depth of snow on the ground. These GPS stations are now measuring not only the position of the station, but what the hydrology is locally in that area, which is incredible.
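The reflected-signal trick described here is often called GNSS interferometric reflectometry, and the inversion idea can be sketched with synthetic numbers. All values below are hypothetical; the point is only that the interference between the direct and ground-reflected signals makes the SNR oscillate versus sin(elevation) at a rate set by the height of the reflecting surface.

```python
import numpy as np

# GPS L1 wavelength and a hypothetical antenna height above the
# reflecting ground surface -- illustrative values, not a real station.
lam = 0.1903   # metres
h_true = 1.5   # metres

# Direct and reflected signals interfere, so SNR oscillates versus
# sin(elevation) with frequency 2*h/lam cycles per unit sin(e).
sin_e = np.linspace(np.sin(np.radians(5)), np.sin(np.radians(25)), 512)
snr = np.cos(4 * np.pi * h_true / lam * sin_e)

# Search trial heights for the strongest spectral component; the best-fit
# h estimates the reflector height, and changes in it over time track
# snow depth or soil moisture under the antenna.
trial_h = np.arange(0.5, 3.0, 0.01)
power = [np.abs(np.sum(snr * np.exp(-2j * np.pi * (2 * h / lam) * sin_e)))
         for h in trial_h]
h_est = trial_h[int(np.argmax(power))]
print(h_est)
```

Fresh snow raises the reflecting surface, shortening the apparent antenna-to-reflector height, which is how the Colorado group turned positioning hardware into a hydrology sensor.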
ZIERLER: What research did EarthScope make possible that otherwise simply would not have happened?
WHITCOMB: Certainly, EarthScope made possible knowledge of the structure of the entire United States crust and upper mantle on a uniform basis, because the stations marched with uniform spacing across the United States. EarthScope's seismic data has also given us much more detail about the structure of the mantle and the core that never would've been discovered without the EarthScope density of instrumentation. The GPS now gives us real-time measurements of strain, a direct measure of the possibility of earthquakes. Part of our goal was to capture two volcanic eruptions with EarthScope instrumentation. Two eruptions were captured, one at Mount St. Helens and one at Augustine Volcano in Alaska. Volcanic eruptions had not previously been observed with multiple types of instrumentation in such detail. And I previously mentioned using seismic noise to invert for seismic structure in the crust and the upper mantle. That was a clever use of the data.
ZIERLER: Because it's such a wide-ranging area of research and so fundamental to NSF's mission, did EarthScope contribute at all to climate change research?
WHITCOMB: The satellite data from GPS is certainly a direct measure of the precipitable water vapor, and I think meteorologists are only now beginning to use that data on a regular basis. Since you have GPS stations all over the world, you can get a much better measure of the weather and use that as input for your weather models. Then, I did mention that you can measure soil moisture next to every GPS station around the world.
ZIERLER: Getting back to your own career, following your position as program director in the geophysics program, where did you go from there within NSF?
WHITCOMB: Then, I became what's called a section head, so I had several programs under my supervision: geophysics, geochemistry, tectonics, EarthScope, and Instrumentation and Facilities. I had more than half the programs in the division under my direction. For a while, I was also acting division director, but that was just for about a year.
ZIERLER: Was this the position that you retired from, or was there another one after that?
WHITCOMB: I went back to section head, so I was still supervising these multiple programs when I retired.
ZIERLER: What year did you retire?
WHITCOMB: 2017.
ZIERLER: Are you enjoying a complete retirement? Are you active in the field and following the literature?
WHITCOMB: Yes, I'm following the literature, and the last couple years, there have been some virtual meetings, which I've tried to attend. I'm trying to keep up as much as possible. But it's really a blessing when you're employed in the field and are able to go to meetings as often as you want to keep up with the science. And your friends are spread out all over the world. So in retirement, unless you're really wealthy, it's hard to keep that up.
ZIERLER: Let's go back and establish some personal history. By way of context, before graduate school at Caltech, where did you do your college degree?
WHITCOMB: I grew up in Colorado, went to Colorado School of Mines, where it was all boys and all engineering degrees at the time. I got a degree in geophysical engineering. Then, when I left there, I was interested in oceanography and went to graduate school at Oregon State University, where I got a master's in oceanography, basically doing geophysics on boats, mainly off the coast of Oregon. Then, I worked for the US Geological Survey for two years at the USGS Branch of Astrogeology, which is in Flagstaff, Arizona. The Flagstaff office was started by Gene Shoemaker of Comet Shoemaker-Levy fame. Gene was one of the leaders in exploring extraterrestrial geology, if you want to use that term, for the moon and various other places. The Branch of Astrogeology is where that was done. Then, I got an award under the Fulbright-Hays program and spent a year in Sweden under that. On the recommendation of my professor in Sweden at the Seismological Institute at the University of Uppsala, I applied to Caltech, and that's when I went there.
ZIERLER: As a college student, were you specifically interested in seismology and earthquakes? Or was it more a general geologic perspective?
WHITCOMB: I guess I was more interested in physics and geology combined, which was geophysics. Basically, I drifted into the thing that most appealed to me, and that was the department that fit my interests best.
ZIERLER: What year did you arrive in Pasadena?
WHITCOMB: I came to Pasadena twice. The first time was to Caltech in '67, then I came back to join ISTAC in '85.
ZIERLER: In '67, were you up in the old Seismo Lab, the mansion in the San Rafael Hills?
WHITCOMB: Oh, yes. That was a great place.
ZIERLER: What was it like to be there?
WHITCOMB: It was just another world. Once a year, they'd renovate a bathroom into an office as we needed more room. Which was kind of a shame because it was a beautiful old mansion. At lunchtime, we'd go play tennis. They had a tennis court. There was an elevator shaft down into the ravine there, but it wasn't working. There were kumquat trees, avocado trees, loquat trees, all sorts of fruit trees down there. We had a hose, and we'd shower off after tennis, and go back to work in the afternoon. [Laugh] It was a great place.
ZIERLER: Is that where you spent a majority of your time? How much of your time were you on campus as a graduate student?
WHITCOMB: I guess we moved down to campus just before I graduated. I really didn't spend a lot of time on campus as a grad student when we were at the mansion.
ZIERLER: In the late 1960s at the Seismo Lab, what were some of the big ideas, the debates, the things faculty were excited about?
WHITCOMB: The big debate was whether plate tectonics was real, because much of the evidence for plate tectonics came out before I got there. But at the institute I was at in Sweden, they still believed that, for example, Iceland had been pushed up by horizontal forces pushing in and squeezing it up into an island, which is exactly the opposite of what is happening; Iceland is pulling apart, and it is there because volcanic rock is coming up to fill the gap that is left. There was still a lot of debate about plate tectonics. I remember when I was at Oregon State, we had one of the great early names in plate tectonics come and give lectures. About half of the people pooh-poohed the idea and thought it was far-out and crazy. But at Caltech, I think there was still some hesitation about plate tectonics, even then. Then again, we had these research projects after a year at Caltech, and they probably still do them, and one of mine dealt with plate motions. I guess we pretty much believed in plate tectonics by '67.
ZIERLER: What was the process of determining who your thesis advisor would be?
WHITCOMB: Don Anderson was recommended to me before I even went, by the professor I worked with in Sweden. His name was Markus Båth, a seismologist. I was accepted at MIT and was going to go there before I got the Fulbright, and I was hoping to go to MIT once I came back, but Markus talked me into applying to Caltech. He said, "You should work with Don Anderson." Basically, I guess, it was recommended to me.
ZIERLER: What was Don Anderson working on at that point? I know he was doing so many things.
WHITCOMB: He was doing all sorts of things. He was working on the structure and chemistry of the mantle and the physics of mantle materials. It turns out that he was a key leader in the formation of IRIS and the Global Seismic Network beginning in 1984. I think those were his main interests. But he was interested in everything. There wasn't anything he couldn't talk to you about.
ZIERLER: In what way were computers in use at the Seismo Lab?
WHITCOMB: Anybody in my generation remembers walking around with these giant boxes of computer cards. That's what our programs and data were on. God forbid you ever dropped a box or a deck. [Laugh] And it did happen at times. I remember I bought a little motorcycle and would strap my card deck on the back. If that had dropped and scattered, it would've been weeks and weeks of work gone. [Laugh]
ZIERLER: Of course, this was decades before online databases. How was data disseminated? How was it shared, not just within the Seismo Lab, but more broadly, to the extent the Seismo Lab had its own proprietary data?
WHITCOMB: A good example is that we'd hear on the news that there was a major earthquake somewhere, and we'd start immediately composing letters and sending them in the mail, asking various institutes around the world for their seismic records, the originals, if they would send them, or good copies. Then, we'd wait for the records to come in by mail, and once they arrived, we'd hand-digitize every record. Of course, if there was really a large earthquake, the light beam would go completely off the edge of the photo paper, so you'd have to search the edges to see where it came back, then guess how far it travelled off-scale. It was very time-consuming. It's amazing now that you can just go online and download the data instantly from anywhere in the world.
ZIERLER: In terms of creating your own thesis topic, what aspects of it came from Don just giving you a problem to work on, and what aspects were your own initiative and the things you wanted to do?
WHITCOMB: Don actually gave me my first idea, which was to look at something called P'P': seismic waves that travel through the earth's core, reflect from discontinuities in the mantle on the other side of the earth, then come back down and are recorded on the other side. He saw an interesting paper, so I started looking at P'P'. Then, in order to invert the data, I found that there was a great deal of uncertainty in the models of the earth's core. People were proposing lots of different layers that didn't agree with each other. Because the P'P' waves I was looking at passed through the core twice, I needed to know the core structure very accurately, so I started looking at core structure. That led me to visit MIT, where they had data and programs to look at arrays of seismometers, especially an array called LASA up in Montana, which actually measures what's called the phase move-out, the time delay of the phase as it comes up across the array, which identifies which part of the core that particular signal came from.
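The phase move-out idea can be illustrated with a toy example. The station offsets, slowness, and pick errors below are invented; the point is that fitting arrival delay against offset recovers the slowness of the wave crossing the array, and for core phases the slowness indexes which part of the core the wave sampled.

```python
import numpy as np

# Hypothetical linear array: station offsets in km along the array.
x = np.array([0.0, 10.0, 20.0, 30.0, 40.0])

# A plane wave crossing the array with slowness 0.05 s/km produces
# arrival-time delays that grow linearly with offset ("move-out").
s_true = 0.05
picks = s_true * x + np.array([0.003, -0.002, 0.001, -0.001, 0.002])

# A linear fit of delay versus offset recovers the slowness.
s_est, t0 = np.polyfit(x, picks, 1)
print(s_est)
```

Real array processing (beamforming) scans slowness and back-azimuth together, but the underlying measurement is this same delay-versus-offset slope.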
After doing all that, I finally discovered that all of these complicated cores could be simplified, that there was really only one layer there, so I made a new core model and then utilized it to analyze the P'P' data. In the middle of all this, the San Fernando earthquake happened in 1971, so everybody dropped everything and went out in the field to find the fault or started looking at seismic data. I started looking at the first motions of the seismic data, which you can invert for the earthquake focal mechanism. There were all sorts of initial ideas for the focal mechanism, mostly based on already-mapped faults in the area. The data showed right away that it was a thrust earthquake on a fault that had not been mapped, which helped people decide where to go out in the field to start looking for the fault itself. That became another part of my thesis, the focal mechanism of the San Fernando earthquake and its aftershocks. With that, I got a quite detailed picture of the fault's shape with depth. It turned out to be quite a complicated structure. Then, we also went out in the field to help map fault offsets and things like that. It was quite interesting.
My thesis turned out to be in three parts: P'P', structure of the core, and San Fernando earthquakes. [Laugh] I ended up spending six years at Caltech getting my PhD. But the thing that's a trap is that it's so much fun that you're really not in a hurry to get out. Finally, a lot of people just have to say, "Okay, I'm going to cut it off here and graduate." [Laugh]
ZIERLER: As you say, it was a very complicated structure. If you could explain, relative to what? What's your baseline?
WHITCOMB: Well, when you say thrust fault, most people think of a fault that is a flat plane, and then the slip is along the plane. It turns out that the main part of the fault was a flat planar thrust, but there was another part that curved down on the west side. The main part was a thrust fault, but the part on the west side turned out to be a strike slip fault.
ZIERLER: There's always that duality in the thesis between being hyper-focused on what you're doing and being responsive to some of the bigger questions in the field. On that latter question, how did you feel you were slotting into what was going on more generally at that time?
WHITCOMB: I guess you get that naturally because you're not only working on your own stuff. The Seismo Lab had this wonderful institution called coffee hour every morning. In the old lab, it was in the furnace room, this narrow hallway with a wall on one side and the furnace on the other side. And Don Anderson always had a certain chair he sat in, and we'd have coffee and talk science. It was a new topic every day, twice a day. Then, we'd have visitors from around the world who'd come and spend six months or so at Caltech. It was just a great environment. When we moved from the old Seismo Lab down to the campus, we tried to maintain that coffee hour institution. I guess it was somewhat successful, but it turned out to be a regular room with chairs and stuff like that. It never had the same flavor as coffee at the old Seismo Lab. Then, you'd always go to meetings at least two or three times a year. You would also be reviewing other people's research. For example, every year, the new class of graduate students had to do three research projects, so part of the duty of the older graduate students was to help them and review that, with practice sessions and presentations. We were constantly reviewing other people's research besides the coursework we were doing. Everybody at the Seismo Lab, all the professors, were working at the frontiers of their fields, and they were all interested in diverse things. Not just Don Anderson, but everybody usually had a lot of different interests. It was just a way of life. You were always involved in broader interests in the science.
ZIERLER: What was Don's style like as a mentor? In what ways was he hands-on, and in what ways was he hands-off?
WHITCOMB: I've thought about that a lot. Some of my best memories were Don and I in his office, talking science. It was a real high, like a drug high almost. I've often thought about what made him like that so that I could emulate it as much as possible. Don was just a brilliant guy with really wide interests, and he sat and listened to me, and I think he pretty much did that for other people as well. Then, he'd chime in with some idea, "What about this, what about that?" But basically, he'd just let you run with your ideas. I know he had the reputation for being abrupt or something, but I never saw that. Maybe he didn't suffer fools gladly, but that's a little bit off the mark. He tended to irritate people who couldn't convince him of what they thought he should think. More times than not, he was right about what he thought. He was just an amazing guy. He was a lot of fun to be around; he kept up the Seismo Lab tradition of an occasional sherry afternoon, where we'd stand on the front steps of the old Seismo Lab and have sherry. I can't remember doing that on campus. Maybe there was some liquor prohibition or something like that. His wife was charming and had us to their house frequently. I just enjoyed Don a lot.
ZIERLER: Besides Don, who else was on your committee?
WHITCOMB: Tom Ahrens, I think. I can't remember the names. I don't know why, but my defense was just a breeze. It turned out to be fun. I was apprehensive at the beginning, but it really wasn't bad.
ZIERLER: After you defended, what opportunities were available to you, and what was most compelling?
WHITCOMB: As soon as I defended, they basically wanted me to stay around. The seismic array in Southern California was run by the Seismo Lab and the USGS and was all analog, and they wanted to make it digital. They basically put me in charge of the seismic array, and we converted it to digital recording. I hired people to work on that and on the normal seismic duties of reading records. Basically, you'd put the seismic phase data into telegrams and send them to an international agency that collected data from around the world.
ZIERLER: What were some of the overall missions of this work?
WHITCOMB: Digitizing the seismic array, but I also worked a lot on just doing research, writing proposals for the USGS support of the seismic array. I also became the liaison to JPL. I worked with people from JPL before I graduated. I was working with the astronomers who were doing very long baseline radio interferometry, VLBI; they utilized the big radio antennas, like Owens Valley and places like that. But one person at JPL actually made a portable VLBI antenna. One of the things you solved for in VLBI, which uses very stable distant objects that are probably out at 13 billion light years away, was the position of the antenna itself. This was before GPS, so we were trying to utilize that for measuring strain of the crust of the earth. Also, I was working with some of the people from JPL and others that I brought to Caltech to use the technique of magnetotellurics to measure the electrical resistivity in the crust of the earth. I wanted to test some claims that crustal resistivity changed as a function of stress in the rocks, which might be a precursor to earthquakes. I brought in a couple of people to Caltech to work on that for three or four years.
Also, I worked with some of the people in the physics department to develop radon-monitoring sites along the San Andreas Fault because there were some claims in the literature that people saw changes in radon before earthquakes as a function of stress in the crust. I funded this research through my NASA support. I was looking at a lot of different techniques for the measurement of crustal changes. There were also a lot of claims in the literature about animals being agitated before earthquakes, which might be precursors. So I actually looked into the egg-production industry via some of the giant chicken ranches around Southern California to see if there was any possibility of using that. But it turned out that egg production is a function of many other things, including temperature, food, and so forth. There didn't seem to be any possibility of separating an earthquake signal from all those other variables.
ZIERLER: Was this sort of a long-term employment option for you, or did you see it as more temporary?
WHITCOMB: It was long-term as long as we had money to fund the operation of the seismic array. But I wanted to do other things, and that's when I went to the University of Colorado and CIRES.
ZIERLER: What year was that?
ZIERLER: What department did you join?
WHITCOMB: CIRES, the Cooperative Institute for Research in Environmental Sciences, and they actually had geophysics as one of their areas of interest.
ZIERLER: This was a tenure-track appointment?
WHITCOMB: It was a permanent appointment. The employment actually was with a government agency, which was NOAA. CIRES was a joint effort between University of Colorado and NOAA. Then, I had a research faculty appointment at the University of Colorado.
ZIERLER: What was the job at this point?
WHITCOMB: Basically, doing research. I was looking at gravity. There was a US Geological Survey claim that they saw a big uplift in Palmdale, California, so I was trying to get independent confirmation of vertical motions other than from first-order leveling, which the Palmdale data was based on. I wanted to use gravity. You move the ground up, you're further away from the center of the earth, so the gravity drops. I was looking at time-dependent gravity. I had a network of gravity stations throughout Southern California and up through the Sierras to look for changes in elevation.
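[The sensitivity involved can be put in rough numbers. A minimal sketch, assuming the standard free-air gravity gradient of about -0.3086 mGal per meter of uplift and an illustrative uplift of 0.25 m, roughly the order reported for Palmdale; both numbers are editorial assumptions, not figures from the interview:]

```python
# Illustrative estimate of the gravity change expected from crustal uplift,
# using the standard free-air gradient. Values are textbook numbers, not
# measurements from the interview.

FREE_AIR_GRADIENT_MGAL_PER_M = -0.3086  # gravity change per meter of elevation gain

def gravity_change_mgal(uplift_m: float) -> float:
    """Predicted gravity change (mGal) for a given uplift in meters."""
    return FREE_AIR_GRADIENT_MGAL_PER_M * uplift_m

# An uplift on the order of 0.25 m:
print(gravity_change_mgal(0.25))  # about -0.077 mGal, i.e. roughly 77 microgal
```

[At tens of microgal for a quarter-meter of uplift, the signal sits close to the precision of field gravimeters of that era, which is why repeated, carefully tied gravity surveys across a station network were needed to see any change.]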
This was also the time that GPS was becoming important for the study of crustal strain, and I was a co-PI on the 1985 proposal to NSF to establish the UNAVCO consortium. UNAVCO's primary focus was to build a broad instrument and data capability for crustal strain monitoring using GPS.
ZIERLER: You were there for five years.
ZIERLER: What prompted the move back to Pasadena?
WHITCOMB: GPS started to be realistic in terms of being more accurate for measuring crustal motions. To me, this was the ultimate, to be able to measure crustal stress, which is a direct measure of the potential for earthquakes. I joined my friend in this company called ISTAC to develop instrumentation to do that. The big problem at the time was that the military was going to deny the accurate signals to civilians. My friend had a way to get around that. Then the Challenger disaster happened, they stopped putting up satellites, and the company ran out of money because there were no signals coming down. That was the end of that.
ZIERLER: I wonder if you can explain technically and intellectually why GPS was so good in this line of research, then in being able to make these measurements, what bigger questions did that get at?
WHITCOMB: To first order, GPS is good because whenever you're measuring length using some sort of wave, the shorter the wavelength, the more accurately you can do it. GPS carrier wavelengths are down at the tens-of-centimeters level. If you measure that to a small fraction of a wavelength, you're getting down to something interesting, sub-centimeter. And we had never been able to do that. For example, we knew plate tectonics probably worked over millions of years, but we had no idea over the short term whether the plate motions were smooth or jerky. We knew they were jerky close to faults because when an earthquake happens, it jerks. But when you get far away from a fault, what is the motion like? We had no idea because we couldn't measure it. Since the motions were on the order of three to five centimeters a year, you had to get down to the centimeter or better level to make any headway. The military had this great positioning technology, but they were going to limit the really high precision to the military only because they didn't want to give enemies the capability of putting it on rockets and hitting things precisely. They were going to deny that access, but since the wavelength of the radio signals coming down from the satellites was small enough, if you could just lock onto a single carrier wave, over time you were able to get to high accuracy. Of course, the other big difficulty was knowing where the satellites were, and that was a real problem.
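[The wavelength arithmetic behind that sub-centimeter point can be sketched directly. This is an illustrative calculation using the published GPS L1 carrier frequency; the phase-tracking fraction is an assumed number, not a figure from the interview:]

```python
# Back-of-the-envelope illustration of why carrier-phase GPS can reach
# sub-centimeter precision: if a receiver tracks the carrier wave to a small
# fraction of its wavelength, the range precision is that fraction times the
# wavelength.

C = 299_792_458.0          # speed of light, m/s
L1_FREQ_HZ = 1_575.42e6    # GPS L1 carrier frequency, Hz

wavelength_m = C / L1_FREQ_HZ    # about 0.19 m for the L1 carrier
tracking_fraction = 0.01         # assume phase tracked to 1% of a cycle

range_precision_m = wavelength_m * tracking_fraction
print(f"L1 wavelength: {wavelength_m:.3f} m")
print(f"Implied range precision: {range_precision_m * 1000:.1f} mm")
```

[Tracking the carrier to about one percent of a cycle implies roughly 2 mm of range precision, which is why carrier-phase techniques, once the satellite-orbit problem was solved, could reach sub-centimeter positions.]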
But NASA and others continued working on that to the point where they knew where the satellites were to high precision. The orbits had to be recalculated all the time because the earth's gravity field is not only widely variable spatially but also variable temporally: any time you have a change in moisture, like in California, that changes the gravity in California by quite a bit. Or if the satellites fly over Greenland, one can actually invert that data to look at the change in gravity due to the loss of ice from Greenland. You need to keep calculating the positions of the satellites because gravity is changing all the time. That was a problem. Now they've got that solved, and basically, by occupying a site and watching the satellites over long periods of time, you can knock the uncertainty down to sub-centimeter, and that's good enough to make great strides in measuring the plate motions.
ZIERLER: Being back in Pasadena, did that provide opportunity for collaboration at Caltech at all?
WHITCOMB: Actually, no. We were quite busy, so I didn't work a lot with people at Caltech at the time. I'd go to seminars and things like that, but I didn't visit Caltech that much.
ZIERLER: What happened with the funding?
WHITCOMB: Basically, it was privately funded. We had gotten to the point where we were just starting to develop instrumentation and services for commercial use, but there were no satellites going up, our individual backers' pockets weren't deep enough, and we ran out of money.
ZIERLER: Did the instrumentation have an afterlife at all? Was it picked up by other companies?
WHITCOMB: The technique was probably picked up. Much of it is confidential, I guess. Also, the military backed down because there was so much demand for those signals. I'm not quite sure what happened to that technology, but things advanced very rapidly, and many systems are able to measure at high accuracy now.
ZIERLER: It was at this point that the NSF opportunity came up for you.
WHITCOMB: Yes. Partly through a recommendation from Don, actually.
ZIERLER: We started our talk with the beginning of your career at NSF, so this brings us full circle. For the last part of our talk, I want to ask a few retrospective questions, then we'll end looking to the future. Just to go back to the Seismo Lab, what has stayed with you and informed your approach to the science, the way you've supported science at NSF, that you learned at the Seismo Lab, the way of analyzing data, collaboration, the overall culture of seismology, that's been useful to you over your career?
WHITCOMB: What's been most useful, I think, has been developing a sort of instinct for what's promising and what's iffy in science. Sometimes it's almost impossible to tell, and you just need to give somebody the chance to show whether something is true or not. That's probably one of the biggest things I've learned, and it's a lot of what the NSF is about, funding cutting-edge research. You're going to have some things that don't work out and many things that do.
ZIERLER: From your long career at the NSF over many presidential administrations and changes in history, how has the overall relationship between NSF and the researchers it supports changed over the years?
WHITCOMB: I don't think it has changed. NSF is in a position of great power, which comes with great responsibility. You're giving money to people to do things, support their careers, support their research. You have to be very careful how you step and not be heavy-handed in things. The relationship is one of funder and fundee. That doesn't change at all. If you have money to give, you're going to find people who would like to be funded, so you have to carefully evaluate and be as fair as possible in terms of how you make those decisions.
ZIERLER: Finally, looking to the future, to the extent that you've kept on top of the ways in which NSF is funding seismology and geophysics, looking 5 or 10 years out, what are the most important kinds of research NSF is uniquely suited to support?
WHITCOMB: The huge issues now are climate change and how it's going to affect life on earth. I'm not quite sure what geophysics has in terms of directly affecting that, but there are bound to be things that crop up in the bailiwick of people in geophysics or earth sciences that may help solve the problem. For example, there's a way to combine CO2 with minerals that basically sequesters it in the crust of the earth. I don't know if that's practical at present, but it has been proposed, and people may develop a practical way to do it. I mentioned the thing about recovering oil from oil shale and that technology, which we funded in the past. An interesting thing happened when there was an oil well blowout in the Gulf of Mexico: it was really the geophysicists who were asked to make accurate estimates of how much oil was actually coming out, and it was quite different from what the oil companies were saying. That gave valuable information to the people who had to deal with the spill. It turned out to be much worse than they thought, so they had to increase their remedial efforts.
I guess the thing about research in any field is that it's hard to predict where something is going to be useful. It's always a nice surprise when something you thought was just an interesting result turns out to be applicable to solving a problem. Maybe the study of the atmosphere and ionosphere using GPS and satellites will play a part in that. In other areas, it's hard to predict where it will come in. Certainly, volcanic hazards and earthquake hazards are important. The sea level rise hazard on the Gulf Coast is critically dependent on what the crust is doing there. It turns out that the ground around New Orleans has been sinking for many years, and we would've known that if we had regularly monitored the crustal distortions there with GPS. The sinking crust made the flooding of New Orleans much worse than people would have predicted. These kinds of hazards require much better monitoring so we know just how severe they are going to be as the ocean level rises, and what the impact of that will be.
There are also some really interesting things happening along the East Coast of the United States. The sea level is rising, and simultaneously part of the East Coast is actually sinking, which worsens the flooding hazard. The crustal sinking is a long-term residual effect of the glaciers that used to be in Canada and have since disappeared. Because the Canadian glaciers melted, the crust is rising where the glaciers were, but the effect at a distance is that the crust is actually going down, and that distance happens to include the East Coast. You have a combination of the ocean level rising and the ground sinking, which increases the flooding along the East Coast. There's one place along the East Coast that is doing the opposite. From tide gauge stations, it looks like some areas around Cape Fear seem to be rising, at least in older tidal data. There's an interesting crustal strain happening there. We did have a magnitude 7 earthquake in Charleston, South Carolina, in 1886, and Cape Fear is just north of there. The East Coast is vulnerable to earthquakes, too, and I think it would be interesting to put a denser array of vertical motion sensors, like GPS or InSAR, along the East Coast to see what's going on there. There are a lot of interesting things to look at.
ZIERLER: There's no end of interesting research to fund, that's for sure. [Laugh] Jim, this has been a great conversation. I'm so glad we connected. Thank you so much.