
Thorne Lay


Distinguished Professor of Earth and Planetary Sciences, University of California, Santa Cruz

By David Zierler, Director of the Caltech Heritage Project
April 14, 2022


DAVID ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It's Thursday, April 14, 2022. I am delighted to be here with Professor Thorne Lay. Thorne, great to be with you. Thank you for joining me today.

THORNE LAY: It's a pleasure to chat about Caltech activity in the Seismo Lab.

ZIERLER: To start, would you tell me your title and affiliation?

LAY: I'm a Distinguished Professor of Earth and Planetary Sciences at the University of California, Santa Cruz.

ZIERLER: How big is the program at Santa Cruz, and how far back does it go?

LAY: The campus itself is relatively young, founded in the mid-60s, but one of the core programs that was established early on was in the Earth sciences. They had about ten faculty up until the late 80s, when we began to grow. Now, we are at 24 faculty. Among them, there are four studying earthquake science, either from seismological or rock physics points of view, so we have quite a bit of strength in that specific area.

ZIERLER: Over the course of your career, what have been some of the major research questions you've pursued?

LAY: I started in graduate school back in '78, so I didn't begin working in the field of seismology until then. I began working on using modern techniques to analyze the waveforms recorded around the world for earthquakes and nuclear explosions, and I've continued doing such work to this day. In my thesis work, I also engaged in studies of the structure of the deep mantle and outer core. I haven't been doing as much of that of late, mainly because there has been such a tremendous improvement in the datasets for studying earthquakes that we can now do in one or two hours what would have taken me many months at the start of my career. It's really an exciting, rapid technological and data advance. The capabilities have improved tremendously for the problems that I work on, which involve anything that is able to make the Earth shake, so that also includes processes like landslides, and of late, I've been working on signals recorded from a huge volcanic eruption in Tonga.

ZIERLER: What have been some of the key theories in geophysics and seismology that may serve as guideposts for your research?

LAY: My undergraduate background was in mechanical engineering and geology in the interdisciplinary geomechanics program at the University of Rochester, and I really liked the engineering approach of first-principles characterization applied to deformation of structural beams, fluid flow and convection, and to various elasto-dynamic problems. Bringing first-principles methods into geophysics was really beginning to take off in the late 70s. It was the approach of representing physical processes with force systems in the Earth that, through F = MA applied to a continuum, allowed you to quantitatively predict the resulting excitation of elastic waves that would propagate through the medium. Seismic instruments capture the elastic vibrations with analog or digital recordings with precise time information. We're able to apply the first-principles theory of elasto-dynamics to quantitatively analyze the recorded seismograms for attributes of the physical forces operating at the source. Because the inferences are based on F = MA, we have confidence in the results, as they are not just empirical, qualitative interpretations. Much of geology involves very complicated systems that you cannot confidently represent from first principles. The field of seismology appealed to me because you can.
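
To make that concrete, here is a minimal sketch of the forward-modeling idea in Python: in the far field, a seismogram is often approximated as the convolution of a source time function with a simplified Green's function for the path. Every value below is illustrative, not taken from any particular study:

    import numpy as np

    # Far-field sketch: seismogram = source time function (*) path response.
    # The pulse shape, arrival times, and amplitudes are made-up examples.
    dt = 0.05                                  # sample interval, s
    t = np.arange(0, 30, dt)
    source = np.exp(-((t - 2.0) / 0.5) ** 2)   # toy source time function

    path = np.zeros_like(t)                    # toy Green's function:
    path[int(8.0 / dt)] = 1.0                  # direct P arrival
    path[int(11.0 / dt)] = -0.6                # surface reflection, sign-flipped

    seismogram = np.convolve(source, path)[: t.size] * dt
    print(seismogram.argmax() * dt)            # direct pulse peaks near 10 s

An inversion then runs this same prediction machinery inside an optimization loop, adjusting the assumed force system until the synthetic matches the recorded waveform.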

Throughout my entire career, that is what I have done, using first principles theory to quantitatively predict observed ground motions, to try to model those motions by optimization procedures that give you the best representation of the physical forces that generated them. Then you try to interpret the inferred forces in the context of some geological process like faulting, volcanic eruption, or landsliding, which in detail is very complicated and not fully characterized by linear elasticity. There are thus aspects of the process that we cannot resolve from seismology, but we are able to resolve that part that produces seismic waves. That is the common theme of my work. It is appealing to my geophysics side because of its rigor and its foundation in basic continuum mechanics. It is not all that is done in the field by any means, but it's what I have chosen to emphasize.

ZIERLER: I asked about theory. What about instrumentation? What have been some of the technological advances over the course of your career that have allowed for new ways of interpreting the data and drawing conclusions?

LAY: The main way that ground motions have been recorded, dating back to ~1870 when the first instrument to record seismic shaking was invented, involves a system that exploits the inertia of a mass suspended within an instrument that is moving with the shaking ground. The instrument case, being coupled to the ground, moves with it, while the suspended mass tends to remain fixed by its inertia. This was solved by putting masses on springs and monitoring the position of the mass relative to everything else, which is moving when the ground shakes. For the first 100 years, that was done with simple mechanical systems that would record the movement of the mass relative to the surrounding ground by producing an analog signal, either by a stylus etching on a rotating drum or a light beam reflecting off a galvanometer mirror on the mass to make a trace on rotating photographic paper. This produced paper recordings of ground shaking, and everybody has seen rotating drums recording wiggling motions. Those wiggling motions were simply keeping track of the mass's differential motion in the seismometer. The data were recorded on paper with precise time being indicated.
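
That mass-on-a-spring system reduces to a damped harmonic oscillator: the motion x(t) of the mass relative to the instrument frame obeys x'' + 2*zeta*w0*x' + w0^2*x = -a_ground(t). A minimal sketch in Python, where the free period, damping, and ground-motion pulse are all illustrative values:

    import numpy as np
    from scipy.integrate import solve_ivp

    w0 = 2 * np.pi / 15.0        # illustrative free period of 15 s
    zeta = 0.7                   # illustrative damping ratio

    def a_ground(t):             # toy ground-acceleration pulse
        return np.exp(-((t - 5.0) / 1.0) ** 2)

    def rhs(t, y):               # y = [x, dx/dt], mass motion relative to frame
        x, v = y
        return [v, -a_ground(t) - 2 * zeta * w0 * v - w0 ** 2 * x]

    sol = solve_ivp(rhs, [0, 60], [0.0, 0.0], max_step=0.05)
    # sol.y[0] is what the stylus or light-beam trace records, up to a gain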

When I got to Caltech as a graduate student in 1978, most seismic data involved such paper recordings. Global network operators microfilmed the paper recordings from many stations around the world, and data archives (there was one in the Seismo Lab) had copies of the microfilm, but we still had to print the microfilm and make a new paper copy. To analyze that, you had to manually trace the copied seismogram on a digitizing table and enter it into a bunch of punch cards to tell you point by point where the ground had moved as a function of time. You were reconstructing how the pendulum mass had moved relative to the ground, by having a paper recording and then tracing it. It was not very high quality, and tedious to digitize the signals, but that was what we had to do. That transformed dramatically in the mid-1970s, when they began to replace the analog recording systems with digital recording systems for both regional and global seismic stations. All the tedious process of recording analog signals on paper, perhaps microfilming the record and reprinting the signals on paper, mechanically digitizing it, producing a bunch of IBM punch cards with digitized signals, and carrying boxes of them over to the main campus IBM computer to enter as digital data for some processing software, was replaced by the signal automatically being written to computer tape as a digital signal, and eventually, onto disks that could directly load the signal onto the processing computer.

Seismologists also developed electronic force-feedback systems in the mid-1970s that would keep the mass in the instrument from moving and output how much time-varying force had been applied to the mass. That enabled recording much stronger shaking over a greater bandwidth when the signals are written in digital form. It then became possible to recover the higher quality digital data much more conveniently; for example, data can be transmitted from the field to operational centers in real time and made available to many users by internet. That is where we stand today. Of course, that advanced progressively with improving computer processing power, internet connections, and other telemetry methods. But over the ensuing 45 or 50 years, these instrumentation advances transformed the discipline to where now, when an earthquake happens anywhere in the world, it sends out P-waves and S-waves that bounce around in the interior and generate surface waves. As those waves pass by a station, the ground shaking is digitally recorded at that point with accurate timing, typically on multiple sensors that record the full vector motion, and the signals are transmitted to a data collection center. That center can immediately process signals from stations all over the world, detecting when there have been P-waves or other waves arriving at different stations from a common source. The associated arrival times are used to locate the event, the ground shaking amplitude and distance from the source are used to determine a seismic magnitude, and the complete ground motions can be analyzed to determine the nature of the source, including the geometry of faulting for earthquakes, or to detect whether it has unusual characteristics of the forces operating at the source that indicate it is an explosion, landslide, or some other type of source. Much of this type of processing is now done automatically by computers for regional and global seismic networks.
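
The event-location step can be illustrated with a toy least-squares grid search over P arrival times, assuming a uniform-velocity medium. The station coordinates, arrival times, and the 6 km/s wave speed below are all invented for the sketch:

    import numpy as np

    v = 6.0                                          # assumed P speed, km/s
    stations = np.array([[0, 0], [100, 0], [0, 100], [80, 90]])   # km
    true_src, t0_true = np.array([40.0, 30.0]), 2.0  # synthetic "truth"
    arrivals = t0_true + np.linalg.norm(stations - true_src, axis=1) / v

    best, best_misfit = None, np.inf
    for x in np.arange(0, 101, 1.0):                 # exhaustive grid search
        for y in np.arange(0, 101, 1.0):
            d = np.linalg.norm(stations - [x, y], axis=1) / v
            t0 = np.mean(arrivals - d)               # best origin time per node
            misfit = np.sum((arrivals - t0 - d) ** 2)
            if misfit < best_misfit:
                best, best_misfit = (x, y, t0), misfit
    print(best)                                      # recovers (40.0, 30.0, 2.0)

Operational systems use 3D travel-time tables and many phases rather than a uniform half-space, but the arrival-time fitting logic is the same.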

What research seismologists do next is to process the data further than the routine processing that catalogues the occurrence of events, and we extract additional information, such as a map of the space-time slip distribution on a fault during a large earthquake. To give you a reality check, when I started working in the field, it would take months to collect the analog data on paper copies and to digitize them, before I could even begin to do seismology with the data. What would have taken me perhaps six months of assembling data to study an earthquake can literally be done in an hour or two, with higher quality data. That is pretty dramatic progress, and it makes doing seismology a lot more fun. When I talk to my students about these old analog records (which we still utilize as the only seismic recordings available for events prior to the digital revolution), and I describe how we had to mechanically digitize the signals, they just look at me like, "What kind of Luddite are you? Why don't you just download a digital signal and analyze it?" Transforming global seismic recordings to a digital format involved a huge community effort. It was in part spawned, while I was a grad student, by efforts at Caltech to convert the regional seismic network jointly operated with the U.S. Geological Survey to digital recording, and by a visit from Adam Dziewonski of Harvard, eventually leading to deployment of a whole generation of new digital seismic instruments around the world that we use today.

ZIERLER: All of these advances in the capture and processing of seismological data, what does that mean for the importance of field work? In other words, can you do all of your seismology from a computer terminal, or are there still moments when it's best to be on-site when possible?

LAY: Of course, depending on the problem you want to solve, you may or may not be able to get sufficient data to address it with just the data collected from permanently deployed stations, which are rather sparsely distributed. In California, there's a station about every 20 kilometers, so it's pretty well-sampled. You don't often need to put out a lot more instruments, although seismologists do so when there are large events and they want to capture as many small aftershock recordings as possible. But if you're studying a remote part of the world, there will not be such a density of instruments. You can go and put out portable recording instruments that have a force-feedback system and write to a local disk on site, and in some cases they can be connected to an internet or microwave relay that immediately sends the data to a processing center. If your problem calls for higher-resolution sampling of the ground motions in an area than is provided by existing permanent networks, seismologists go and deploy instruments. I haven't been doing such work myself since the Loma Prieta earthquake in 1989, when I was involved in perhaps the first deployment of portable digital seismic instruments to record aftershocks. But that type of fieldwork is still the main focus of many seismologists. They love to go and deploy instruments in remote areas, capturing aftershock sequences to perform very detailed studies of large earthquakes and local seismic velocity structure.

When new earthquake-generating processes occur, as with the production of earthquakes from injection of wastewater in Oklahoma, station coverage can lag. There were not initially many seismic stations there because there had not previously been very many earthquakes, but now many stations are deployed to capture the anthropogenic triggered activity. This is motivated by the concern of triggering damaging earthquakes or of breaching the subsurface containment of the hazardous water that has been injected.

The other aspect of field work, which is particularly important in projects I'm working on now, is to go into the area where large earthquakes have happened in the past and try to characterize those earthquakes by careful geologic inspection of, for example, tsunami deposits. Some of the colleagues I'm working with are doing this in the islands along the Aleutians and in Hawaii. They go look for past inundations by big waves that could not have been produced by a storm, and they date the deposits by radiocarbon dating, using little bits of wood that got caught up in the inundation, giving a history of big earthquakes affecting both the Aleutians and Hawaii that happened long before we had any seismic instruments. Such geological observations are often the only information we have about the prior behavior, and they can go back several thousand years in some cases. For example, field measurements provide us the most quantitative information about large historical earthquakes off the Cascadia coast, along Oregon and Washington. The last big event there was 300 years ago, long before any seismic recordings. That event was clearly experienced by the local indigenous population through the big tsunami waves, and the tsunami was also detected in Japan. Now, geologic records on land and in sediments offshore have been able to map out 18 to 20 prior comparable-sized events over the previous few thousand years. We now understand how often they happen, so mitigation efforts for the next event can be pursued. Fieldwork is thus indispensable.

Detailed examination of surface fault exposures has also been extensively performed in places like the San Andreas Fault, where you can dig into the fault and tease out the signals of prior ruptures that constantly get overprinted by the next event. The most recent event messes up the features from the earlier ones, but with careful geologic work, you can figure it out. Fieldwork is very important. I don't do it myself now, but my hat is off to those who like to get out there and get dirty. It's essential, a very critical effort for understanding active tectonic systems given our short seismological history.

ZIERLER: Before we go back and develop the narrative of your time at Caltech, more recently, you had the opportunity to reflect on the life and legacy of Professor Donald Helmberger. Tell me a little bit about your relationship with Helmberger, what it meant to you, and what some of his key influences in the field were.

LAY: When I arrived at Caltech, the Seismo Lab had a policy for incoming graduate students, which was that you were not immediately designated to work with any particular professor, but were instead invited to talk to people, find out what they were doing, and try to gravitate toward something that intrigued you. Basically, to sample the smorgasbord of activities to figure out what it is that you like. That could involve going into the field, doing very theoretical work, or looking at data. You don't necessarily know when you walk in the door, with little experience in seismology, what's going to appeal to you. I initially worked with several professors, Don Helmberger, Hiroo Kanamori, and Don Anderson. Starting a project with them was facilitated by their open-door policy. All of the faculty just kept their door open. If you looked in, they were sure to be working, but if nobody else was there, you could go talk to them or schedule a time to meet.

The process was helped by daily coffee hours, where faculty and students would bring in current results that they were working on, and everybody would sit around and talk about them. That spawned conversations that you could follow up and chat with people about to develop some research project. That's how I got to know Don Helmberger, Hiroo Kanamori, and Don Anderson. Dave Harkrider was on the faculty and I did eventually write a paper with him, and Bernard Minster was also on the faculty. Don Helmberger and Hiroo Kanamori were pioneers in what I described before, how you could represent physical processes, explosions and earthquakes, with force systems, specify an Earth model, calculate how that force system would generate seismic waves that propagate through the Earth model and be recorded by a seismic instrument, providing a quantitative framework for explaining the observations. They introduced me to that perspective in classes they taught and in conversations as we discussed possible projects. Rather than work on theory or go to the field, I found myself drawn to observations and seeking explanations for the signals using the techniques they had developed.

Don Helmberger and Hiroo Kanamori were key contributors to the development of theory for solving the elastic wave propagation problem for seismic waves. I was particularly interested to figure out how to use the methods they had developed as I didn't immediately see any way I could advance the theory more than had been done already. I started projects with both of them to assemble datasets and to learn how their codes could calculate the results for a specified force system representing an earthquake or an explosion. I worked with them in parallel and learned from both of them their different techniques. Professor Helmberger emphasized body waves, P-waves and S-waves traveling through the earth. Professor Kanamori was working mostly on surface waves which are formed by interference of P-waves and S-waves. I was exposed to the different observations of seismology by working with the two to do this quantitative analysis, as we used our observed recordings to deduce the forces operating at the source that generated the body waves or the surface waves. What was particularly interesting about working on projects with those two advisors was that they had very different styles. Professor Kanamori was a great teacher, very patient with some newbie like myself who came in knowing nothing. He could really elevate you and excite you to work on long-period seismology.

Working with Don was similar, but his teaching style was different. He never seemed to know what the answer was going to be until you did the work, whereas I think Professor Kanamori usually knew what the answer was going to be and was guiding you to find it. The difference in style made it fun working with both of them. It was very exciting. With Professor Helmberger, I started working primarily on signals caused by heterogeneity inside the Earth. By the 1940s, seismologists had produced models for the Earth velocity structure from the center of the Earth to the surface that were one-dimensional, varying with depth, and they were quite good. They could predict the travel times of direct P phases to within a fraction of a percent anywhere in the world. That was quite useful. But a 1D model is not sufficient to account for some of the observations. There are laterally varying structures at all depths. For example, there are oceans and continents at the surface. Within the interior, there are also lateral variations associated with mantle mixing, but they were pretty much unknown at the time. Don was very interested in those variations, how we could detect them with seismic waves, and what their importance might be. I worked with him on mapping three-dimensional variations in structure from the get-go. Of course, that's why he didn't know the answer, because nobody knew the answer. It had not yet been determined. It felt like exciting new discovery work every time we would map some observations or I would bring him data and say, "Look at how these vary." He would get very animated, and the feedback was wonderful, so it stimulated you to do more.

We used that mapping in the early stages as part of understanding the nuclear testing differences between the Soviet Union and the United States, where the test sites in the respective countries were in very different tectonic environments. The heterogeneity under each of those regions influenced the strength of signals that spread globally from the explosions and this had led to controversies over whether the Soviet Union was cheating under a yield limitation treaty that had been signed in 1976. By understanding the lateral variations in structure with Don, we contributed to recognizing that the signal variations were actually mainly due to the Earth structure differences. It was fun to have a project mapping out Earth structure that had consequences for the assessment of compliance with the nuclear testing treaty by a party that was, at the time, pretty hostile. The other aspect of working with Don was that he just loved it when you showed him new data. All of his students brought him observations like I did, something that deviated from simple models or some complex earthquake signals, and he enjoyed seeing the data and trying to model the observations to the extent possible with simple representations of the faulting or the structure. When it didn't work, he enjoyed trying to figure out what it would take to make it work, and many discoveries resulted.

From our early analysis of lateral variations, we started to detect waveform complexities that appeared to be produced by 1D structures, part of a layering in the Earth that wasn't in the standard Earth models. When we tried to explain the data, we found that you have to introduce a new discontinuity in the lower mantle velocity structure to account for the observations, and that then became the D" discontinuity that I ultimately focused on in my thesis. But it was in the process of looking at a lot of data for something else - we had been focused on three-dimensional variations in structure - that we serendipitously managed to detect a subtle feature within the observations. By modeling it with Professor Helmberger's codes, we were able to interpret it as a significant new earth-structure discovery.

ZIERLER: Prior to Caltech, when you were at Rochester, did you have a focus in geophysics and seismology?

LAY: No, not much. My major there involving mechanical engineering and geology was called geomechanics, a hybrid degree program. I only had one class that had introduced seismology to me, for perhaps three or four weeks out of the semester. It was taught by Professor Geoff Davies, who had received his PhD at Caltech in geophysics, working with Professor Tom Ahrens doing high-pressure experiments. He wasn't a seismologist, and while he covered seismology in his class, mostly we talked about mantle convection which he was working on then. So I wasn't sure I wanted to be doing seismology when I was an undergrad. I just said, "Well, that's nice. It is quantitative." It appealed to me. My mechanical engineering strengths were useful for understanding the governing equations and their solutions. But when I went to Caltech, I wasn't really sure what I was going to work on. That's why their open-door policy was great. I had some awareness of the field, but nothing in detail. I didn't know these very famous professors. It was like, "Don Anderson? Who are you?" [Laugh] I feel, looking back at it, that I was extremely naive and unprepared.

But the nature of the Seismo Lab was such that it accommodated that lack of awareness well and gracefully allowed you to find your way without a lot of pressure, at your own pace, and with a full exploration of possibilities. I really enjoyed seismology as I learned more and more, and it became very fun and satisfying. I was interested in the theory and could master it, but I soon became an observationalist, using data to make some statement about processes in the Earth. I owe that entirely to the tutelage I got from Professors Kanamori and Helmberger, finding that it was a very rewarding direction. Especially early on, as we were just getting flooded with the onset of new digital data. Nobody had ever looked at it, and we'd never been able to process data of such high quality. Being an observationalist allowed exploitation of the new data and made it very exciting to work in that area.

ZIERLER: Coming from the outside, to some degree, into geophysics and seismology, I'm curious what kind of advice you might have gotten at Rochester and elsewhere, and whether the Seismo Lab had a known reputation for being welcoming, where you didn't have to be an expert on day one.

LAY: I didn't get much positive encouragement. In fact, Professor Geoff Davies was my geology advisor, and like I said, he'd gone to Caltech, but he didn't really seem to have liked it. He didn't encourage me to consider Caltech when I told him I was applying to schools. I think his personal experience wasn't all that positive, but it wasn't in seismology. My other advisors were engineers, and they were all encouraging me to do magnetohydrodynamics, things that they liked. What proved most influential was that I applied to MIT, Caltech, and some other schools, which I got into. It was the difference in the reaction of the schools that impacted me. Back in those days, you couldn't do a Zoom interview. Almost nobody could afford to go and visit. They didn't have visiting programs the way we do now to try to sell the university to the student. You kind of relied on the communiques you got with your acceptance and maybe some letter saying, "You will arrive at this time, and this is how it works." MIT came back and said, "Your advisor will be blah, blah, blah." It was a famous professor, but that meant nothing to me because I didn't know who he was or what he did, nor that I would necessarily want to work in his area. Caltech's letter said, "Show up and talk to people, you will find some interesting projects." It was much more casual.

That very flexibility of the Seismo Lab, manifested in their contact with prospective students from the get-go, was compounded by the fact that I was sick of Rochester winters, and Pasadena was going to be a lot warmer than Boston, so it made it an easy decision. But I admit it wasn't from a lot of knowledge of Caltech from the people around me. Being in upstate New York, maybe not that many Caltech folks were there. But the attitude that was conveyed was, in many ways, more inviting. Some people might like their start in graduate school to be very structured. If they are being told, "This is the person you'll work with, you'll start with them," it perhaps gives them a sure footing. But I was so unsure of what it was that I wanted to do, I had actually simultaneously applied to medical school. An emerging application of mechanical engineering in the late 70s was toward artificial joints and artificial valves, and they needed people who could do fluid dynamics, which wasn't taught in medical schools. Most people going into med school had no mechanics, so if you were a mechanical engineer, you would automatically get accepted to med school with no biochemistry. It was something of a strange situation. All the poor pre-meds were panicking over getting into this or that med school, trying to get A-pluses in biochemistry, which is really hard to do, and I got in without even taking the class. [Laugh] So I had to make a decision: "Do I go be a doctor and work on biomechanics? Or do I go and try this geophysics topic down in sunny Pasadena?" I made the latter choice.

ZIERLER: Given the unique opportunity you had to take a tour of all the research the professors were doing at the Seismo Lab, once you got the lay of the land, what was your sense of some of the big debates that were happening at that point in the Seismo Lab and in the field more generally?

LAY: Certainly, there were several things that were constant topics in coffee hour. One was that, in the mid-70s, the Seismo Lab had been very involved in developing an understanding of the effects of nonelastic processes on seismic propagation, what we call attenuation. These are deviations from the purely elastic F = MA solutions we had. We had to correct for nonelastic processes that kind of suck energy away. These are typically microscopic processes. As a strain wave propagates through rock, dislocations in crystals can move around, and that energy is lost from the seismic wave. Fluids can be pumped around, and that energy is lost. You have to account for those loss mechanisms. What had been done at Caltech was to work out the phenomenological representation and how to use it to adjust our elastic propagation calculations for anelasticity. You had to tune it empirically to map to the data, and it had strange effects in that it affected the travel time of waves as well as the amplitudes. Professors Anderson, Minster and Kanamori had demonstrated that this could, for example, cause you to infer different velocity structures if you looked at short-period body waves versus long-period surface waves. The effects of anelasticity predicted those differences along with the amplitude corrections needed to correctly estimate the source radiation. When I arrived in '78, work was still going on in applying the new attenuation representations. It was transformative, as it reconciled inconsistent models from earlier studies and established that the seismic response of the Earth is frequency-dependent. I found that fascinating and worked with Don Anderson on attenuation in the core, apart from my work with Kanamori and Helmberger, who had already adopted parameterizations for representing attenuation in the types of waves they were looking at. That was an important advance in matching observed waveforms and arrival times.
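
One common phenomenological form of this correction is a path attenuation parameter t*: in the frequency domain, the amplitude spectrum is multiplied by exp(-pi*f*t*), and an accompanying velocity dispersion keeps the pulse causal, which is how attenuation shifts travel times as well as amplitudes. A minimal sketch of the amplitude part in Python, with an illustrative pulse and t* value:

    import numpy as np

    dt, n = 0.05, 1024
    t = np.arange(n) * dt
    pulse = np.exp(-((t - 10.0) / 0.5) ** 2)   # toy elastic pulse

    tstar = 1.0                                # illustrative t*, s
    f = np.fft.rfftfreq(n, dt)
    spec = np.fft.rfft(pulse)
    spec *= np.exp(-np.pi * f * tstar)         # anelastic amplitude loss
    attenuated = np.fft.irfft(spec, n)

    # The attenuated pulse is smaller and broader than the elastic one
    print(pulse.max(), attenuated.max())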

I think the bigger context of change was the transformation from analog to digital data that was happening. I was involved with Professor Kanamori and Jeff Given, who was a graduate student, in what I think was one of the first studies using digital recordings to study an earthquake space-time rupture. This involved an earthquake in the Gorda Plate off of Northern California. That established for me how much better quality the data were than the hand-digitized data I'd been working on for other studies. That analysis also indicated the future, as it suggested that, "You're going to be able to automate this," because these digital data would be coming quickly through various transmission procedures. Rapid finite-fault modeling would be possible within a few hours rather than spending months to do the analysis, as we were doing in these early stages. You could feel that that was going to transform the field, and over the next decades it did.

Another topic of the time was the raging controversy over nuclear testing. The Reagan administration was pounding the table asserting that the Soviet Union was violating the 1976 Threshold Test Ban Treaty. At coffee discussions it was made clear that the main players for underground nuclear test monitoring are seismologists. It is a key function they serve for this critical national security issue. Assessing whether the administration was misinformed by earlier paradigms for how seismic waves would behave for one-dimensional Earth models was a big topic.

ZIERLER: I asked about debates. What was your sense at the time of some of the settled science, debates from previous academic generations that, by the time you got to the Seismo Lab, were basically understood and accepted?

LAY: I think seismologists had settled an early controversy over what would be the right force representation for a shear dislocation like faulting. There had been ambiguity over whether it was a single or double couple. That had been resolved. Professor Helmberger had already convincingly demonstrated that routine analysis procedures used for early seismic recordings involving simply computing an amplitude spectrum and looking at the shape of the spectrum to characterize the source was grossly inadequate. The simple analysis needed to be modified to include the effects of the waves going up from the source, reflecting off the surface, and traveling down, interfering with the direct waves. This was best achieved by synthesizing the complete ground motion in the time domain. At the time, there weren't that many institutions in the country with strong research programs in seismology, perhaps a couple dozen. Now, the Incorporated Research Institutions for Seismology, a consortium supported by the National Science Foundation, includes all ~120 PhD-granting seismology programs in the US. A large number of those include graduates from Caltech who have found faculty jobs as programs expanded. There's now a much greater extent of capability across the discipline, which has been great because the person-power had to expand in order to exploit the bountiful new digital data and the advancing computer capabilities, which allow you to automatically process more and more data with greater resolution. That big expansion happened after I left Caltech, and has persisted throughout my career. There had only been a handful of Don Helmberger Ph.D. students who had graduated before me. Many more were to come and have had very successful faculty careers elsewhere. I was just in the early stages of that.

ZIERLER: What were some of the most important instruments at the Seismo Lab when you were there?

LAY: Since the 1930s, Caltech had been operating a handful of instruments around Southern California, which had been used by Professor Charles Richter to introduce the magnitude scale. The Seismo Lab had slowly been augmenting the network in collaboration with the USGS into a distribution of stations that were, at the time, just transforming to digital recording and telemetry to the Caltech site. There were then, I think, on the order of 80 stations of the Southern California network operated jointly by Caltech and the USGS. Those signals were coming in and being displayed on an instrument you may not know called a Geotech Develocorder, which was a display that showed the wiggles as they were being received, and they were simultaneously being written to, I believe, initially, paper tape. But while I was there, they were starting to record on 16-track tapes, big reels of magnetic tape. The network was in the process of transformation to digital recording, and that, of course, immediately made the data much more accessible.

When digital recordings became available, you suddenly had large quantities of pretty robust data. Still glitchy with telemetry dropouts from the telephone lines or the microwave transmission, but good enough to systematically process for seismic wave analysis. The network analysts were taking those signals and starting to routinely monitor lower and lower thresholds of seismicity in Southern California. That had commenced before I got there, but it was accelerating in capability while I was there. It made the lab an even more important earthquake center. Caltech had always been known in the Los Angeles area as a source of earthquake information, but the new activity with the USGS really codified that. Any moderate earthquake in Southern California would bring a lot of media to the Seismo Lab. Most famous was Connie Chung, who was a very beautiful reporter.

When she'd come in, some of us graduate students would all note, "Oh my gosh, she's so beautiful." She would always talk with Professor Clarence Allen, who was a really good spokesperson, and a geologist working in the Seismo Lab who knew a lot of earthquake science. We envied Clarence getting to spend time with Connie Chung. But the lab was certainly a hub for disseminating earthquake information. That was the most visible instrumental activity. Of course, there were instruments out in the field, but many of the students would never go out and see them. There was a pool of technicians that deployed and maintained them. There wasn't initially that much engagement of the graduate students, in part because the faculty tended not to work with the regional short-period data very much in those days. Bernard Minster did involve students in the use of the data, and increasingly it was used for earth structure studies by my classmates Tom Hearn and Marianne Walck, who both demonstrated the power of a large aperture seismic array to resolve Earth structure. I worked with some of the data on core-mantle boundary structure as well. We were all working on the computer using the signals that had been recorded and sent to the Lab, not actually going in the field to learn the details of the systems and the technical interfaces they had to various telemetry systems.

ZIERLER: It's going to sound like a long time ago, but what did computers look like, and how did people use them at the Seismo Lab when you were there?

LAY: That did transform tremendously, even in the five years that I was in the Lab. When I first got there, there was a main campus IBM computer. It was over in a different building. In order to run your code, you would write your mostly Fortran code–I still code in Fortran because it's powerful and I have a lot of legacy software–and you would punch the command lines into cards that were 7 3/8" x 3 1/4", and each card would have basically a line of code or some data. You would have one or more big boxes filled with cards, then walk them over to the computer center, where the program and data cards were read through a card reader, and perhaps nine times out of ten, it would read correctly. Every once in a while, a card would shred, and you'd have to do it all over. But that would enter the code into the memory of the IBM, where it would be put into some queue, and over the next few hours, it would get around to running your code. It would then output a graph on a plotter, a printed output file, and sometimes a card deck.

It was surprising, but it worked. You just felt nervous carrying these big boxes of cards across campus. If you tripped and fell and the cards got shuffled, it would be a mess. But it was rather inefficient. I did that for the first year or two when I was working on a big project with Professor Kanamori. I digitized hundreds of hours worth of surface waves. Perhaps that's why I wear glasses today; it was hard on the eyes. The process was cumbersome. But in my second year or so, we got our first local Prime computer. It was an in-house Seismo Lab mini-computer that hooked up to simple terminals around the computer room. That gave us local control over computation and queuing. We were no longer in line with everybody else in physics and other programs at Caltech. The Prime had some primitive user interfaces. You could, for the first time, play interactive computer games. There was a game called Star Trek, and your little ship would move around in space by entering commands on a keyboard. It was fascinating because you were actually interacting with the computer through the interface (typically late at night).

A big advance was that the mini-computer enabled us to make local plots and print outputs, and even digitizing went directly into the computer from a big table where you traced the seismograms. This eliminated punch cards and was great for speeding up the science. The regional network processing used a VAX system, receiving and processing the telemetered regional seismic recordings, but I never worked with the VAX computers directly, we just ported the signals of interest over to the Prime. There was also an early Apollo system and then SUN workstations came in toward the end of my time at the lab. I used a simple Macintosh for word processing and was in the first group to not have my thesis manually typed. So my experience was right in the transition from big centralized computers of the 60s and 70s generation to moderate-sized distributed computers and eventually workstations that every department started to acquire.

ZIERLER: In the way that nowadays, data is so decentralized, not institutionally housed at any one location, when you were at the Seismo Lab, was that not the case? In other words, was the Seismo Lab really an archive of data, so much so that it would be a magnet for researchers beyond Pasadena?

LAY: To some extent, it was, because there were a limited number of institutions, including the Earthquake Research Institute in Tokyo, Caltech, Lamont, Berkeley, and the USCGS (the U.S. Coast and Geodetic Survey), that had large archives of global seismic data. The USCGS at the time was collecting the paper records from the World-Wide Standardized Seismograph Network, typically on photographic paper, and microfilming them. They sent out microfilm copies on reels or 70 mm film chips to the academic research centers for up to 120 WWSSN stations that had been deployed in the early 1960s, mainly to monitor nuclear testing in the Soviet Union. That global network served as the main global seismic data source for ~20 years, from the early 60s up until when I arrived at Caltech. The Seismo Lab had, in a basement level, a room filled with microfilm and a microfilm printer. Each 70-millimeter film chip had a picture of a record from a full day. There were many thousands of microfilm chips there for the 120 stations, each operating six different instruments: short-period and long-period, three components each. A lot of film chips for every day, continuously, for up to two decades. The images had many lines that had spiraled around a rotating drum. You could print a portion of the seismogram, and then go digitize it. Caltech had a very good collection, and it was well maintained. Researchers would come from other schools that did not have such complete collections, so the WWSSN data were widely used. It was great to have the data in-house. I enjoyed working in there because it was like exploring. You'd go in and you never knew what you were going to find. Don Helmberger was often down there looking at data as well.

When the analog recording systems were replaced by digital recording, it was much easier to remotely get digital recordings on magnetic tape, or subsequently exabyte tape, or CD-ROMs, evolving through the years. That greatly facilitated data access. Now, the global stations operating around the world, about 3,000 stations, are continuously sending their data in digital form for all the instruments they have at each site, 24/7, to various data centers like the U.S. Geological Survey National Earthquake Information Center. The data are processed there to build earthquake catalogs, and then the data are sent to the digital archive run by IRIS, located in Seattle. When an earthquake happens, I can simply log onto that data center and extract thousands of seismograms from multiple data centers around the world in close to real time.
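
Today that extraction takes only a few lines of Python with, for example, the ObsPy library talking to the IRIS FDSN web service. The station code and time window below are just an illustration:

    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    # Illustrative request: one hour of vertical broadband data from
    # station ANMO (Albuquerque) starting at the 2011 Tohoku origin time.
    client = Client("IRIS")
    t0 = UTCDateTime("2011-03-11T05:46:24")
    st = client.get_waveforms(network="IU", station="ANMO", location="00",
                              channel="BHZ", starttime=t0, endtime=t0 + 3600)
    st.plot()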

This makes it remarkably easy to access global recordings of ground shaking as soon as the signals of interest have propagated through the Earth. It takes some time for signals to propagate through the planet, and that is the main latency in the process. Once the ground at a station has shaken, that signal is available to analyze, and fully automated procedures like those used by the USGS rapidly locate events, measure magnitudes, and determine the faulting geometry or force system at the source for research purposes. The catalog and data are added to a continuous archive by IRIS. You can extract signals for whatever time interval, with whatever attributes of the stations you want, distances or type of instrument, do detailed analysis of the data, and never leave your office. It's pretty remarkable. Relative to many disciplines, the history of seismology is such that no country had enough globally distributed seismic stations under its direct control to record all the signals it needed to monitor earthquakes and other events of interest. As a result, there was intrinsically a need for sharing seismic data, transcending political boundaries. It proved useful to make all data available because it provided access to sufficient data for your own national earthquake hazard or tsunami mitigation efforts. That practice, which was established in the 1880s when seismologists first started to put stations around the world, persists today. Most seismic data have open access. It's a concept deeply ingrained into the psyche of seismologists around the world. It's even true today that if you log onto IRIS, data from five out of six stations running in Russia are still being transmitted continuously, despite current political tensions. Most of our national data are openly available to everybody as well. That is hopefully the way that most scientific data exchange will become in the future. Some geophysical measurements, such as precise positions obtained by geodesy, are militarily sensitive, but open data policies from seismology have helped to encourage open-data policy for other disciplines. When US researchers want to go and put more instruments in a particular area, say following a big earthquake, it's usually straightforward, depending on the country, to make the data available to the whole world. Sometimes, because of the work that goes on, the cost and effort, researchers have a moratorium on releasing the data for as much as two years, giving them a first shot at processing it before it's totally open. But eventually, it's all open.

ZIERLER: With everything that was going on, what was the process of determining what you would work on for your thesis?

LAY: The process was influenced by the fact that the semester after your first year, you had to present a couple of research projects, which we called propositions. I had to do three of them. Nowadays, only two are required. Those are projects you would develop with different prospective advisors, different faculty. Typically, they would be a project on which you had made some progress toward posing and answering a question. The point was to show that you had some experience, that you'd tried a couple of different things, at least three of which were presentable. I think I had worked on about five projects and ultimately presented my best three. Furthermore, it was kind of a checkpoint for the faculty to see whether a grad student was just mechanically going through the work, or whether they really understood what they were doing. The faculty were seeking to assess: "What is your potential to do independent research? Based on what you've done with these three different projects, how much do you understand of what you've done, why you're doing it, and where it could lead?"

From the various open-door interactions that you had with the faculty, there was a hierarchy of more promising things that had been panning out. When you worked on it for several months, you started to get some answers that looked OK, you were understanding what you were doing, you could present it, first, to the graduate students, then to the faculty in separate presentations to demonstrate that you were capable of doing research, that you understand the work you had done so far, and you weren't simply parroting something and kind of winging it. After you do the Orals presentation, typically one or two of the projects form the nucleus of your thesis. Of the three projects I worked on, one was modeling variations of structure in the upper mantle and its relationship to nuclear-monitoring questions, another was analyzing large earthquakes in the Solomon Islands, the third was looking at the inner core boundary and the frequency dependence of reflections from it affected by attenuation. Of those three, I published the first two eventually, and I followed up on the work on the deep mantle directly from one of those projects, and that became my thesis.

At the same time, there were a bunch of other projects I pursued working with Professor Kanamori, which all got stapled together with my thesis in the end. It is very typical that your introduction to seismology is sort of project-based, you have a targeted topic that's doable for a student on the time scale a student has, but if you do several of those, you end up with a collection of works that you can assemble into a thesis. If you're doing a really theoretical thesis, that's a more conventional, "Here's the problem, here's the approach, here are the derivations, here's the solution." Some people do that kind of thesis. But that was typically rare, maybe one in ten that would be of that mode. Most of them were different topics, often loosely connected, sometimes pretty disconnected. And that was acceptable. The thing was that Caltech has really good students, so there was intrinsic competition amongst the students. They didn't want to look stupid, so you would try to produce, and hopefully it would be interesting enough work that when you presented it, it raised interesting questions and would go somewhere. You did not want it to elicit, "So what?" It was a very organic process.

I suspect, from talking with other former students, that everybody had a different experience, but most of us took the oral exam process as a screening to identify what we were most interested in, then pursued it beyond that. It was rare that you'd completely drop those three propositions and find something else because you'd spent substantial time on them. Of course, there were projects that failed miserably. One of the fun things of working with Don Helmberger was, sometimes he would give you an idea that was kind of wild, and you'd get excited and spend a couple of weeks on it, then you'd come to the realization that it wasn't going to work. [Laugh] "Let's not pursue that anymore." Everybody had that experience. There were things that didn't work, but you learned by that. You'd start to develop an instinct for what was going to be a viable research project within your improving understanding of what types of data were out there, how good those data would be, and whether you could hope to solve a problem with the data limitations we had. You don't initially know that. But you find it from trying and failing. Then, there were cases where a project may not have been particularly well-defined, but the data were really interesting and unfolded in unexpected directions. That's how my thesis came about. I did not set out to work on deep structure in the mantle. That was serendipitous. But it was having targeted, doable projects lead me through the data that let me discover something that hadn't been seen.

ZIERLER: Was your research experience really an island within the Seismo Lab, or were you more broadly integrated in GPS?

LAY: The unfortunate reality was, at the time, there were very strong-willed faculty across GPS. Some were in sort of bitter personality struggles. The egos involved made it such that it was very intimidating for a student from the Seismo Lab to go over and work or even take the classes with the geochemists. You didn't feel very welcome. I missed the opportunity, though as I got more senior, I overcame my fears, I suppose, and got to know some of those faculty. But I didn't really have a chance to work with Gerry Wasserburg or others in geochemistry, mainly because there was sort of a strong barrier between North and South Mudd. It was difficult for geochemists to broach by coming to the Seismo Lab, and the reverse. People are people. Egos, even at Caltech, can be rather strong. There was some tension there. Some students overcame that and weren't affected by it, but I felt a bit intimidated because I didn't have any natural interest or inclination to explore geochemistry. I wish I had, because there were times in my career when, say, interpreting lower-mantle structure, I had to work very closely with mineral physicists and geochemists in trying to interpret it, and I didn't have the foundations personally to do it on my own.

That was one of the diminishing aspects of the Seismo Lab experience, but the Seismo Lab itself was so fertile that you didn't feel a lack. I didn't feel a strong compulsion to try to learn geochemistry, and never got rebuffed. But there were disincentives to try to overcome that energy barrier on my own without a strong motivation or need to do so. I think that's greatly diminished now. There's more communication. I served on the advisory board for GPS for about 12 or 13 years, and I saw that in the late 90s to early 2000s, a lot of that changed with turnover of very senior, strong-willed, excellent faculty who had kind of made their little fiefdoms. The younger people coming in weren't wedded to that and recognized, "That's not really the way science is working at, say, the American Geophysical Union."

We don't have this total stove-piping anymore because the nature of the problems and the disciplines is such that cross-disciplinary communication and cross-fertilization is essential for making progress. I think it's better now. I've not been spending a lot of time in the Seismo Lab of late, but it was definitely improving relative to the conditions when I was first there. Every department has strong personalities, so GPS was not unique in this. I'm in a department where there's a much smaller percentage of faculty in geophysics than was the case at Caltech, but congeniality has always been the standard here. Communications between disciplines and the ability to work with faculty in other areas have been much more seamless than was the case originally during my exposure to Caltech. But it was what it was.

ZIERLER: What were some of the key conclusions or research themes in your thesis that may have put you on a particular track after you graduated from Caltech?

LAY: Well, based on my thesis work, I became a faculty member at the University of Michigan, largely having scored a fairly big scientific breakthrough in deep-earth structure. That was what I was recognized for. It didn't change anybody's life; it wasn't profound. Nobody really cared except geophysicists studying the Earth, for whom it was exciting. There was almost nobody in the US, maybe one or two faculty, looking at deep-mantle structure at the time I was working on it with Don Helmberger. There were a couple of programs working on it in Europe. After my work, it began to become a big thing as part of the unveiling of three-dimensional structure that was going on by methods like seismic tomography, which was imaging the 3D structure at very long wavelengths, while I was working at short wavelengths.

Both scales were important new attributes of the structure of the interior that was rapidly moving us to a better understanding than we had from our earlier one-dimensional models dating back to the 1940s. I was riding on the momentum of that, and my initial student supervision capitalized on the fact that there was a lot of accumulated and incoming data. You could have students looking at difficult parts of the world trying to augment what I'd done in my work with Professor Helmberger in my thesis. And there were a lot of low-hanging fruit to pick. Once you knew what to look for, you could go and find it. Discovering what to look for was the harder part. I was able to pursue many follow-up studies.

Of course, I also sustained the interest from my experience working with Professor Kanamori on big earthquakes. The events we studied had happened in the early 1970s and were recorded on analog recordings; hence, I had to digitize them. I was really interested in big earthquakes, and Hiroo's procedures for studying very big earthquakes were the best anybody had done. And digital data were now making the analysis much less tedious. But the Earth stopped having really big earthquakes for some time. After my arrival at Caltech, the Earth went into a lacuna of almost no really big earthquakes around the world for two decades. If we look back in time at the number of big earthquakes through the past century, where we have pretty complete detection of them all, it was a very low-occurrence interval. Perhaps it was random misfortune that the earthquakes weren't happening. It's actually good that they didn't happen because if they had, we didn't yet have all that many digital instruments out to capture them. During that lacuna, which persisted up until almost 2000, we had only a few magnitude-8 earthquakes globally.

There were a bunch of magnitude 6s and 7s, but they weren't great earthquakes of the kind I was interested in, bigger than magnitude 8. During that time, fortunately, seismologists had been putting out more and more of these upgraded force-feedback digital seismic recording capabilities and transmission capabilities. The seismic network was getting better and better. It was recording the magnitude-7 earthquakes, and we could study those, and many people did, developing improved analysis procedures. But the network was almost entirely in place before big earthquakes kicked back in. At the same time, GPS had been invented, and geodesists had started to put out GPS sensors around the world in increasing numbers. They had been monitoring slow plate motions and the buildup of deformation before the big earthquakes, then they captured the big earthquakes when they came. Thus, a technological renaissance was happening during the absence of big earthquakes, preparing us for recording the signals when they did kick back in.

Also, tsunami recording advanced greatly. Oceanographers started to put sensors in the deep ocean to record the pressure at the bottom of the ocean, which tells you how much water is above that site, providing recordings of the passage of tsunami waves. Those were deployed for the first time in the late 90s, and we had a number of them, slowly building up more and more.
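To make the principle concrete, here is a minimal Python sketch of the bottom-pressure idea Lay describes, assuming hydrostatic conditions and representative (not measured) values:

```python
# Minimal sketch (not any agency's actual code) of the principle behind
# deep-ocean tsunami pressure sensors: under hydrostatic conditions the
# bottom pressure measures the weight of the water column above.
RHO_SEAWATER = 1025.0  # kg/m^3, representative seawater density (assumed)
G = 9.81               # m/s^2, gravitational acceleration

def water_column_height(pressure_pa):
    """Equivalent water-column height (m) from ocean-bottom pressure (Pa)."""
    return pressure_pa / (RHO_SEAWATER * G)

# A tsunami appears as a tiny change in equivalent height riding on a
# ~5,000 m column; the anomaly values here are illustrative only.
baseline = water_column_height(50.3e6)             # ~5,000 m of water
during_wave = water_column_height(50.3e6 + 2.0e3)  # +2 kPa pressure anomaly
print(f"sea-surface anomaly: {during_wave - baseline:.2f} m")  # ~0.20 m
```

The point of the arithmetic is that a passing tsunami changes the equivalent water height by only tens of centimeters on top of a column several kilometers deep, which is why such sensitive pressure gauges were needed.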

In 2004, the great Sumatra earthquake occurred, which ruptured a 1,300-kilometer-long fault. It was a magnitude 9.2 event that killed over a quarter-million people, mostly through the tsunami that was generated. It was fairly well-recorded by campaign GPS measurements and by tsunami sensors, although only a few were operational for that event. The earthquake was recorded globally by seismic recordings that stayed on scale, digitally capturing the full motion for the first time for a magnitude-9 earthquake. I worked on that event from the minute I heard about it. It was sort of this bonanza of all these data we'd been waiting decades for. It was a kick-in-the-teeth big earthquake, and there was so much data. It was frustrating that in the first day, the methods various groups had tuned to handle the magnitude-7s that had been happening, events that are over in less than ten seconds, all fell short. This event went on for something like 450 seconds. It overwhelmed all of our techniques for analyzing the signals, and we had to retool everything to deal with such a huge earthquake.
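For a sense of scale, a minimal sketch of the standard moment-magnitude relation, with one common choice of constant (conventions differ slightly), shows why methods tuned to magnitude 7s were overwhelmed:

```python
# Minimal sketch of the moment magnitude relation (Mw defined from seismic
# moment M0 in newton-meters; the additive constant differs slightly among
# conventions), plus a rough cube-root rule of thumb for rupture duration.
def moment_from_mw(mw):
    """Seismic moment M0 (N*m) from moment magnitude Mw."""
    return 10 ** (1.5 * mw + 9.1)

m70, m92 = moment_from_mw(7.0), moment_from_mw(9.2)
print(f"moment ratio, M9.2 vs. M7.0: {m92 / m70:.0f}x")         # ~2000x
# Duration scales roughly as the cube root of moment, so a rupture that is
# over in ~10 s at magnitude 7 stretches to minutes at magnitude 9+.
print(f"rough duration factor: {(m92 / m70) ** (1 / 3):.0f}x")  # ~13x
```

The cube-root scaling is only a rule of thumb; the 2004 Sumatra rupture ran even longer than it suggests.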

We just had not had earlier examples of it. Within a couple of months, we published several papers in Science that unveiled a magnitude-9 earthquake with all the seismological data. By then, we were able to fully exploit the broadband signals for the long rupture, after adapting our codes, and that was tremendously exciting. Since that time, I've been working on big earthquakes, along with deep-earth structure, finally fulfilling the excitement that I'd gotten from Professor Kanamori, and I've been able to work with him on many events. I'm still learning from him in every interaction. We've been able to apply the improved techniques to digital data for many events. From 2004 to 2015, the number of magnitude-8 earthquakes occurring around the world per year doubled relative to the prior century, and it was up by a factor of five relative to the low interval from when I became a graduate student in 1978 up to 2001. Every six months or so, we were having a magnitude-8 earthquake with some new story to tell from all the data we had. It just was a very exciting time. I worked with many students, post-docs, and colleagues to try to exploit all of the seismic, geodetic, and tsunami observations to come up with models that are much better than what we were able to do back with my hand-digitized datasets. The early studies almost look embarrassing now, but you can see the progress, so you can feel good about that, if nothing else.

ZIERLER: For the last part of our talk, a few broadly retrospective questions about your career in the Seismo Lab. First, it's been an underlying current in our conversation, but just to ask the question directly, what did you learn at the Seismo Lab in your approach to the science there that has stayed with you ever since?

LAY: I think the thing that's stuck with me, that anybody who looks at my body of work would say, is, "He became an observationalist at Caltech, and he stayed an observationalist." I've enjoyed the fruits of the labor of the people who developed the instruments, deployed them, and made it all available. I wasn't a participant in that directly, though I've helped to run organizations that funded it and did it, but I wasn't doing any of the real work. That observational enthusiasm is boundless. I could keep going. I still get excited about any study I do now when there's a new event with a new set of data, and you don't know where it will take you. There will often be unexpected attributes. Almost every big earthquake I've looked at has been interesting or weird in some way. But we use the same techniques to look at this recent Tonga eruption. We can analyze the signals, and it's complicated and weird, but we believe we can understand the process pretty well. I'm as enthusiastic about turning those observations into interpretations of the underlying physical process as I was from the very beginning. The realization I had was that that was going to be more rewarding to me than working on theory or abstract models for aspects of mantle convection or magnetohydrodynamics of the core. I was well-prepared to do those, but the data never really would've been there, so it would've been theoretical and computational. What turned me on about seismology, and what sustains me, is that it's driven by data recording the ground shaking, and turning that recording into information about the source is fun and very rewarding.

ZIERLER: I asked you about settled science and debated science when you were at the Seismo Lab. How have those conversations since turned out in the intervening years?

LAY: I didn't get into all the things we're still debating that were topics then, but we began to recognize that there was a lot of heterogeneity in how earthquake faulting happened. That didn't surprise people. The geologists have long gone out in the field and seen complicated faults. Seismologists recorded that complexity and demonstrated that the ground-shaking is really complicated. And we know that F = MA takes us from the source to the recordings. The forces that are operating are very complicated, as is the distribution of slip, the associated fracturing of rock, and stresses. There are models derived from experiments in the laboratory that tell us how frictional instabilities may occur when rocks are slipping on faults. But really figuring out the absolute stresses involved and the correct prescription of the process that connects the micro scale to the macro scale is still an area of frustrating uncertainty. And it's going to be one that's very difficult for seismology on its own to resolve.
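For readers unfamiliar with the laboratory models Lay mentions: they are commonly written as rate-and-state friction laws. A minimal sketch of the widely used Dieterich-Ruina form follows, with parameter values that are purely illustrative assumptions, not measurements for any particular rock:

```python
import numpy as np

# Minimal sketch of the Dieterich-Ruina "rate-and-state" friction law often
# fit to laboratory rock-sliding experiments. All parameter values below
# are illustrative assumptions.
MU0 = 0.6            # reference friction coefficient
V0 = 1e-6            # reference slip rate (m/s)
DC = 1e-5            # critical slip distance (m)
A, B = 0.010, 0.015  # rate and state sensitivities

def friction(v, theta):
    """Friction coefficient at slip rate v (m/s) and state variable theta (s)."""
    return MU0 + A * np.log(v / V0) + B * np.log(V0 * theta / DC)

# At steady state, theta = DC / v, so friction changes with log slip rate
# as (a - b); a - b < 0 ("velocity weakening") is the classic laboratory
# condition for the stick-slip instability used to model earthquake nucleation.
v = 1e-5
print(f"steady-state friction at v={v:g} m/s: {friction(v, DC / v):.4f}")
print(f"a - b = {A - B:.3f}  (negative: potentially unstable sliding)")
```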

Resolving that has to be done in tandem with the work in rock mechanics and hydraulics, trying to understand the role of fluids. That challenge was present in my graduate school days, and it's still there today. What we have done is really unveil much of the complexity that's involved, but just the observation of complexity has not necessarily revealed the causal physics. That will persist. I think it's going to be difficult to answer these questions. It goes to the heart of predictability. Can you ever predict earthquakes? Or is the process so nonlinear and so hidden from us that we can't observe the things that would need to be observed in order to predict them? That's still an outstanding question for the discipline. It was certainly around when I was a grad student. We were talking about earthquake prediction a lot in the Seismo Lab during coffee, but mostly with a skeptical eye. And I think we were right; it's going to be a very hard problem to develop predictive capabilities. At the same time, the tremendous progress in technology has enabled early warning systems for earthquake shaking, although that is not prediction. It's a kind of post-diction. The earthquake happens, but you detect it as soon as it happens, and you can phone ahead at the speed of light and say, "The waves are on the way and will get to you pretty soon."
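The arithmetic behind "phoning ahead" is simple enough to sketch. The following minimal example assumes representative wave speeds and latencies; it illustrates the principle, not any operational system:

```python
# Minimal sketch of the early-warning arithmetic: the alert travels at
# effectively the speed of light, while damaging S waves travel at a few
# km/s. Wave speeds and latencies are illustrative assumptions, not
# parameters of any operational system.
V_P = 6.0  # km/s, assumed average P-wave speed (used for detection)
V_S = 3.5  # km/s, assumed average S-wave speed (strong shaking)

def warning_time(epicentral_distance_km, detection_distance_km=20.0,
                 processing_latency_s=5.0):
    """Seconds of warning before strong S-wave shaking arrives."""
    alert_time = detection_distance_km / V_P + processing_latency_s
    s_arrival = epicentral_distance_km / V_S
    return s_arrival - alert_time

for d in (30.0, 100.0, 300.0):
    print(f"{d:5.0f} km from epicenter: ~{warning_time(d):5.1f} s of warning")
```

The numbers show the well-known tradeoff: sites close to the epicenter get little or no warning, while sites hundreds of kilometers away can get tens of seconds.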

That is still an amazing accomplishment: having the technical capabilities set up to translate ground-shaking information from seismology into a practical, potentially life-saving tool. But it doesn't answer the question of whether you could've predicted the event before it began. There are still big questions in the field that will keep the pressure on for people to develop new theory, new observations, and new experiments that will lead us to greater insight. I'd say the other thing that's really transformed in the field since my day is that because there's so much data, it's easy to drown in it, in many different ways. As a graduate student, I could go to the GPS library and read every journal with relevant articles in seismology each month. I could read the complete articles. Now, I can barely read a tenth of the abstracts of all the articles. You cannot keep up. There's a flood of information. And that is just the published veneer skimmed off the top. Trying to keep up with the flood of seismograms coming in and the parallel improvements in technologies, machine-learning capabilities that are trying to process the data, and so on, is very daunting.

To be frank, if I were coming into the field today and saw the volume of material I'd have to become familiar with to be competent, I'm not so sure I wouldn't walk away and go do something else. At the time I started, it was manageable. Right now, when my students come in, there's a strong tendency for them to want to specialize down a rabbit hole. You want to be good at something, you want to contribute something distinctive that is your mark, but it's getting to be difficult to define what that will be and make it matter. It's different from my experience, and the challenges are definitely evolving. The instruction and cultivation of the next generation of students that will make further breakthroughs is more challenging than it was. Professor Kanamori, Professor Helmberger, and Professor Anderson could just lay out some stuff at the coffee table at the Seismo Lab, and you'd go, "Gee whiz, if I work on that, I'll be able to make important people understand that I'm capable." That's harder now.

And it's the natural evolution of any discipline that matures, and seismology has matured a lot. I used to always say, "I'm glad I didn't go into biology." You'd be one of hundreds of thousands of people out there. Are you going to be the one who does something that anybody else would care about? That's not clear. Seismology's not there yet, but it's a little bit closer to that than it was when I started. I do nostalgically see it as though I got to go through much of the golden age of seismology. That probably started maybe ten years earlier with the breakthroughs of plate tectonics. People who had that experience and have continued to today really saw it all. I got in there a little bit after that, but for the seismic quantification of signals, as I described before, I really got to see the highlights of the advance of the field, to engage it, to exploit it, and to have fun with it. The next generation's going to have to find things that are equally exciting. I'm not sure it lies in implementing new machine learning to make it all automated so that you don't have to think or look at the data. And that's a challenge: to keep motivating really capable people to come in and do something novel and exciting, rather than technical people who are just programming it.

ZIERLER: To the extent that you've been able to keep up with the Seismo Lab over the years, visit from time to time, how has it changed since you were there as a grad student?

LAY: Fortunately, when I go there, Professor Kanamori will still be there. He's emeritus, of course, but it's wonderful to see him. I correspond with him every week, sometimes every day, when an earthquake has happened. We're working together on the Tonga eruption right now. Many of the other faculty have passed on, so there's a big turnover. I've seen and know all the younger people who have been hired, and I've been able to follow the career development of many of them. I end up writing a lot of letters of recommendation for medals, honors, and such for them. I love that the Caltech environment has continued to draw in really exceptional people who keep moving the discipline forward. Major breakthroughs have emerged from Caltech in machine learning, in improved ways to analyze complex earthquakes, and in improved understanding of the uncertainties in our models. There have been many really profound, valuable contributions from the next couple of generations of students who have come through, whom I've been able to intermittently follow, and I certainly read a lot of papers from folks there.

It's really healthy. It's very different. But whenever you go back to someplace you have nostalgia for (I went back to Rochester after 40 years or so, and just walking around the campus, I thought, "Ah, these buildings look the same"), you kind of flash back. Same type of thing when I go to the Seismo Lab or to coffee. You look around, and everybody's new, and many things change. You always think, "I liked it better before, because it was familiar to me." But I can't help but sense that it's very healthy. It's been impressive to sustain that. There have been good people who have come and left. But mostly, they've held on to really strong people. It's kept the Seismo Lab very important. It is true that, relative to when I was there, when there were just a handful of leading institutions in seismology, there's a lot of good stuff going on elsewhere now. There's far more competition, effectively, to be a leading group, and that means you have to choose wisely the faculty who are going to do something really interesting and different rather than what might be happening at, say, UC Santa Cruz.

You want to have some distinction, and Caltech's continued to have the insight in its faculty selection that has paid off, very much as it did when I was a grad student. I could see the great faculty they had and have had, and they've continued to bring in people of exceptional capabilities and contributions. That's sort of an institutional insight that, believe me, is not shared everywhere. It's very difficult. I've been in really good programs in my career, University of Michigan, UC Santa Cruz, both very strong in Earth sciences, but I've observed many programs that sort of dissipate because of internal fiefdom attitudes or whatever, with faculty that don't say, "It's more important to hire somebody better than me." That's what you should do. Hire somebody better than you, and the place will be fine. Caltech always had that attitude, and they've been able to sustain it.

ZIERLER: Finally, last question, looking to the future. In the way that, as you reflected, you came of age during really a golden time of seismology and geophysics, to the extent that all young people should feel that because that's how progress happens, for people now, undergraduates, graduate students who are interested in earthquakes, interested in plate tectonics, what are the big questions for them as they look to chart their careers?

LAY: That issue arises pretty much every time I talk to prospective students. I share with an incoming graduate student my own uncertainty at that stage: "Should I go to med school? Should I work in geophysics?" What you do is make choices that constantly constrain your options. You decide not to go to med school, you're not going to do it 20 years later. It's not going to happen. You cut off options. Is selecting geophysics a good choice relative to other things you could do? You could go into computer science, go work at Google, whatever. The health of the field is such that I still feel pretty confident when I advise students, "Yeah, you can have a good career." It hasn't turned into a second-order discipline, meaning one where you're just working on further refinement of well-understood ideas: we'll get a little bit faster, a little bit more precise, a little bit better, but it won't transform again. I don't think that's the case. Even looking back ten years ago, when we weren't really yet coping with the flood of data, the machine-learning revolution and the enhanced computing power coming with graphics processors and extended access to exabyte-scale machines have kept up and enabled new strategies for dealing with the vast amounts of data that I could not have envisioned. And I wouldn't necessarily have been saying, "That's going to happen in the next ten years, so you want to get on board and be a player in that." It's very difficult to anticipate. But a healthy field keeps doing that, and seismology's done it amazingly.

One of the most profound advances in seismology, I thought, was when people recognized that every single bit of the recording is useful. I used to say, "I really like it when the ground shakes a lot, and there's a big wiggle. I can analyze that. I can say something." But you can also analyze every interval when the ground is not shaking much. That actually has information, and they discovered this through what is called analysis of the noise: the noise is rich in information about small deformations generated by tides, the wind impacting the mountainside, or the wind blowing on the trees. Every single bit of information recorded by the seismometer can be used. And that just never was apparent to me until those techniques were introduced for noise correlation and extracting that information. But it shows that you can keep data mining further and further, and these are not incremental gains. It opened up entirely new capabilities for high-resolution imaging that we thought we would not be able to achieve.
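A minimal sketch of the noise-correlation idea on synthetic data follows; the station records, noise levels, and delay are invented for illustration, and real workflows include preprocessing steps omitted here:

```python
import numpy as np
from scipy.signal import correlate

# Minimal sketch of ambient-noise cross-correlation on synthetic data:
# stacking cross-correlations of concurrent "noise" at two stations
# converges toward the inter-station travel time. Real workflows add
# bandpass filtering, spectral whitening, and one-bit normalization.
rng = np.random.default_rng(0)
fs = 20.0                # sampling rate (Hz)
n = int(20 * 60 * fs)    # 20-minute windows
max_lag = 400            # samples of lag to keep (+/- 20 s)
true_delay = 120         # assumed inter-station delay (samples)

stack = np.zeros(2 * max_lag + 1)
for _ in range(50):                                 # stack 50 windows
    wavefield = rng.standard_normal(n)              # shared ambient field
    sta_a = wavefield + 0.5 * rng.standard_normal(n)
    sta_b = np.roll(wavefield, true_delay) + 0.5 * rng.standard_normal(n)
    cc = correlate(sta_b, sta_a, mode="full")
    stack += cc[n - 1 - max_lag : n + max_lag]      # keep +/- max_lag

recovered = (np.argmax(stack) - max_lag) / fs
print(f"recovered delay: {recovered:.2f} s (true: {true_delay / fs:.2f} s)")
```

The design point is the stacking: any one window is dominated by randomness, but the correlation peak at the inter-station travel time accumulates coherently across windows.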

That type of discovery capability is being enhanced by machine learning, so I feel like we still have momentum as a discipline for young, intelligent, motivated people to come in and assess the lay of the land. That's a much bigger landscape than it was before, so it's intimidating. But if you can sniff out the directions that are going to be most promising, they will be there. They will include improving our understanding of the physical processes of faulting. But that's going to draw not just from seismology but from integrating with rock mechanics and other areas. It's going to take sort of an agile, open mind with a willingness to keep up with a broad range of literature to even recognize that those opportunities are emerging. And that was very much the case with noise correlation or with machine learning; it wasn't coming simply from classic seismology. We've all been doing machine learning forever, but it's become codified as a new style, capability, and strategy for data analysis. That will continue for some time before it becomes routine. Another generation or two will see a lot of progress. What it will be specifically, I cannot really confidently put my finger on, any more than I could've for those advances ten years before they happened. I will retire in a year myself. I will keep doing research, but it will probably be research I'm comfortable with and capable of doing. It won't be pushing a new frontier as much, because you need a program, with a lot of students, to be able to do that.

I'll be watching to see what emerges, thinking, "Why didn't I figure that one out? That one's really impressive." That has been the case with, for example, noise correlation. I said, "Yeah, I'm impressed, but I should've been able to figure some of that out before these other guys did." As for specifics, it's harder for me to put a finger on them. I think we've made tremendous progress. It's become much easier to do the work. But when you have 120 university programs with a bunch of grad students in them, there isn't much low-hanging fruit left. It's a question of what is going to make you climb higher in the tree so you can get to other fruit. That's not so obvious to me. But places like Caltech are doing a good job. They've been very involved in the machine-learning developments and very involved in exploiting the noise capabilities. Not originally, but they really quickly moved in that direction and have done a great job with it. It's an agile environment. The advantage of having a fairly large concentration of people is that you can move quickly. When you're in a smaller program, with one or two faculty, aging faculty like myself, you're less agile, to be honest. It's even harder for me to stand up after talking to you for an hour and a half.

But it's still places like Caltech that will sniff out and pursue new areas. They have a lot of competition, though, and their competition is largely of their own making. Their graduates go out smart, build programs, compete, and push areas that maybe Caltech hasn't pursued. In fact, there are some things that Caltech no longer does that they used to do, and I found that out recently. I was like, "Wow, you guys can't do that anymore. You've lost that skill because you moved on." But that's OK, because other places can do it. Caltech doesn't have to do everything. They need to do what's new and interesting. I can understand that transition.

ZIERLER: On that note, this has been a fantastic conversation. I'm so glad we've captured your insights and perspectives over your career for our project. I'd like to thank you so much.

LAY: It's been a pleasure to chat with you.