Professor of Computing and Mathematical Sciences and Director of Information Science and Technology
By David Zierler, Director of the Caltech Heritage Project
June 24, August 11, 22, November 30, 2022
DAVID ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It's Friday, June 24th, 2022. I'm very happy to be here with Professor Adam Wierman. Adam, it is great to be with you. Thank you for joining me today.
ADAM WIERMAN: My pleasure.
ZIERLER: Adam, to start, would you please tell me your title and affiliations here at Caltech?
WIERMAN: Sure. I'm a Professor of Computing and Mathematical Sciences at Caltech. In addition to that, I'm the Director of Information Science and Technology, which is an institute that seeks to support and engage research at the interface of computer science and other disciplines all across campus.
ZIERLER: Adam, of course, at Caltech, we have divisions and not departments. But the title Computing and Mathematical Sciences, does that suggest some dual appointment or affiliation?
WIERMAN: No. In engineering, we added departments—I don't know—five-ish years ago, six-ish years ago, and so this was one of the departments that was formed. It was actually a very, I think, strategic decision at the time too. At many schools, computer science has merged with electrical engineering. At Caltech, we explicitly chose to merge computer science with applied math, and control and dynamical systems when these departments were formed within engineering. That combination, I think, put us in a very good position for emerging foundations of data science where optimization and machine learning people are in the same department, teaching courses together, and it gives us a nice competitive advantage in that space. I was one of the first people that took the department name as my title when that happened because I definitely in my research have a bridge between applied math and computer science.
ZIERLER: At that bridge, from your educational trajectory or your research focus, is one more your home department than the other—computing or mathematical sciences—or are you really right down the middle, would you say?
WIERMAN: I am probably biased towards computer science. My PhD was in computer science, I think about computer systems, but I use tools from applied math. Operations research is a second home for me outside of computer science. Operations research really is about applications of optimization, stochastic processes, tools like that. I do a lot of work on those tools but usually with applications from computer science in mind when I do it.
ZIERLER: Now, information science and technology, are you the inaugural director?
WIERMAN: No, that's an initiative that's been around since about 2003—somewhere around there. It was formed with a big gift from the Gordon Moore Foundation and also the Annenbergs, which led to the building that I'm sitting in. It was founded at a time when Caltech really recognized that the information sciences—computer science, electrical engineering, applied physics—were at an inflection point, and Caltech wanted to invest in growing strength in those areas. Initially, it was born as a kind of umbrella on top of four centers, each of which connected different parts of campus. One was the Center for the Mathematics of Information, connecting computer science and electrical engineering. One was with biology, and has eventually grown into molecular programming and some work in CNS, Computational and Neural Systems, and BE, the biological engineering space. One was around quantum, and the last one was around the intersection with economics. All four of those centers grew and became endowed entities in their own right and, at this point, IST is kind of an incubator. It looks to create new seeds, new centers that then grow into other things, both on the research side and the academic side, and really grow into things that either form new disciplines or connect existing traditional disciplines in new ways.
ZIERLER: Now, you mentioned the gift from Moore. Is that separate from the huge endowment that Gordon Moore gave to Caltech? Is this from the Moore Foundation?
WIERMAN: Yeah. Gordon Moore's been an amazing supporter of Caltech. This one was specific to IST when it was founded. There's been a number of other gifts that he's made over the years for broader purposes, around fellowships and other things. But this one was really specific to this umbrella institute.
ZIERLER: Adam, just a snapshot in time, what are you currently working on, or maybe I should say what do you wish you were working on if you had the bandwidth to do it?
WIERMAN: [laugh] Actually, this is a really exciting time. In my research, I tend to reinvent myself every five years or so and push into new directions. A few years ago, I wouldn't have considered myself either a machine learning researcher or a control theorist. Now, my biggest push is on the intersection of control theory and machine learning, driven by basically this question: what would it take to deploy artificial intelligence and machine learning in safety-critical infrastructure systems, like the smart grid? Sustainability is a big motivator for me in my research. If you asked me to do that today, I would run away. I would be terrified, because we just don't understand the failure modes of these things. There are no reliability or safety guarantees. These AI tools can be successful in amazing ways but sometimes just fail spectacularly in unexpected ways. I would not want them running the safety-critical grid in [laugh] any way that we depend on. But I don't think it's impossible to get there, and so the big research question in our group now is: how can you develop AI that is safe and reliable, and has guarantees that allow it to be used in these sorts of infrastructure systems? This really sits at the intersection of sustainability and engineered systems, artificial intelligence, and control theory.
ZIERLER: How do you define guarantees within the context of being concerned about AI? Do they sign on the dotted line? What does that mean?
WIERMAN: [laugh] That's a great question. The way to think about that is that in today's systems, there are a lot of reliability guarantees that algorithms have designed into them before they're deployed. They're provably, under some assumptions, going to guarantee that voltage stability holds, that you're not going to violate the safety limits around lines. There are provable guarantees that if there's a big failure in one part of the network, it won't spread to other parts of the network. You can really have provable architectural guarantees about the design of the system and how the control policies work in different parts of the system. Those keep the grid running as well as it does, and as well as it has for the last 100 years; it's really a system designed around reliability as the primary objective. AI is not designed around reliability. There are a lot of really famous examples where you take a stop sign—your vision system for autonomous vehicles has been trained to look at stop signs, and this seems like a pretty easy task, recognize a stop sign. You add a couple of pieces of tape in strategic places on the stop sign and, all of a sudden, the vision system thinks it's a speed limit sign telling you that the speed limit is 80 miles an hour, and you blow through. These sorts of things are very hard to explain if you're using neural-net-type architectures or some of these other modern AI tools. You don't understand why the model is making this mistake, and you certainly can't guarantee that the model won't make such mistakes if you deploy it. That's not acceptable if we're going to be deploying these things in the safety-critical infrastructure of the world.
ZIERLER: The research agenda—it's so elegantly stated in the various places where you display your work—your research strives to make the networked systems that govern our world sustainable and resilient. First, as an origin story of your intellectual development, when did you start to get interested in sustainability?
WIERMAN: Right when I came to Caltech, this was my goal. My PhD work was very much focused on scheduling and resource allocation in data centers. This was around 2007, when I started at Caltech, and so this was a point where the growth of data centers was enormous. There was just a huge construction boom, and these things were scaling massively as cloud computing grew in importance and deployment. There was a lot of worry at the time that the energy demanded for computing was on a completely unsustainable trajectory. Broadly speaking, I was always concerned about the environment—this was always a passion for me—but I didn't have a connection to that in my research. The fact that data centers, which I understood well, which I had been working on, were now becoming such a major source of potential energy usage opened the door for me to say: this is a place where I can really make an impact. I can be one of the ones that helps curtail this, and helps us rethink the way we talk about designing data centers so that they can be more sustainable and carbon neutral. From day one at Caltech, one of my big questions was: how do I transition my research from focusing on performance—response times, delays—to making energy and sustainability one of the primary metrics and goals we talk about with data centers? At that point, there was huge resistance to that message from industry, but there was growing recognition in government agencies, NSF, etc., that this was an important problem, so it was a great time to make an impact from academia.
ZIERLER: Just to clarify, the data centers, the issue is the usage of fossil fuels. In other words, if we had a grid that was entirely reliant on solar and nuclear and wind, there wouldn't be that sustainability issue, as you define it, with data centers?
WIERMAN: That's right. I agree, yeah. At that time, in 2007, the grid was certainly not anywhere near that. There were a couple of different ways to approach the problem. One was around sustainability, and it actually took a little while to get people to even think about that. The first was around just energy efficiency. At that point in time, you had a data center, which is basically a big warehouse. There was a measure that was becoming popular, the power usage effectiveness of a data center, PUE. This measured, to get a unit of energy to a server in a data center, how much energy had to come into the building. The first time people started measuring this, it was 7, 8, 9, 10 plus. You would only be getting a tenth of the energy coming into the building actually to the servers. There was huge waste happening in power conversion, in cooling, in the infrastructure surrounding the actual work being done in the data centers. A big first-order piece of this work was just: how do you redesign the whole system so that you're not wasting 90% of your energy before you're even running your servers? That was one big chunk of work at the time. Another big chunk of work was within the servers. You have massive over-provisioning in data centers, and the standard was just: we need to leave all servers on, running full bore at all times, in case we need them. This was the way systems were designed and built, and you really didn't have a choice in the matter at the time. So another big focus was to redesign the software and the systems and the architectures so that you could ramp servers up and down in their speeds and their usage as the work was there or wasn't there, so that you could have a usage curve that tracked the actual demand coming into the data center, which things weren't doing before. Those were the two big-picture ones.
Then the third, once you have those in place, the third piece was now how do you actually make use of renewables, either locally or in the grid, to redesign these data centers, and to make them carbon neutral or sustainable? At Caltech, we have big contributions in all three of those areas as we move forward. But our eye was always on that third one. I think, actually, industry is only in recent years getting to be able to implement some of the ideas from that third one that we're deploying at a large scale.
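The PUE metric described in this answer is simple enough to compute directly. Here is a minimal sketch with hypothetical numbers, not figures from any real facility:

```python
def pue(total_facility_energy, it_equipment_energy):
    """Power Usage Effectiveness: energy entering the building divided by
    energy delivered to the IT equipment. 1.0 is the ideal; early data
    centers measured 7, 8, 9, 10 plus."""
    if it_equipment_energy <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_energy / it_equipment_energy

# Hypothetical: 10 MWh enters the building, 1 MWh reaches the servers.
early = pue(10_000, 1_000)   # 10.0 -- 90% lost to cooling, power conversion
modern = pue(1_050, 1_000)   # 1.05 -- nearly everything reaches the servers
print(early, modern)
```

A PUE of 10 means only a tenth of the energy entering the building reaches the servers, which is exactly the waste described above; driving PUE toward 1.0 is the efficiency story of the following answers.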
ZIERLER: An even more fundamental question, even before these mitigation strategies to make the servers less energy intensive, what is it about servers that make them require so much energy? In other words, a concrete factory, I can wrap my head around why that needs a lot of energy.
ZIERLER: But these are silent machines that are humming away. What is it that they do that requires all of that power?
WIERMAN: That's a great question. Part of it is just your CPU, GPU, whatever it is, is a very power-hungry device, much more than you realize, and so part of it is just that. The way that's designed can either allow you to do speed scaling and ramp up and down your usage, or just be on at full bore or off. There's a lot of computer architecture work that goes into making the actual devices in the memory, the CPU, GPU, all of these scale properly. Similarly, with the network devices, the routers, the switches in the data centers, traditionally, they were just either on or off. There was no speed scaling or energy mitigation that happened as they were busier or not, and so redesigning them was another thing. But then beyond the devices, it's the heat. Servers need to be in a particular range of temperatures to have good reliability. Otherwise, their lifetimes suffer, and they'll fail more quickly, and they'll just die. You need to cool the space because they're generating a lot of heat when they run. You need to cool—and there's a lot of very, at this point, really intelligent infrastructure that goes into doing that cooling—depending on where you build your data center. Of course, this also pushes data centers to be built in Canada, in cold places. But you can't always do that. All of these things work together to just mean that a particular data center—I'm trying to think of some good statistics here. Just the energy budget for Google data centers is—what is it? It's about nine digits a quarter or so.
WIERMAN: You're talking just massive budgets in terms of money, massive budgets in terms of energy going into this. At the time, in 2007, data centers were using about 3% to 4% of US energy usage in just a couple dozen buildings around the US, and projections said this would just grow and become a massive part of the societal energy use for the US. But with these changes, these days, the PUEs are 1.05 instead of 10. [laugh] At this point, energy is really going directly to servers, and not being wasted in the buildings. There's just been massive success from electrical engineering, mechanical engineering, and architectural changes in the software that has enabled that over the last decade. It's a huge success story: there was a big bump, but then energy use actually dropped despite lots of continued building of data centers. But now that we've hit that point, the buildings can't become much more efficient. The software can't really [laugh] become that much more efficient. Changing how you use energy, and what sources you draw from when you use it, is the next big research challenge for this space. This means paying attention to things like embodied carbon in the actual processes for building the servers and the building, managing that, and managing the lifetime of your servers. But there's also the operational carbon, and identifying when you can do workloads at times when you can use non-carbon sources.
ZIERLER: Just so I understand the trendlines, even with all of the growth in data centers over the past 15–20 years, carbon emissions because of these mitigation strategies have actually gone down or at least stayed neutral? Is that the big story?
WIERMAN: Yeah, depending on your data sources, it's been 10 years of pretty level behavior despite massive growth in construction. When I'm saying "level," it's in terms of percentage of national energy use, percentage of national emissions. Of course, those totals might be fluctuating as well, but the percentage has stayed in the 3–5% range despite the massive growth of data centers. This is, I think, an amazing success story for the research groups in these places, both industrial and academic. Now, the phase shift that's happened in the last one to two years, and is really ongoing, is this: all of the storytelling from the industry side for the last decade has been dominated by the efficiency of the data center buildings, plus purchasing of solar credits at other places to claim they were getting to net zero. There's been a shift; that's not acceptable anymore, I think, in the industry. Going forward, there's a huge push to be carbon neutral in the operational aspects of the data center, rather than just by purchasing offsets. This is opening the door to a lot of the research that we've been doing, which I'm excited by: now, we need to do workload migration. We need deferral, to run when and where it's sunny or windy. This is really exciting. It also pushes toward some of the crazy data center designs that some companies have been investigating, things like Microsoft's underwater data centers, and there have been some manure-powered data centers that companies like HP have looked at, with really interesting symbiotic loops between ways of generating energy and ways of pushing out heat. The manure-powered one has always been one of my favorites. You have these digesters for the manure on a farm. What do you need to start off the process? Heat. Data centers generate a lot of heat, so just pipe the heat there.
Then that process ends up powering your data center, and so you have a very nice symbiotic loop: you avoid the methane from the manure while also avoiding the emissions from the data center. It's such a huge win. The underwater data centers are great also because, oftentimes, you need smaller data centers near urban centers, but urban centers don't necessarily have the climate you want for an energy-efficient data center build—though they're often near water. Imagine basically dropping a submarine half a mile or a mile offshore with enough servers to be an edge data center serving that city, that population center, with low latency. You don't have to cool it anymore. It's all just there at the right temperature, and it's very energy efficient, and you don't have to do much. You get low delays. You don't have to do much maintenance or operation for it. Nothing's touching it or failing, and it just checks all the boxes almost immediately—except, of course, you worry about the reliability of doing this underwater. But Microsoft has now done multi-year tests of this, and it works.
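The workload-deferral idea—shifting flexible batch jobs to the hours when low-carbon energy is available—can be sketched as a simple greedy scheduler. The function and forecast numbers below are hypothetical illustrations, not code from any deployed system:

```python
def schedule_deferrable(hours_needed, carbon_forecast):
    """Choose the lowest-carbon hours in which to run a deferrable batch job.

    carbon_forecast: list of (hour, grams_co2_per_kwh) pairs, e.g. from a
    grid carbon-intensity forecast. Returns the chosen hours in time order.
    """
    if hours_needed > len(carbon_forecast):
        raise ValueError("not enough forecast hours to place the job")
    # Greedy: rank hours by forecast carbon intensity, take the cleanest ones.
    ranked = sorted(carbon_forecast, key=lambda pair: pair[1])
    return sorted(hour for hour, _ in ranked[:hours_needed])

# Hypothetical forecast: midday solar makes hours 12-14 low-carbon.
forecast = [(0, 450), (6, 400), (12, 120), (13, 100), (14, 150), (20, 480)]
print(schedule_deferrable(2, forecast))  # → [12, 13]
```

The same logic extends to "where" as well as "when": compare forecasts across data center regions and migrate the job to the region-hour combination with the lowest intensity.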
ZIERLER: Adam, obviously, it's a huge endeavor, this success story, in terms of mitigating carbon emissions over the past 10 years. Between academia, industry, and government, who are the heroes in this story? Who are the key partners that you've worked with to make this story happen?
WIERMAN: All three have played a huge role. NSF in particular had a number of very big calls and gave a lot of support to this in my first three or four years as a faculty member. That was really crucial for me, and for many people like me at other universities, to get started in this space. At the time, industry was not particularly open to these ideas. I remember two of my first students, Minghong Lin and Zhenhua Liu. We sent them off to companies for the summer to try to basically take our research and convince people they should follow up on it. What they both found when they got to industry was that the argument to deploy was extremely hard to make, because the companies did not, at that point, even have measurements of the energy they were using in different parts of their systems. They couldn't measure to understand what the impact was, how much they were using, how much they could save. You couldn't have any detailed discussions with them because they didn't even know where they stood. The students actually worked to build the measurement systems in the companies during the summer, which then opened the door for conversations afterwards. Eventually, with HP, we were able to release the first industrial-scale net zero data center, where we had solar on the roof and were able, on average over a long horizon, to be about 90% off the grid while powering an active industry data center.
ZIERLER: Do we see a similar narrative happening in China right now? Are they ahead of the curve? Are they following what's happening in the United States?
WIERMAN: It's hard to tell. They have not been as active in the research space in these directions. But there's definitely interest from companies in understanding the ideas coming out. A lot of the work in the US is open-source. I don't have a really good sense of how many companies or which companies are going heavily in terms of the carbon neutral goal. In terms of the energy efficiency goal, that's certainly worldwide deployment now in terms of keeping these PUEs small because that hits your bottom line directly with every point you save there.
ZIERLER: To go back to 2007, to paint a broader picture, you've talked about how you got to Caltech and sustainability really entered your radar. I'm thinking of the historical context: 2007 is a few years after Hurricane Katrina. Al Gore's An Inconvenient Truth came out. Global warming is really having its moment; it's part of the national discussion at this point. The question is—it's a bit of a counterfactual—do you think that no matter where you would've gone as an assistant professor, inevitably, sustainability would've found its way to you, or were there things specifically happening at Caltech that might not have been happening at MIT, Stanford, wherever, that made this much more right in your face than it otherwise would've been?
WIERMAN: That's a hard one to answer. It was definitely on my mind as a goal before I got integrated at Caltech at all. I think it would've been something that I attacked, no matter where I went. Caltech did give me some competitive advantage, certainly. I think that came mainly in the form of Steven Low being here, because Steven was really maybe the most exciting example of someone who is a theorist at heart but takes things to reality. I had done a lot more mathematical, algorithmic work in my PhD, and I think his influence really convinced me that you can take this stuff to reality in companies and deploy it. If I had been at a bigger school—a CMU, like where I came from in grad school—there probably wouldn't have been that push, that eye-opening that Steven gave me. That's one aspect. But the other aspect is that Steven was, himself, at a transition point in his research, looking for his next phase, and was, at that point, really thinking seriously about energy as a push. It gave us both a chance to cut our teeth in this new area and investigate it together. Mani Chandy at the time was also interested in going towards the smart grid—not energy in data centers, but sustainable systems in general. The three of us just fell into a nice collaborative group that hit not just sustainability in data centers but sustainability in the grid and water and markets and all of these areas, which created an ecosystem that could grow. That aspect—those particular people being here—has definitely benefited and magnified my ability to move quickly in the space.
ZIERLER: Do you think you scrambled up the traditional research trajectory that an assistant professor is supposed to take? In other words, this is when you should be laser-focused on your area—
ZIERLER: —of expertise from your dissertation—
WIERMAN: No. [laugh]
ZIERLER: Did you find yourself mixing it up more than you otherwise might've done?
WIERMAN: Yeah, that's a great question, actually, that maybe even highlights an answer that I could've added to the previous question. One thing that Caltech does for assistant professors—and this is now, in my more senior role, what I say very explicitly to faculty when we are trying to recruit them to come to Caltech—is that, at many schools, the pre-tenure period is one of narrowing. It's one of: pick your core area, narrow down so that you're recognized as contributing something specific to your core area, so that you can get tenure on that contribution. That's just not the way Caltech works and talks to its assistant professors at all. I was very confident very early that I would not be measured by my peak in one area; that my colleagues would look broadly at the contributions I had made to different areas of different types, get a picture of me as a researcher, and evaluate me with that whole picture in mind rather than saying, "OK, he does networking. Let's ask the top 10 networking people if he's contributed to networking, and ignore everything else." That knowledge about how I would be evaluated really let me do things very differently than my peers at other schools did. I was starting two new areas when I was an assistant professor, one in this sustainability area, and another at the intersection of economics and computing. In my mind, they were related, because you couldn't solve sustainability at the broader scale without thinking about how the engineering systems work with markets, and how the markets needed to be redesigned. But that's not the way academia treats those fields at all. To the outside, it probably looked like, "Why is he doing all these different things? That's a dangerous approach pre-tenure. What's he thinking?" But I had a vision for what I wanted to accomplish, and these were two pieces of it that were distinct but would eventually merge.
I was able to go after those things because, I think, of the Caltech culture.
ZIERLER: This messaging that Caltech, and now you in a more senior role, sends out to potential recruits, how has that worked out? What are the kinds of people who are attracted to that mindset, and who are the kinds of people that say, "No, that's just not for me"?
WIERMAN: No, I think one way that I say it very concisely is: at most schools, you narrow pre-tenure; at Caltech, you broaden. Which do you want to do as junior faculty? This is probably one of the most exciting times of your career, because you're walking on your own two feet outside of your advisor's shadow, and you're getting to set your own tone. It's not the right time to narrow and constrain yourself. Why wait until post-tenure to have that broadening and eye-opening? I think lots of people find it appealing. It's something that's enabled by the small size of Caltech: you can be distinct from everybody else here, even if you're broad. Whereas at a big school—a place like CMU, where I came from—if you're broad as junior faculty, you end up in everybody else's shadow rather than having your own thing that you own. I think it's a really compelling message to a lot of people, and it emphasizes the strength of Caltech. We don't have 20 faculty in 20 different areas that you can pull into a team locally to do something massive. But we do have the ability to let you do lots of things, combine across disciplines, and not constrain yourself as an individual researcher or as a small team doing research. These are different models. Some faculty like the big-team, big-project model. Some people like the interdisciplinary, intellectual-curiosity-following model.
ZIERLER: Beyond sustainability in computer science at Caltech, more generally, circa 2007, the way that Jean-Lou Chameau emphasized sustainability across the institute, the increasing involvement of the Resnicks and the building of the Resnick Institute, what was happening campus-wide that was attractive to you that also maybe got you thinking in these new directions?
WIERMAN: The Resnick Institute at that point was in its initial phase. It was a big support source for us: we were able to work with the Resnick Institute, and have them help us find community partners and industry partners. Financial support for students and postdocs was there in a pretty consistent way, which certainly made it easier to go in these directions. Then, not directly as I started working on data centers, but as we broadened to look at the smart grid and smart systems beyond that, we had a couple of situations over the years where we ran broad community workshops—industry, academia, government—that developed white papers and challenge documents for the community around the open questions. Grid 2020 was one of those at the time. [laugh] Now we're past 2020 but, at the time, it was looking five, ten years into the future: what are the big questions that researchers should be attacking?
ZIERLER: This whole portion of the conversation has focused on sustainability. To go back to the tagline, make the networked systems that govern our world sustainable and resilient, do you see resiliency as a subset of sustainability, or does it mean something different in that context?
WIERMAN: It means something different, and I think they're often related. First, starting from sustainability: if we think of adding solar and wind—unreliable sources—to the grid, one of the big challenges this creates is resiliency and reliability, because if we were to just do it in the way the grid is designed today, we'd be dealing with power outages constantly. That's just not something we can handle, and so we need to be thinking about how to make the grid more resilient if we want to be able to make it more sustainable. They're connected in that way. But also more broadly, again focusing on the grid: as climate change comes, there are going to be more and more forest fires, more and more things like heatwaves, that create challenges for the reliability models of the grid today, and resiliency is going to be even more important. In data centers as well, sustainability and resiliency are tied, but you care about resiliency far beyond just the integration of renewables. The moment Amazon's cloud fails, there are news stories in every major outlet about the outage. That is not acceptable for any of these businesses, any of these services.
As you push on making them more efficient, as you push and build, providing new services, distributing things in different ways, resiliency is one of the primary metrics, and it's a place where one of the goals of my work is to do theory-inspired design. Rigor and relevance is a tagline we have. A place where we can make the most impact is a place where you need some guarantee or assurance that you couldn't get just by hacking something together based on intuition. Resiliency is a place where if you hack something together based on intuition, you're going to fail. You're not going to be able to ensure that your system actually works in these situations. It's a place where you need theory-driven design, theory-inspired design, to be able to make things work at scale.
ZIERLER: Now, taken all together, networked systems, to make them more sustainable and resilient, what aspects of your research focus specifically on retrofitting extant systems, and what aspects are blank slate, we have the opportunity to do this right from the beginning?
WIERMAN: I often do both. I think, especially, using the smart grid as a good example, there on the market design side, you can think about the blue sky approach because at some point as renewables grow, if we get to 90% integration of these renewable sources, there's no longer a marginal cost for production. The marginal cost of solar is zero. You have as much as the sun is giving you at a given time, and that's that. It doesn't matter whether you take half of it or all of it, it's the same cost for the production, which is very different than these traditional sources that have increasing marginal costs as you draw more from them up to whatever the constraint is. All of market design is based on marginal cost determining price. If you have only solar and wind, then you have zero price, and markets fall apart.
At some point, you need to really redesign these things from scratch to be able to handle high-penetration systems. At the same time, that's a while from now. In the short term, you want to be able to do things that set up the right market conditions for growing renewable infrastructure, which means rewarding renewables for the services they provide for the grid, understanding how to do that, understanding how they play in current market structures. We have, on the market side, work that looks at how you adjust today's markets to handle the intermittency and unpredictability that come with renewables, but also work that says, if we really get into these high-penetration regimes, what needs to change? What do we have to completely throw out and redesign?
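The pricing effect described here, where the marginal unit sets the clearing price and zero-marginal-cost renewables collapse it to zero, can be illustrated with a toy merit-order dispatch. The numbers and the single-price clearing rule below are hypothetical simplifications for illustration, not a model of any real electricity market:

```python
# Toy merit-order market clearing (hypothetical numbers, not a real market model).
# Generators offer their marginal cost; the clearing price is the marginal cost of
# the last unit needed to meet demand. With enough zero-marginal-cost solar/wind,
# the price collapses to zero.

def clearing_price(offers, demand):
    """offers: list of (marginal_cost, capacity_mw); returns (price, dispatch)."""
    dispatch = []
    remaining = demand
    price = 0.0
    for cost, cap in sorted(offers):          # cheapest first (merit order)
        take = min(cap, remaining)
        if take > 0:
            dispatch.append((cost, take))
            price = cost                      # marginal unit sets the price
            remaining -= take
        if remaining <= 0:
            break
    return price, dispatch

# Mostly thermal generation: gas sets the price.
print(clearing_price([(0, 20), (30, 50), (60, 50)], demand=60))   # price 30
# High renewable penetration: solar/wind alone cover demand, price is zero.
print(clearing_price([(0, 90), (30, 50), (60, 50)], demand=60))   # price 0
```

The point of the sketch is only the second call: once the zero-cost offers cover all of demand, no positive-marginal-cost unit is marginal, so marginal-cost pricing yields a zero price.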
ZIERLER: This entire conversation, of course, has been taking place within the context of classical computing. Being at Caltech, it's inescapable. Quantum computing, IQIM, it's all around. Maybe I'll give you three options for—
ZIERLER: —how you interface or not with what's happening in quantum computing. One might be, "It's amazing. I'm as integrated as I possibly can be. I want to learn as much as possible." The other is, "I don't see any connections whatsoever." The third is, "I'm agnostic. Let's see what actually comes of all of this."
ZIERLER: You can have fourth option if you want. But those are roughly the areas where I see it.
WIERMAN: It's a fourth option. [laugh] It's a fourth option. In my area, I think there's a moment coming, and I'm trying to decide how much to, basically, commit myself to it. I have some connections and collaborations, but they are just in the infancy phase around this. But one of the first places where I think quantum will actually lead to serious deployments is in networking. An initial application that is not too far away is quantum key distribution. To do that, you need a quantum internet, even at small scale in some places, but you need it. This is potentially a really exciting time, then, because you need to develop all the things like TCP/IP, all of the network infrastructure. None of it ports over to a quantum internet. It needs to be redesigned from scratch. Networking people should be the ones to do that. We have the experience. We know the theory to be able to design these things, make it work. That's a really exciting moment. That's probably a two- to five-, maybe eight-year moment to build that infrastructure and build those protocols so that things can be deployed.
ZIERLER: Is that to say just once quantum computers are scalable, once quantum error-correction is achieved, then you can build a quantum internet, or you might be able to build a quantum internet—?
WIERMAN: No, this is before that. This is going to happen before that.
ZIERLER: You can do a quantum internet before quantum computers.
WIERMAN: Yeah. It will be the precursor. It will precede general-purpose quantum computing. We'll have the ability to distribute the keys needed for quantum cryptography. That's something that probably happens at small scale within just a couple years, and maybe even large scale in five to seven years. This is a fairly short window to design the new internet protocols—at least the first version of them. Some networking researchers are really starting to commit themselves to this. One of the people who was very involved in the theory surrounding all of the initial internet protocols, Don Towsley, is heavily invested now, has basically shifted his entire research agenda to the quantum internet, and is trying to come up with the models and do this. The question going forward is how—me personally, I don't have the background in quantum computing to jump in quickly. There's a huge overhead. The field is at a point where it's trying, I think, I hope in the next year or two, to develop the models that allow people like me to make a contribution on protocol design without being experts at the underlying quantum technology. I'm hoping that that moment comes sooner rather than later so I can jump in on the protocol design. But it depends on people like Don being the ones to design those models, and have the trusted abstraction to allow networking people to come in and take a shot at it.
ZIERLER: Now, the industrial leaders in quantum computing, the Amazons, the Microsofts, the Googles, these are of course the very same companies that have these enormous data centers that need to become more energy efficient. Do you have relationships or knowledge of that aspect of their business model that might actually push this along further?
WIERMAN: Yeah, I do work with a number of companies in this space. We have a partnership right now with Google. In the past, we've worked with HP and Microsoft pretty closely. We definitely make attempts to pass our work on to these big companies in terms of getting them to deploy and push these things to practice. Google, in particular, is very exciting. Just in the last year, so during COVID time, I think one of my partners, Ana Radovanovic, she's been leading their team, and they've been deploying ideas for the first time in very large scale across the Google fleet that we were writing papers on eight years ago. It's exciting.
ZIERLER: Do you see opportunities within Caltech? Are there more opportunities for collaboration in IQIM or, maybe conversely, is IQIM alive to the quantum internet, the networking things that you're more comfortable with?
WIERMAN: Actually, that last answer, just to be clear, that was not for the quantum piece. That was just for the sustainability piece. For the quantum piece, I don't know who the right partners are for that yet. I'm a little bit further afield. I'm a curious bystander waiting for my moment, is the way I would characterize it. John Preskill's right down the hall. I asked him for some pointers recently, and he gave me a couple of books that I've been reading. I don't know who the right partners are there. A lot of the partners there are government, because this is quantum key distribution, so the applications are very security-oriented, I think. But it'll be interesting to see, because a lot of it is sort of mixed. There is the compute piece, which is distinct in terms of techniques and algorithms and pretty much everything from the networking piece. It's not the same group of people, even in the companies that are doing both of these things.
ZIERLER: Do you envision yourself getting involved in the inevitable discussions that are going to happen with like the National Security Agency, which is, as I'm sure you know—?
WIERMAN: I'm not close enough for that yet.
ZIERLER: You're not close enough?
WIERMAN: That's too far. [laugh] That's too far afield. [laugh] On the energy side, I like to do that sort of thing but [laugh] nothing to do with quantum.
ZIERLER: Another general theme to touch on, your interest in machine learning, just some machine learning 101. What is the connection between best practices in machine learning and enhancing sustainability and resilience? How do we get machine learning to work toward those goals?
WIERMAN: This is something I think a lot about. I think there are a number of challenges here. One is that in machine learning, there's a ton of applications that people are driven by. As a research community, the algorithms and the developments that people have are driven by the industry-focused data sets and applications where there are easy testbeds available. Those tend to be vision algorithms driven by whatever problem a company wants to solve in vision or image labelling or things like this, or autonomous driving. There's lots of data sets and excitement around those spaces that is driven by industry or partners releasing data. In sustainability, and the grid in particular, it's often the case that you can't release the data in a way that would enable broad, easy testing of algorithms, and that pushes the bigger machine learning community in other directions. Now, that's not to say that in the energy communities, there's not a lot of focus on bringing in machine learning tools. But your generic AI researcher isn't using an energy application in the testing part of their paper right now. We've been thinking about how to change that.
One of the projects that we're doing right now has that as a goal, which is to release sustainability-focused benchmarks, both for general machine learning tasks and, more specifically, for grid and control tasks. Hopefully next fall, we'll be able to release three or four or five or six different examples that are really easy to use. OpenAI Gym is a generic framework that any AI researcher uses to test their algorithms, so we'll release examples of very realistic, challenging sustainability control tasks. Hopefully, this will mean that it's very easy for somebody to say, "Here's my test done in an Atari game. But here it is on EV charging, and here it is on a sustainable data center, with just a click and a run," as opposed to having to build all that infrastructure themselves. The vision behind this is that there's a number of challenging aspects of these problems that is not present in the type of test environments that are common in machine learning right now, and that'll draw people to use this sort of thing. Like, in the EV charging example, there's distribution shift depending on the time of year. At Caltech, our chargers are very active or not very active. When COVID hits, they drop to zero. When people come back to the office, it goes back up. There's massive time-varying distribution shift, which is really hard for machine learning algorithms to handle. We'll have very realistic examples of that for people to test on. They're constrained in ways by safety that most of these Atari games and things like that aren't. They're multi-agent, so you have different controllers in different places, which is something that, again, is not there in the research community so far. We hope that this will draw a lot of people to say, "I'm a machine-learning algorithm developer, but for the application section of my paper now, I can work on important problems instead of an Atari game." [laugh]
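The benchmark idea described above can be sketched as a Gym-style environment. The class below is a hypothetical toy, not the actual benchmark suite: it only illustrates the reset/step interface, a safety cap on the charging action, and seasonal drift in arrivals (the distribution shift mentioned), with made-up dynamics throughout:

```python
# Minimal sketch of a Gym-style EV-charging environment with time-varying demand
# (hypothetical dynamics; the real benchmarks described here are far richer).

import math
import random

class ToyEVChargingEnv:
    """Gym-like interface: reset() -> obs, step(action) -> (obs, reward, done, info).

    Action: a charging rate in [0, 1] per step. The arrival rate drifts over the
    "year" (distribution shift), and a safety cap constrains total power draw.
    """

    def __init__(self, horizon=96, capacity=1.0, seed=0):
        self.horizon = horizon
        self.capacity = capacity        # safety constraint on total power draw
        self.rng = random.Random(seed)

    def reset(self):
        self.t = 0
        self.backlog = 0.0              # energy still owed to parked EVs
        return self._obs()

    def _arrivals(self):
        # Seasonal drift: demand swings over the horizon (distribution shift).
        season = 0.5 + 0.5 * math.sin(2 * math.pi * self.t / self.horizon)
        return season * self.rng.uniform(0.0, 0.3)

    def _obs(self):
        return (self.t, self.backlog)

    def step(self, action):
        rate = min(max(action, 0.0), self.capacity)   # enforce the safety cap
        served = min(rate, self.backlog)
        self.backlog += self._arrivals() - served
        self.t += 1
        reward = served - 0.1 * rate                  # serve demand, penalize draw
        done = self.t >= self.horizon
        return self._obs(), reward, done, {}

env = ToyEVChargingEnv()
obs, total, done = env.reset(), 0.0, False
while not done:
    obs, r, done, _ = env.step(0.5)   # naive constant-rate policy
    total += r
print(round(total, 3))
```

A real benchmark in this style would subclass the Gym `Env` API so any off-the-shelf RL algorithm could be pointed at it "with a click and a run," which is the interchangeability being described.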
ZIERLER: Adam, a ripped from the headlines question I can't help but ask. A few weeks ago, there was an AI ethicist at Google that made some news claiming that—
ZIERLER: —the AI has achieved sentience, that it's passed the Turing test, whatever that means. For your research, do you engage with those kinds of questions, even on a philosophical level? Is that useful or fun for you to think about in that way?
WIERMAN: I don't personally enjoy [laugh] that sort of discussion. I find that it's distracting from actually making progress and thinking carefully. Most of those discussions are not, in my opinion, based on really concrete definitions of what you mean by the terms involved. It ends up being just an argument about those things. For both the sentience and the doom-and-gloom perspectives, it's much more interesting and useful, in my opinion, to have these discussions around concrete measures that you might want to impose, or concrete places where you use it. After all, these things are just optimization solvers. Let's think about how we want to constrain the solutions that these optimization solvers are giving, and what sort of safety constraints, in what sort of situations, which guarantees we should ask for. That tends to be the language that we look at in our research and in our applications.
ZIERLER: Just to historicize your perspective, obviously, you look at machine learning as a tool to use. Is that to say that AI is not far enough along where it might not be willing to be a tool itself, or that's never in the cards? Is it far enough away where you just don't have to think about it, or you don't want to think about it because that's not ever happening as far as you can tell?
WIERMAN: [laugh] That's a tough thing to answer. But I don't see that as a near-term situation that we need to worry about.
ZIERLER: Near-term like maybe in your lifetime kind of thing—
ZIERLER: —you don't need to worry about? [laugh]
WIERMAN: [laugh] I think that there are much more likely and significant fears that we should have, and things, problems that we can go after.
ZIERLER: Adam, what aspects of economics or which economists do you work with in analyzing the financial or even the behavioral basis to the systems that you work on?
WIERMAN: At Caltech, the two economists that I really work most closely with are Federico Echenique and John Ledyard. John, in particular, is really who I should thank for my background and education in economics. When I arrived, there was this nice kind of Friday night at the Rathskeller that the economists had. They were welcoming enough to let me join in. Every Friday, I would go over, and we would just sit around for a couple hours and talk research with the faculty and the graduate students. I learned a lot from those. Eventually, very soon after I arrived, I was co-teaching a course on the intersection of CS and economics with John Ledyard. After that, I felt like I had my legs under me, and I could start to do research in that space. In addition to Mani and Steven, who I mentioned being interested in sustainability, John had done a lot of work around the Enron market failures in the early 2000s in California's electricity markets. He was very excited when us engineers decided that we were going to come back into that space, and he was very happy to work with us in a lot of those directions as well.
ZIERLER: A generational question. I'm focused, of course, on Caltech from a historical perspective. Fifty years ago, all of the undergraduates wanted to be physicists, the next Richard Feynman. Today, of course, the overwhelming majority of undergraduates are in computer science. First, what do you see as the big narrative in this transition? What is it about computer science that is causing undergraduates to vote with their feet in such numbers?
WIERMAN: This is something I spend a lot of time thinking about. In my time as department chair, this was the biggest challenge and also the excitement about being in computer science at the time. In terms of what's exciting, I am probably skewed by my own view. But what's exciting is, no matter what you want to do, computational tools are what you're going to use to do it, whether it's research or industry right now. Whether it's geology or astronomy or biology or chemistry or physics or just computer science itself, you need algorithms. You need programming. You need data science. You need these things to succeed there. Whatever an undergrad is excited about, they very quickly realize that to do it, they need CS, and so they start taking CS classes on top of whatever they were planning. Our classes are great. The material is really engaging. Caltech students are puzzle-solvers. CS is filled with really fun puzzles, and so they get excited. There's a lot of students that come in really excited about it already. There's a lot of other students that come in excited about another area, and realize when they get here that they're also excited about CS. We end up with over 50% of the students majoring in things in our department, so CS or IDS, the data science major. But more than a third, I think nearly 40% of them at this point, are double majors in something else too. It's not as if people are doing CS and forgetting the rest of the intellectual environments at Caltech. It's that they're adding to what they're doing by also doing CS. I think that's really exciting intellectually, and that's something that's special about our program. Our program is designed so that in CS 1, you're doing chemistry, biology, astronomy as your projects when you're learning to program; the same thing in CS 2. 
You're really seeing, day one as a student in computer science at Caltech, that scientific discovery depends on computation, whatever science you're interested in.
ZIERLER: Adam, because of the voracious job market demand for computer science majors, particularly coming out of a place like Caltech, traditionally, Caltech undergraduates, it's been off the charts in terms of their decision to go onto graduate school. Has this historical trendline skewed those transitions? Are we seeing fewer Caltech undergrads going to grad school because of opportunities in the private sector?
WIERMAN: In aggregate, yes, because as CS has become a higher percentage of the students, that means a smaller percentage are going to grad school. But the reason for that is not that there's been a change in the outcomes of CS students. Historically, CS students have basically split into thirds. There's been a guaranteed third going to grad school; a guaranteed third going to industry; and then the middle third, depending on the start-up environment, splits itself between grad school and start-ups. When I arrived in 2007, this was around 60:40. During the downturn that followed, it became a little bit more skewed to grad school, like 50:50 between grad school and industry. Then as the start-up market picked up again, it went back to more like 60:40, 70:30 to industry and start-ups versus grad school. Now, we're at about 60:40 again. Caltech as a whole is about 50:50, so we tend to lag slightly within the CS major, but not by that much. Like, 35% to 40% is the low end of the students that go to grad school. Compare that to where I came from: CMU is a top CS school, and it's like 5% to 10% that go to grad school in CS.
ZIERLER: I'm amazed the numbers are what they are as you're telling me at Caltech. That's incredible.
WIERMAN: Caltech is off the charts in terms of computer science students pursuing research and grad school compared to our peers. I have more undergrad advisees that have become faculty than my advisor at CMU has advisees that have gone to grad school. I'm not worried about that shift. Like, Caltech students go where their intellectual ideas take them, where they feel like they can have an impact. Depending on the time, that's industry or grad school. But we're not in a situation where industry is stealing the best minds from going on, from doing research at Caltech.
ZIERLER: What is it? Maybe just a pedagogical question. What is it about CS or the culture at Caltech generally where undergraduates are looking at big six-figure salaries as a 22-year-old, and saying, "No, I'm going to go get my PhD"? [laugh]
WIERMAN: I'll go to grad school. [laugh]
ZIERLER: What's happening at Caltech that might explain that?
WIERMAN: Part of it is selection bias. Caltech selects students for admission who are interested in research. I'm on the admissions committee right now. That's what we try to do. We try to pick people that have an intellectual curiosity that are going to go out and discover.
ZIERLER: What are you seeing in an 18-year-old's application where they're demonstrating in a strong way that they're committed to research?
WIERMAN: I guess maybe I'll make it really concrete. Imagine a high school student that spent their free time developing apps that they sell on the App Store and blah, blah, blah, versus one that spent their free time reading research papers and implementing the ideas that they saw in those research papers to see if they could make them work themselves. One is a better Caltech student, and one is a better Stanford student. That sort of thing does often come through in these applications that we're seeing. Where is your passion? Those are two extreme examples, but which side does it fall on? I think we admit based on that. But then also, when you're here, the Caltech undergraduate experience involves research from day one. Even in CS, where we're overloaded with people looking for research, we have people doing research with us during the year, during the summer. Nearly every student graduates having done research with a faculty member, and that's just not the case at other schools. That's an indication of the student interest in seeking it out, but also of the experience of research being a hook. If you go to grad school, you've been hooked by a research experience somewhere. Otherwise, you're going to take the salary.
ZIERLER: Another numbers question. Given the dramatic rise in percentage of CS undergraduate majors, first of all, generally, Caltech tenaciously has refused to get bigger and bigger and bigger. Correct me if I'm wrong, but CS itself has not gotten bigger, at least to keep pace with these trendlines.
WIERMAN: Of course not, no.
ZIERLER: How does the Division and the department, how does it square that circle?
WIERMAN: We could have a whole long discussion about the plans here, and where we succeeded, and where we failed. But that was my job as the [laugh] department chair during my period, to think about that. I guess there are two different pieces to what you're asking. One is, on the educational side, supply and demand: how do you provide a program that is exciting and engaging for students when you are at a different scale? The other is, on the research side, how do you scale up to meet the needs for algorithms, computer science, machine learning across campus, in research agendas from every area, with a small set of faculty with that as their core expertise?
But, anyway, starting with the educational side: first of all, Caltech has made a commitment to grow. There's a small growth that happens at Caltech. Our provost during his time basically made a commitment that that growth would be targeted at computer science. We have been searching slowly but steadily every year, being able to make offers, and trying to grow. That has happened to some extent but not as much as we'd like. We're still well below the target of faculty that we set in Ed's time at Caltech. The reason is that the job market for faculty in CS is extremely competitive, and everywhere is looking to hire. That means that faculty move, faculty retire, which counters the growth. Also, it means that any time you're making an offer to someone that we would consider hiring, they also have offers at three, four, five, ten other top schools that you're competing with to get them to choose Caltech. We've hired extremely well. We just have some superstars that we've hired, especially in the last few years. I think that's great. But on numbers, we haven't grown that much. In terms of junior faculty, though, we've certainly grown a lot. That's one piece. The other piece is that Caltech has created teaching faculty positions, and we have hired teaching faculty within CMS that help us with the undergraduate curriculum, and that's made a big difference. Konsta Zuev, Adam Blank, and Mel Hovik, in addition to two that we'd had for a long time, Mick Vanier and Donnie Pinkston. They're all incredible teachers. They get perfect TQFRs for every course they teach, the students love them, and they really are a foundation for being able to cover courses at the scale that we need to teach such a huge fraction of the student body. At this point, if our department was a division, it would be the largest division in terms of undergraduate majors and courses, by far. There's a lot [laugh] that we have to cover in that space. But teaching faculty, and some hiring growth, have been key. 
But the other thing that we have done, which I think is the right choice at a place like Caltech, but is very different from other places, is that we don't offer the traditional CS program. We long ago made the decision that we don't have to offer everything that is traditionally core CS. What we should offer is an interdisciplinary program that focuses on what we think of as the exciting research opportunities at Caltech. That means when we hire new faculty, and they come in, we don't say, "You teach compilers because we need somebody to teach compilers." We say, "Design a sequence of courses based on your research that takes an undergrad from what they need to know to the grad student level, to research, so that we can connect undergrads to the research that the faculty are doing as quickly as possible."
ZIERLER: That also sounds like one of the reasons why so many undergraduates in CS go onto graduate school.
WIERMAN: Yeah, exactly.
ZIERLER: They get a taste of that.
WIERMAN: Right. We can give them that taste, because that's why you should come to Caltech. If you want a program where you can take every standard class, go to a big school. If you want a program where you can get into cutting-edge research as quickly as possible, and spend a year or two doing it with faculty, that's what we can offer. That's special at Caltech. That also releases some of the pressure to fill holes in a program that would be very difficult to fill at the scale that we have.
ZIERLER: Now, on the research side—that was the educational side—on the research side, how do you square that circle?
WIERMAN: The research side is a lot harder, actually, and I feel like we've made a lot of progress, but we've still got a long way to go. That's where IST, the institute that I'm the director of, plays a big part. If you have three or four machine learning faculty and, all of a sudden, everybody wants to use machine learning in their research, what do you do at a place like Caltech? You're never going to hire 30 machine learning faculty and have that be the answer. That's a much bigger challenge, and so you need financial resources to be able to support people in that space. You need fellowships. You need discovery grants. You need seed funds. You need all of these things to be able to let people explore. But you also need, in some way, to magnify the people.
One approach we've been using there is postdocs. Caltech is pretty distinctive in that we have more postdocs than faculty in the CMS department at most times. In some areas of science, that's standard. But in CS, most places have hardly any postdocs. I should say these postdocs are not treated like part of somebody's group, like an employee. They're basically treated like junior faculty. They're given flexibility to work with anybody in the department. They advise students. They can teach classes. They can do all of these things. They love that opportunity because they get all that flexibility before they go and start their faculty position a year or two later. That then gives a lot of magnification, where the postdoc can be a bridge to biology or to Frances Arnold's group or to Federico's group or to wherever, to help those connections, advise students on both sides, connect students, and be that bridge.
ZIERLER: Adam, I want to go back to a really interesting comment you made about when you interacted first with Steven Low, and the idea that he was doing really cutting-edge theory but he was rooted in real-world problems and experiments. First, broadly speaking, in your respective areas of expertise, how do you see that binary or that symbiotic interplay between theory and experiment? How does that work for you?
WIERMAN: It's a tough thing. There are not many people in Steven's and my area that do both. In computer science, especially the distributed systems world where we are, you tend to be someone that proves theorems and suggests algorithms, or you tend to be someone that implements a big system, and deploys it, and hacks on it, and improves it in some way. There are very few groups that try to do both. Steven was really inspiring to me because his approach was completely different from that. His approach was: for the first few years, hire students that are theoretically strong, and we're going to do theory. Once we have an idea, we're going to hire some systems people, and we're going to deploy a demo. Then at that point, I'll go do a start-up. He's just done this over and over, and I'd never seen anybody else work like that from academia. It gave me the opportunity to see that one of the powers of being a faculty member is that you can have phases to your team. You can have phases to your project. You don't have to be in one career all the time. I think the way both he and I think about things is that you have these different projects, these different visions, and one can be in the theory phase. A different one could be in the deployment phase. A different one could be in the start-up or industry partner phase. They can be on top of each other at different phases at different times, and that can be really exciting. I really took that home. Right now, we have that kind of spread. In the learning and control work, we're very much in the theory phase. In the data center stuff, we're very much in the industry partnership phase. In the smart grid space, we're bouncing back and forth between market design, and working with partners, and theory, back and forth in different ways. You can be in different parts with different people in your group, which is very unusual for a computer science research group to do these days.
ZIERLER: If I can draw out of you a specific example, my default for understanding theory and experiment, of course, is physics. Einstein theorizes general relativity. LIGO validates it, something like that. What is a theoretical proposition in your field for which there is an experimental or observational goal that can either prove or disprove it? How does that work?
WIERMAN: That's great. In my field, the theory is giving you an algorithm. The algorithm, in theory, is designed with particular assumptions on what it needs to work. The experiment is taking the algorithm and deploying it in the real-world system, which often has lots of other layers that the algorithm has to work with. To make it really concrete in the space of my work: in the sustainable data centers, we did a lot of work applying online optimization, developing new online optimization algorithms. An online optimization algorithm is one where you're given a convex function at each round. The functions are going to change over time, and you have no idea how they're going to change. You want to choose an action at each round that minimizes the function you're presented with, but also the distance you move from one time step to the next, so that you're not moving too much but you're always getting a good cost according to these functions that are unknown and arriving over time.
This is an online algorithms problem. You can prove that in the worst case, your algorithm gets a good guarantee. You can prove that the guarantee isn't going to scale badly with the dimension of the space that you're in, all these sorts of things. But now, it's a lot of work to take that algorithm and put it in as the piece of a data center that decides how many servers you're going to have in which states at which point in time. You can run this algorithm and get an output, but then whatever servers you have in which states, you have to maintain them in those states. You have to do the scheduling of all the workloads you need to keep them in that state. You have to integrate with the metrics to deal with the failures of servers over time, so you know what's active and what's possible and what your constraints are. It's a ton of work to be able to make that go, with lots of design decisions, and that adaptation can go badly or well. We spent five years working on the theory and algorithms piece, and are still working, ten years later, on different versions of online optimization for some of our work. You could also spend five, ten years developing the integration of that into a system. [laugh] Most groups do one or the other. We were really transitioning back and forth between those at different stages, and learning from one about how to reframe the algorithmic problem and make it more relevant, or how to deploy the algorithmic problem in different ways at different levels within the data center design.
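The problem described in this answer is often called smoothed online convex optimization: at each round you see a convex cost f_t and pay f_t(x_t) plus a switching cost for moving from x_{t-1} to x_t. A minimal sketch follows, for scalar quadratic costs and with a simple greedy policy chosen purely for illustration; it is not the specific algorithms from the work discussed:

```python
# Smoothed online convex optimization, toy version: each round t we pay
# f_t(x_t) + beta * |x_t - x_{t-1}| with f_t(x) = a * (x - c_t)**2. The greedy
# policy minimizes each round's hit cost plus switching cost in closed form.

def greedy_step(prev, a, c, beta):
    """Minimize a*(x - c)**2 + beta*|x - prev| over x (scalar, closed form)."""
    if prev < c - beta / (2 * a):
        return c - beta / (2 * a)     # move up, stopping short of the minimizer
    if prev > c + beta / (2 * a):
        return c + beta / (2 * a)     # move down, stopping short of the minimizer
    return prev                       # close enough: don't pay to move

def run(demands, a=1.0, beta=0.5, x0=0.0):
    """Track a demand trace (e.g. active servers to provision) and tally cost."""
    x, total = x0, 0.0
    for c in demands:
        nxt = greedy_step(x, a, c, beta)
        total += a * (nxt - c) ** 2 + beta * abs(nxt - x)
        x = nxt
    return x, total

# A workload that ramps up and back down: the policy lags the per-round
# minimizer slightly rather than chasing every fluctuation.
print(run([1.0, 2.0, 2.0, 1.0]))
```

In the data center setting, x_t might be the number of active servers and c_t the demand; the switching cost captures the price of toggling server states, which is exactly the tension between tracking the cost functions and not moving too much.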
ZIERLER: In creating an algorithm, what are the baseline understandings or theoretical boundaries where if the end goal is, we want to make this data center more sustainable, less energy intensive, how do you go about creating an algorithm that will connect to that goal? What are the starting points?
WIERMAN: I think that's actually one of our good skill sets here at Caltech. [laugh] This is the thing that makes it hard to work in that space, and also one of the things that we're very good at. To give a slightly different answer, it's how do you look at a system design and abstract out the place that's the real bottleneck to making progress, where there's not just a quick hack that will solve it? This requires having lots of discussions with people who are building these systems at different levels, and understanding where they're banging their heads against the wall, where they keep trying to put on a Band-Aid and the Band-Aid doesn't work, because that's a good indication that there's something more fundamental going on that's stopping them from making progress. Then the work is to abstract that from an architectural problem in the code [laugh] to a mathematical problem where, if you solve it, you'll be able to close the loop and make an impact. The way I put it when I'm talking to students is, the right problem for me is one where, if we can prove the theorem, within a year you can actually deploy it and make a difference with it, as opposed to the more classical view of theory or algorithms, where maybe somebody will find an application for it in five or ten years. That's not the sort of theory that we try to work on. We try to work on theory where we already see the path to making an impact in a system. It's a really hard theoretical problem, and hopefully the theory people are also really interested in it, but we see a path back to a system very quickly, where if we solve this, we have an impact. Finding those problems is an art.
ZIERLER: [laugh] Adam, last topic for today, a forward-looking one. If you survey what's happening in your group, the graduate students and the postdocs, what are the kinds of things they're working on right now that might provide some window onto where the field is headed?
WIERMAN: That's great. I think the space we're looking at, coming back full circle to where we started, is this intersection of learning and control. A year or so ago, and still today, most people think of modern AI tools as being model-free. There's no model of what's underlying the system; it's just learning some sequence of activations that seems to work well. Model-free tools have been shown now to work really well in lots and lots of situations. But, at the same time, model-based tools, the ones that make assumptions about dynamics and these sorts of things, are what you need for reliability. They've been deployed in grids and airplanes and lots of places to give us the guarantees we need for these systems to be safe. But they get beaten today by these model-free approaches in the typical case. The big open question is combining model-based and model-free so that you get guarantees but also the benefits of the modern AI tools. Lots of people in my group and other groups around here are basically searching for the right way to combine these tools so that you can push that frontier, and get as good a guarantee and as good a performance as possible. These are hard problems, but I expect major breakthroughs in that direction in the next few years.
ZIERLER: I'm sensing excitement on your part.
WIERMAN: Yeah. [laugh]
ZIERLER: There's a lot of good stuff that's happening.
WIERMAN: Yes. [laugh] I'm really excited by that direction. It's a very, I think, exciting thing that will have impact for sustainability but also lots of other areas.
ZIERLER: Adam, this has been a great overview conversation. In our next discussion, I hope to go all the way back to the beginning; get a sense of where all these interests came from.
[End of Recording]
ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It's Thursday, August 11th, 2022. It is great to be back with Professor Adam Wierman. Adam, once again, it's great to be with you. Thanks for joining me.
WIERMAN: My pleasure.
ZIERLER: Adam, in our first conversation, we took a great tour of your approach to the research, and the big societal questions that are impacted by it. Today, let's go all the way back to the beginning. Let's start first with your parents. Tell me a little bit about them and where they're from.
WIERMAN: I grew up in Baltimore, just north of Baltimore. My parents are from Seattle, Washington, originally, and moved out to Baltimore, basically, soon after I was born, when I was 2. My dad is a professor at Johns Hopkins University there, so I was an academic kid. My mom works in environmental monitoring. I joke sometimes that I'm, despite my best efforts, a convex combination of their interests.
WIERMAN: Working in mathematical tools for sustainability is really right in between what they did, despite trying to do very different things initially when I started college. [laugh]
ZIERLER: Are your parents Seattle natives, or did they meet in Seattle?
WIERMAN: Seattle natives, yeah, Seattle natives.
ZIERLER: What's your father's field? What did he teach?
WIERMAN: My dad teaches applied math, an area called percolation, which is random processes over graphs. Every once in a while, we run into the same crew of people at conferences, although we are in different worlds for 99% of the time academically.
ZIERLER: Did your father involve you in his professional world at all? Did you grow up knowing what it meant to be a professor?
WIERMAN: Sort of. For me, being a professor meant traveling a lot. I would go along with them, and we would add our family vacations on after a conference, or something like that. I would be with my mom doing some touring after the conference or during the conference, and we'd go together as a family afterwards. For me, that's what it meant. It was a lot of touring. I would, over the summer, be down at Johns Hopkins just playing on the quad while he was at work, if I needed to be. I was around the university life, but he didn't really talk math with me. He didn't bring his work home in that way. We talked sports much more than math.
ZIERLER: Now, did your mom maintain a professional affiliation when you were growing up?
WIERMAN: Yeah, my mom worked in air quality management. She worked for the state of Maryland for a long time. Then she was the director of the Mid-Atlantic region's air management association, coordinating training and regulation across states on the East Coast.
ZIERLER: Adam, were you always into computers, even when you were a little kid?
WIERMAN: There weren't computers too much when I was a little kid. But, yeah, because my dad was a professor, we were very early to have computers at home. I spent a lot of time on them, messing around in various ways, growing up.
ZIERLER: What was the earliest computer model that you remember in the household?
WIERMAN: I don't know. The desktop model, I don't remember. But I remember having a laptop very early on that was one of these [laugh] chest-sized devices that could sit on your lap but was pretty heavy, and was nothing like the form factor that we have today but was a portable computer. That was pretty exciting at the time, to be able to carry it around and plug it in.
ZIERLER: Did you get a sense from your dad that computers could do things mathematically and scientifically that pen and paper might not be able to do?
WIERMAN: Yeah, definitely. My dad, in his work in percolation, one of the problems that he was interested in was understanding what are called critical thresholds of lattices. You take a lattice or a graph that has repeated patterns in it (like triangles all meshed together, for example), and you let some process spread across it, and you ask what the probability of spreading has to be for the process to take over the graph or to die out, that critical probability. Some of his work was around isolating those thresholds precisely, and that was often a very computational task of, can you get the first digit? Now, computers have gotten better, and the algorithms have gotten better, and you can nail it down to two or three digits with the code that you can write. He was definitely using a computational approach to mathematics. While I didn't understand it, it definitely came across that he was somehow using the computer for math.
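The kind of computation Wierman describes can be sketched as a toy Monte Carlo estimate of a percolation threshold. The lattice size, trial count, and sweep grid below are illustrative assumptions; real threshold computations are far more sophisticated. For bond percolation on the square lattice, the true critical probability is known to be exactly 1/2.

```python
import random
from collections import deque

def spans(L, p, rng):
    """One bond-percolation trial on an L x L square lattice: open each bond
    with probability p, then check for a top-to-bottom path of open bonds."""
    # right[r][c]: bond (r,c)-(r,c+1) open; down[r][c]: bond (r,c)-(r+1,c) open
    right = [[rng.random() < p for _ in range(L - 1)] for _ in range(L)]
    down = [[rng.random() < p for _ in range(L)] for _ in range(L - 1)]
    seen = [[False] * L for _ in range(L)]
    q = deque((0, c) for c in range(L))   # flood from the whole top row
    for c in range(L):
        seen[0][c] = True
    while q:
        r, c = q.popleft()
        if r == L - 1:                    # reached the bottom row
            return True
        for nr, nc, open_ in (
            (r, c + 1, c + 1 < L and right[r][c]),
            (r, c - 1, c > 0 and right[r][c - 1]),
            (r + 1, c, r + 1 < L and down[r][c]),
            (r - 1, c, r > 0 and down[r - 1][c]),
        ):
            if open_ and not seen[nr][nc]:
                seen[nr][nc] = True
                q.append((nr, nc))
    return False

def spanning_prob(L, p, trials, rng):
    return sum(spans(L, p, rng) for _ in range(trials)) / trials

rng = random.Random(0)
# Sweep p upward and take the first value where at least half the trials span.
estimate = next(p / 20 for p in range(21)
                if spanning_prob(16, p / 20, 40, rng) >= 0.5)
```

With this tiny lattice and few trials the estimate only lands near 0.5, which matches the spirit of "can you get the first digit": tightening it further is exactly where the serious computational and algorithmic work comes in.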
ZIERLER: This intellectual heritage from both of your parents, your mom's environmental focus, even at an elemental stage when you were growing up, did you think that computers and systems and networks and things like that might be relevant for environmental mitigation issues?
WIERMAN: No, not at all. [laugh] Growing up, there was no connection in my mind between what the two of them were doing. I was not so passionate about either the math or the environmental piece until I went off to college, sort of. While I was living in their household, those were not the things that I was jumping at, often.
WIERMAN: I was good at math, and I was good at the science, but it was not the thing that I was really passionate about at that point.
ZIERLER: Now, did you go to public schools, growing up?
WIERMAN: Yeah. I was in public schools all the way through elementary, middle school, high school, and it was great. Now, that's a big part of what I believe for our kids too: public education is really important for society as a whole, and it's really important that we don't isolate kids from well-to-do families from kids from more economically diverse backgrounds. We do that with our own kids as well.
ZIERLER: Now, were you always more on the math and science track in school?
WIERMAN: Yeah, I was definitely good at the math and science stuff. But I don't know whether it was boredom or whether it was just things outside of school were more fun. I was much more of the type of kid in middle school and high school to get the work done, and then go outside and play basketball or soccer or whatever it was, as opposed to really get absorbed in the schoolwork in the way that I became absorbed when I went to college.
ZIERLER: Adam, when it was time to think about college, was Hopkins on your radar—
WIERMAN: Off limits. [laugh]
ZIERLER: —or it was off limits? You didn't want to go?
WIERMAN: [laugh] No, I wanted some distance. At college, the typical schools, MIT, etc., are the ones that I applied to. I think one of the challenges for me as a high school student was after I toured MIT, I didn't enjoy it, and I was much more excited about going to Carnegie Mellon because of the culture and the student life there, and my parents couldn't understand why I didn't want to go to MIT.
WIERMAN: [laugh] There was a lot of discussion before I convinced them that Carnegie Mellon was a good choice as well. By the time I was there a year or two, they were very big Carnegie Mellon supporters.
ZIERLER: Now, was there a northeast application rule for you? Would you have thought to apply to places like Stanford or Caltech?
WIERMAN: No, I was definitely biased towards the East Coast. I don't think I applied anywhere on the West Coast.
ZIERLER: Was it computer science from the beginning? Were you thinking about colleges not within specifically a computer science framework?
WIERMAN: No. My college time was very exploratory. I started off thinking that I wanted to do civil engineering because I wanted to be an architect but wasn't a good enough artist, and so I thought—
WIERMAN: —civil engineering was a good way in. Then I had a good time in civil engineering but also felt that it was not quite what I expected. During college, I went through civil engineering, psychology, statistics, math, and then finally declared a CS major my senior year, and ended up graduating with some sort of majors or minors as specialization in all of those areas when I was finally done with undergrad. It was a long, winding path that gradually got me to CS.
ZIERLER: Adam, just the timing, when did the internet come along that might've really influenced the way you thought about computers and what they could do?
WIERMAN: In middle school, I think, already I was using the internet but in the sort of LISTSERV, email, text-based format, playing around with it, and investigating things. That was fun, and I wrote a little bit of code even at that point to try things out. Then definitely in high school, I was writing things for my fantasy baseball team, and writing code to search baseball player stats, and make my organizations for doing that sort of thing.
ZIERLER: As an undergraduate at Carnegie Mellon, what were some of the big ideas in computer science that you remember?
WIERMAN: The course that got me to really realize that computer science was what I wanted was a course taught by Steven Rudich, who was a really special teacher at Carnegie Mellon. He taught a class called Great Ideas in CS, and it was, in some sense, meant to be an introductory discrete math course, like Ma 6 at Caltech. But he taught it in a way that each lecture was basically a performance, a vignette on its own, a story. It's funny. He would start every class with a magic trick 10 minutes before class in the room—
WIERMAN: —like a full-on magician-style magic trick that was amazing. Anybody from any other class would come and watch the magic trick, the classroom would be packed, and then everybody would run off to their classes if they weren't in his class. [laugh]
WIERMAN: Then it meant that everybody was there on time, and everybody was super excited. Then he would start with a flourish after that. Each lecture was really designed to be a very fun problem-solving exercise, introduce some important part of discrete math, whether it be induction or recursion or generating functions, or something like this. But it was always done through a really creative game or interactive thing that you played. The homeworks in that class were notoriously difficult, but they were challenging, and forced you to be creative with math in a computational way. I just loved it, and basically talked to him through the class, and realized that that's what CS was. It wasn't just programming; it was thinking in that way. I was on board at that point. I worked hard to be able to transfer into CS.
ZIERLER: This was your first entrée into the creativity that goes into computer science?
WIERMAN: Yeah. Before that, computer science to me was programming, which was fun but it was a tool, and I didn't see that as something that I wanted to spend my life doing. But this showed me that it's more than just programming simple games and things like that, which is often what you do in your intro programming classes, or at that point what intro programming meant.
ZIERLER: The creativity aspect, would that be the intellectual kernel of you starting to think about applying computers to societal benefit?
WIERMAN: Yeah, starting to think of it as maybe a little bit of that but more just the intellectual puzzle involved in figuring out how best to do something, figuring out what you can and can't do with computers, the algorithms and complexity side of CS. Once I got drawn in by that, then it was, can I figure out algorithms that are useful that can make society better? But the first nugget was just the intellectual challenge of understanding what's possible with algorithms; what's not possible with algorithms; how efficient can they be?
ZIERLER: Adam, there's generally a cultural encouragement to go elsewhere for your graduate work.
ZIERLER: I wonder if you were a relatively latecomer to CS as an undergraduate, if that influenced your decision to stay?
WIERMAN: That was a big reason why I felt comfortable staying. I felt like I had just gotten my feet wet at Carnegie Mellon, and it was this amazing place, and there were all these people that I was just starting to get to know that I wanted to have a chance to work with and explore. It was hard to walk away from that to go somewhere else when I felt like I hadn't taken advantage of the amazing place where I was.
ZIERLER: The master's, was it a terminal master's, and then you went onto the PhD, or that was incidental on the way?
WIERMAN: That was incidental along the way at Carnegie Mellon. I went straight into the PhD program, and then the stepping stone was the master's along the way.
ZIERLER: Tell me about your advisor, Mor Harchol-Balter, how you got involved with what she was doing at the time.
WIERMAN: Mor is a really energetic, enthusiastic person. She's really a great mentor. At Carnegie Mellon, there's this process, which is really nice and intense, when you start the PhD program, where for, I think, two or three weeks, the faculty all present 15-minute research pitches, and the first-year PhD students watch them. That all happens in the morning. In the afternoon, you meet with faculty. Then at the end, you submit a list of who you want to work with. It's a two-way matching process for advisors, rather than coming in already matched to an advisor, which is what a lot of schools do. That was also another reason why I liked the idea of going to Carnegie Mellon, because I didn't know exactly what I wanted to do yet, because I was so new to CS. This matching process let me put that off.
But Mor, her talk was very much like Steven Rudich's model: extremely interactive, and just giving us puzzles. She basically challenged us, "Here are interesting puzzles in my research area. If you solve one, come and talk to me about it, or if you're interested in solving one, come and talk to me about it, and we'll work together on it, and that'll be the way that we get introduced." She got me sucked into a couple of different puzzles that were really interesting, and two of them actually led to research papers in my first year with her. She was really good at getting you motivated to work. Also, one thing I really like about her group is she cares about fully developing her students. This was something that I guess I knew was important, partly because my dad was an academic. It's not just, who's working in the area you're working in? It's, will they help you become a better writer, a better presenter, a better speaker, all of these other things? She really works with her students intensely on all of those aspects, not just giving advice on what problems to work on.
ZIERLER: Tell me about those initial puzzles that became papers in your first year.
WIERMAN: They were puzzles. A big part of my thesis was in some sense motivated by one of them, which is a very simple scheduling problem: if you have a single server and jobs arriving over time to that server, and you want to schedule the jobs to minimize their average response time, which is how long on average they have to wait before they finish service, how should you order them? What order should you do them in? There's a very long-standing result, which is that if you want to minimize the average response time, you don't actually need to know anything about the distribution of sizes of the jobs or their arrival process or anything like that. No matter what those are, you should schedule them in order of shortest remaining processing time. Whichever job has the least left to do, you do that one first. If a new job arrives and it's bigger than the one you're working on, you just let it sit in the queue. If it's smaller, you switch to it right away, and start working on it. A really beautiful proof shows that that's optimal. That was one of the things she asked: "This is optimal. Can you prove it for me?"
ZIERLER: How do you define "optimal" in this context? Is it just efficiency?
WIERMAN: The smallest possible average response time. You take the time from when each job arrives until it finishes service. That's its response time. You average that over every job that's going to come. That's the mean response time. If you want to make that the smallest, you always do things in shortest-job-first order or shortest-remaining-processing-time-first order. But the question at the time was, people often didn't use this policy in practice, despite it being much better in terms of response time, because they were worried about fairness. They were worried that the large jobs, which are often important, might get starved of service and never get finished. In that case, you're in really bad shape. The big-picture question was, is that really true? Is it really true that this is unfair to large jobs? But, of course, to answer that question, you have to define what "fairness" means, and then prove results about that.
The first few years of my work were really about understanding fairness, putting down axiomatic definitions for how you can quantify what fairness means in job scheduling, and proving that, in fact, unless you're in an overload situation, SRPT is fair to the largest jobs, that it really works very well, that these concerns people had were largely unfounded, and that you could make little changes to it to improve the fairness even further. But even out of the gate, without those changes, it was just as fair, by the standard definition, as the fair policy that people were using instead of it. That was one of my first big results. Then, as a result of that, the large companies running data centers and such were willing to start using size-based policies in their systems. A lot of my work in my PhD thesis was proving results about size-based policies: how to make them more practical; how to improve their efficiency; how to use prioritization in a smart way where you're still able to guarantee different service requirements or different data-locality requirements, things like this that come up in real systems that aren't in that simple model.
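The comparison at the heart of this work can be sketched with a toy single-server simulation. This is illustrative only; the model and job trace below are assumptions for the sketch, not the systems his group built.

```python
import heapq

# Compare mean response time on one server under first-come-first-serve (FCFS)
# and shortest-remaining-processing-time (SRPT). jobs: list of (arrival, size).

def fcfs_mean_response(jobs):
    """Run jobs to completion in arrival order."""
    t = total = 0.0
    for arr, size in sorted(jobs):
        t = max(t, arr) + size            # start when server frees up
        total += t - arr                  # response time = finish - arrival
    return total / len(jobs)

def srpt_mean_response(jobs):
    """Preemptively serve whichever job has the least work remaining."""
    jobs = sorted(jobs)
    i, t, total = 0, 0.0, 0.0
    heap = []                             # (remaining_work, arrival_time)
    while i < len(jobs) or heap:
        if not heap:                      # idle: jump to the next arrival
            t = max(t, jobs[i][0])
        while i < len(jobs) and jobs[i][0] <= t:
            heapq.heappush(heap, (jobs[i][1], jobs[i][0]))
            i += 1
        rem, arr = heapq.heappop(heap)
        next_arr = jobs[i][0] if i < len(jobs) else float("inf")
        run = min(rem, next_arr - t)      # run until done or next arrival
        t += run
        if rem - run > 1e-12:
            heapq.heappush(heap, (rem - run, arr))  # preempted: requeue
        else:
            total += t - arr
    return total / len(jobs)

# One big job followed by two small ones: SRPT lets the small jobs jump ahead.
jobs = [(0, 10), (1, 1), (2, 1)]
# fcfs_mean_response(jobs) -> 10.0, srpt_mean_response(jobs) -> 14/3 (about 4.67)
```

The trace makes the fairness question concrete as well: the size-10 job finishes at time 12 under both policies here, while the small jobs' response times collapse under SRPT.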
ZIERLER: In a CS context, is the notion of fairness zero-sum? Is it by definition fair to one entity and unfair to another, and you have to make those decisions?
WIERMAN: No, there are lots of different notions that you might concern yourself with. In this particular context of job scheduling, there were, actually, four or five different definitions of fairness that came out, and we were able to prove relationships between them, and understand which ones were consistent and which ones were inconsistent, these sorts of things. Some of them were very axiomatic, based on making sure that in a given length of time, you're getting one nth of the service if there are n jobs in the system. Some of them were Pareto-optimality notions. Like, if I look at jobs of size x, the minmax of the response time for jobs of that size should be at a particular level. If you're achieving that, then you're fair, because that's the best minmax that any policy can have. That's the notion in which SRPT is as fair as anything: SRPT has the same minmax performance over job sizes as any policy ever could. It's also strictly better, in a Pareto sense, for every job size than the purely fair policy of always dividing the server evenly among every job that's present, which is often the notion of fairness people use. If we're always dividing things evenly among everybody, that gives a certain level of performance for every job-size class, and SRPT is better for every job-size class than that policy. There's no job-size class that does worse, and that was a very surprising result for people when we proved it.
ZIERLER: Adam, just to clarify, this is not a purely theoretical pursuit. This is industry-relevant.
ZIERLER: People want this kind of research in their businesses.
WIERMAN: That's right, and so this was, I think, another piece of what attracted me about Mor's research agenda was that she had that theory for practice mindset, where the philosophy that I have now, and that she taught me, to a large extent, was that there's lots of ways to do theoretical work and algorithmic work. A lot of it is very deep and important, and may find application in 5–10 years. But there's a different kind of theoretical work, which is the type that, if it's successful, will have application in a year or two to real systems. That's where I tend to like to live. This was work where there was a bottleneck. There are these way more efficient scheduling policies that people are scared to use because they don't want to be unfair to jobs. Is that a real thing? If it's not, then they can immediately start to deploy these types of policies, and improve their response times by a factor of 2 or 3, which is a huge win when you're talking about latency in streaming, or latency in cloud email type jobs, things like this.
ZIERLER: Now, was Mor's approach, where she had an eye toward industry and applications, was that representative of the culture of CS at Carnegie Mellon generally, or was she more unique in that regard?
WIERMAN: She was pretty unique in that regard. Carnegie Mellon has a huge CS program, compared to Caltech. Their school of CS is as big as Caltech is as a whole.
WIERMAN: There's a wide variety of cultures in CS.
WIERMAN: [laugh] There's people like her, and there's also people at every extreme. There's people on every corner of CS at Carnegie Mellon.
ZIERLER: Even from these first papers that gave you a pretty good taste of how this would actually translate to real-world relevance?
WIERMAN: That's right. It was really exciting because in her group, there were some students that were very practical, and so when I would prove these results, I could then work with them, and we would deploy a system, and then that system would be a good demo that we could show to companies. I could spend a lot of time on the proof side but, also, very quickly, see a deployment by working with other students in the group.
ZIERLER: From these first few papers, did you have direct interface with industry, with potential clients? Did you see where this was headed?
WIERMAN: A little bit, but as a first- or second-year student, I wasn't directly hands-on with the companies as much. But I had a few conversations. It was more the senior students and Mor that were working with the industry partners. But then, later on, there was definitely a like, "Go and visit industry labs, and talk to them about what you're doing, and try to get them excited about trying out some of the algorithms that you develop."
ZIERLER: To go back to that original offer from Mor that was so compelling to you—here are a few puzzles; let me know how you do with them—how did those initial puzzles relate to some of the bigger research questions she was pursuing at that point?
WIERMAN: One of the puzzles was just to prove SRPT was optimal. It's a beautiful proof, so as soon as you see it, you just want to work more with this policy. Then I showed up at her door with the proof, and showed it to her, and she's like, "Yeah, that's right. Now, what about fairness? [laugh] Would you actually want to use this? How are the large jobs treated?" That led to a research problem. That was day one, and then, I guess, about six months later, we published our first paper, which showed that the asymptotically largest jobs were treated fairly. Then a few months after that, we were able to prove that all jobs were treated fairly.
ZIERLER: I know that asymptotic has relevance in particle physics, of course.
ZIERLER: What does it mean in CS?
WIERMAN: It means lots of different things. In this case, it meant, if you let the job size go off to infinity, the infinite-sized job was treated fairly. Intuitively, that was an indication that all jobs should be treated fairly, because the largest one should be the most likely to be jumped in front of by small jobs. It turned out that that wasn't actually true. It turned out that fairness wasn't monotonic in job size: you had this interesting hump, and then it came back down once you got into the asymptotic regime. But our motivation for looking at it was that we thought that might be the worst case in terms of unfairness, and so we figured out how to do that. There were some interesting coupling arguments from applied probability. Then after we did that, we started to work on the rest. We uncovered this other phenomenon, very unintuitive at the time, that the medium-sized jobs were actually the most unfairly treated by SRPT, because the large jobs stayed around long enough that they were there when the system was empty other than them, and so they got lots of service then. Whereas the medium jobs could get jumped in front of, and not have that empty period to catch up afterwards, potentially.
ZIERLER: Adam, you alluded to it a little bit. Tell me the intellectual process from those initial research puzzles into what ultimately was your thesis work.
WIERMAN: There were a couple of different threads, but that got me into scheduling and SRPT, and then the thesis work was looking much more broadly at scheduling than just one particular policy. It was trying to understand broad classes of policies, and understand the impact of practical things that are often missed by theory. Scheduling and queueing theory were these big areas in the '70s and '80s, into the '90s, and they tended to make very simplifying assumptions about the systems that they were studying. Especially nowadays, there are multiple servers. There's data locality. There are delays. There are start-up and set-up times as the servers move in and out of power-saving modes. There's uncertainty about job sizes, so that you don't know exactly how long a job is. There are failures that happen with servers that you need to be robust to. The simplistic scheduling policies and simplistic models didn't really capture the real systems, and so, in some sense, the goal was, how do you modernize the theory in a way that makes it applicable to the systems that are out there in the world today, and solves the problems that industry practitioners are facing?
ZIERLER: Adam, I'm intrigued by the idea that these were overly simplistic theories. What does that tell us about some of the biases of the theorists?
WIERMAN: [laugh] They were simplistic in the sense of the models making simplifying assumptions, not in the sense of the technical work being simple. The reason the models were simple was that the work was very technical and challenging, and it had to be done in these simple models before you could even hope to go to the more complex policies and the more complex settings. But there had been a lot of developments on the technical side, especially around one of my favorite topics, heavy-tailed distributions, which meant that you could apply the work in settings where it hadn't been applicable before. Heavy-tailed distributions are another thing I should bring up as an entry point between me and Mor that was very appealing, and it shaped a lot of what I've done. I had seen a little bit about heavy tails in my undergrad, but not a ton. One of her probing questions was around what you expected the distribution of job sizes to be, and why, and what properties that implied for scheduling. What did that imply for what you wanted to do in terms of scheduling? To walk you through one of those things that I think is particularly clever: real-world job sizes are heavy-tailed. They tend to be Pareto-like or power-law-like, which is very different from the Gaussian world that you're taught most of the time in undergrad probability courses.
That actually makes a huge difference for how you design systems and how you design scheduling policies. For example, on the job scheduling side, if you're scheduling jobs that are all Gaussian distributed, then they're all very close to the same size, and that means that just doing things in first-come, first-serve order works pretty well. Whereas if you have heavy-tailed distributions, you have some really big jobs and lots of really small ones. If you do things in first-come, first-serve order, then you might have 10 small jobs get stuck behind a really big job, and now all of them have a massive delay because of that big job. Whereas if you had just done the big job last, the 10 small jobs would have really tiny response times and be out of the system, and the one big job would have a large response time, but you'd be much better off overall in terms of system performance, and have much less work in the system at any given point, and much less storage that you need, and all of those things. Everything would be a lot better. How you schedule when you have heavy-tailed jobs is very important, much more so than when you have light-tailed jobs. You can take advantage of the fact that you know there are going to be some really large jobs and lots of really small ones to do things in your scheduling policy and your system design that are very different and very interesting and simple and effective.
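The heavy-tailed versus light-tailed contrast Wierman describes can be illustrated with a quick sampling experiment. The Pareto shape parameter and sample size here are arbitrary choices for the sketch.

```python
import random

# How much of the total work is carried by the biggest 1% of jobs, under a
# heavy-tailed (Pareto) versus a light-tailed (exponential) size distribution?

def top_share(sizes, frac=0.01):
    """Fraction of total work contributed by the largest `frac` of jobs."""
    s = sorted(sizes, reverse=True)
    k = max(1, int(len(s) * frac))
    return sum(s[:k]) / sum(s)

rng = random.Random(1)
n = 100_000
pareto_jobs = [rng.paretovariate(1.5) for _ in range(n)]  # heavy-tailed
expo_jobs = [rng.expovariate(1.0) for _ in range(n)]      # light-tailed

# Under the Pareto sizes, the top 1% of jobs carries a large chunk of the
# total work; under the exponential sizes, only a few percent.
```

That skew is exactly why the scheduling order matters so much: with a few huge jobs dominating the workload, letting the many small jobs jump ahead barely delays the big ones while drastically cutting everyone else's wait.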
ZIERLER: Adam, as you mentioned, early in the graduate program, you didn't have much interface with industry, but that happened a few years later. What were the kinds of businesses that were showing interest in what you were doing?
WIERMAN: Scheduling in the cloud is the main one that I was working on at the time. This is your data centers deciding how they do load balancing, how they do the job scheduling in the web server side, for example. In our group, we released a couple of systems around web servers and video streaming and database management that all showed that size-based scheduling could be effective there. Those were our entry point to talking with the companies.
ZIERLER: Did you ever work on site? Were you ever in the mix to see what was happening, how these things were being applied?
WIERMAN: No, as a grad student, somehow I didn't want to do that. As a grad student, I would visit. I would talk to them about the ideas, and I would be very happy if they implemented them rather than me going there to do the implementation. I wanted to be proving my theorems at Carnegie Mellon. [laugh] I did a lot of that. It was a visit to talk to them about the ideas, and to present the way it works, and to show our demo, and then let them take over from there.
ZIERLER: Adam, what was Mor's style like as a graduate advisor? In other words, from the initial contact of essentially handing you a puzzle, did that dynamic remain through the dissertation or, at some point you were coming up with your own puzzles?
WIERMAN: Very much coming up with my own puzzles. She had the model of, early on, she presents problems. Then you see which ones you're interested in. But by the time you were in year 2 or 3, you were coming up with questions, and working as a peer with her. She's very hands-on. She works with you on the whiteboard. You have long weekly meetings where you're doing technical work together, and writing the paper together, and writing the code together, kind of thing. She's very involved with her students, which is really nice.
ZIERLER: Clearly, you had a lot going on with your thesis research, a lot of different topics. Would you say the thesis was more of the many papers stapled together variety, or was there a single overarching theme that everything gathered around?
WIERMAN: It was a single overarching theme. I actually put a lot of work into that. On the personal side, I had time to write a really involved thesis, so I put a lot of work into writing a thesis that was very much book-like in terms of providing an overview of how to think about scheduling. The general thing was how to think about doing analysis of scheduling policies where you're analyzing axiomatic classes of scheduling policies instead of specific policies. The motivation for that was that if you're analyzing every policy that prioritizes short jobs, then it doesn't matter whether a given policy has lots of complexities because of the system implementation or not. It still follows that axiom, and so the analysis holds, and the guarantees hold, and the bounds hold, and everything that you've proven still holds. Even if there's a lot of miscellaneous things that need to be done in a hacky way to make it work in the system, the theory still applies. For the thesis, I overviewed a lot of the standard techniques, and then showed how modern techniques let you get parallel results but for broad classes of policies instead of for a specific policy. We did this for quality-of-service-type results around the 95th percentile; for average-case results; for distributional results; for fairness results; and for predictability results, latency-type predictability results. We had chapters and themes, but all of them were looking at analyzing the same sort of new axiomatic classes of policies.
ZIERLER: What is an axiomatic class? What does that mean here?
WIERMAN: Here, it's inspired by the economic way of thinking. If you think of axiomatic notions of what a voting rule should be like, it's the same sort of thing, but for scheduling policies. For example, can you write down a set of axioms for what it would mean for a policy to prioritize small jobs over large jobs, and have that be in a strong enough sense that it's near optimal? Because SRPT is one policy that prioritizes small jobs over large jobs, but that's a very specific one that requires you to know exactly the remaining time and the original size of jobs, and requires you to be able to switch whenever you want between jobs without paying any cost or having any lag. How broadly can you define a set of policies that is like that in spirit, and has performance that is still near optimal and not much worse, but includes as much variety of variations as possible, and so some notion of approximately always prioritizing small jobs? You don't want it to be so strict that it can't include things that are hacky versions of it, but you want it to be in that spirit. It tended to be trying to write down very simple axioms, two or three for each class, that characterize a fair policy or a small-jobs-first policy or a large-jobs-first policy, and then see what you can prove about them.
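As a reference point for what SRPT itself does, the specific policy those axioms try to generalize, here is a minimal discrete-time sketch. It's an illustrative toy assuming integer arrival times and sizes, not code from the thesis or any real scheduler.

```python
def srpt_response_times(arrivals):
    """Discrete-time SRPT (Shortest Remaining Processing Time) on one server.

    arrivals: list of (arrival_time, size) pairs, non-negative integers.
    Each time step, one unit of work goes to the arrived, unfinished job
    with the smallest remaining size, preempting whatever was running.
    Returns each job's response time (completion - arrival) in input order.
    """
    remaining = [size for _, size in arrivals]
    completion = [None] * len(arrivals)
    t = 0
    while any(r > 0 for r in remaining):
        # Jobs that have arrived by time t and still have work left.
        active = [i for i, (a, _) in enumerate(arrivals)
                  if a <= t and remaining[i] > 0]
        if active:
            i = min(active, key=lambda j: remaining[j])  # shortest remaining
            remaining[i] -= 1
            if remaining[i] == 0:
                completion[i] = t + 1
        t += 1
    return [completion[i] - a for i, (a, _) in enumerate(arrivals)]

# A size-1 job arriving at t=1 preempts a size-5 job that started at t=0:
print(srpt_response_times([(0, 5), (1, 1)]))  # [6, 1]
```

Under first-come, first-served, the small job would instead wait for all five units of the big job. Notions like "approximately prioritizing small jobs" relax the exact `min` rule here while trying to keep the same performance guarantees.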
ZIERLER: Would you say that the research was responsive, as you mentioned earlier, this need to problematize the theory so that it more accurately conveyed the complexity of real-world situations?
WIERMAN: I think part of what I was driven by was the idea that there's one version of applying this scheduling theory to practice, which is: SRPT is good, so you go off and, in your policy wherever possible, try to prioritize small jobs. But, at the end of the day then, whatever policy you have, there's no rigorous bounds that I can give you on the performance. There's no prediction I can give you about the distribution of the performance you're going to have or anything. It's like you get inspiration from the theory as opposed to the theory actually being actionable in the system design and operation. The hope was that if you have classes of policies, then the policy you implement is still in one of these classes, even though you've done crazy things to make it work in your system. Now, I can still give you a way of giving a QoS bound. I can still give you confidence intervals on the performance you should expect, so that you can identify when something's going wrong by whether it's outside of that confidence interval. I can still give you all those tools that you can use in real time for your system, as opposed to just leaving you be once I give you the inspiration to prioritize small jobs.
ZIERLER: Adam, in our previous conversation, we had a great overview of how special Caltech undergraduates are, just when you look at the numbers, and their interest in the fundamental research, even when there are these massively—
ZIERLER: —high-paying jobs that are dangling in front of them. Would you say that you were similar? Did you have that kind of approach at Carnegie Mellon?
WIERMAN: I did. I graduated from undergrad during the dot-com bubble, that era, and there were not many people from Carnegie Mellon going to graduate school at that point. I forget the exact numbers, but something like 5% or 10% of the class, at most, was going to grad school. Everybody was going to industry, similar to the world today where there were lots of available high-paying jobs in industry that were attracting people.
ZIERLER: How centralized was Silicon Valley at that point? In other words, if you're graduating with a PhD from Carnegie Mellon in CS, is everybody headed to Northern California, or it's more dispersed than that?
WIERMAN: It was maybe even more centralized than it is now because now there's a bunch of different places where you can go for industry things. At that point, a lot of the start-ups that were really exciting were all just in Silicon Valley, and so there was a huge channel going from Carnegie Mellon to the Bay Area.
ZIERLER: Was there active recruiting on campus?
ZIERLER: Did you have to fend off interest in that kind of thing?
WIERMAN: [laugh] Yes, there were plenty of opportunities on that front, but I just wasn't interested in that at that point. I knew what I wanted to do. I wanted to go to grad school, and to be a professor.
ZIERLER: Even by the time you were thinking about finishing up the PhD, were you considering industry at that point, or was it very much the same track?
WIERMAN: From junior year or so in undergrad, I knew that I wanted to be a professor, and the hope was that I could pull it off. Maybe more than most, I knew that that was not a given. It was not a given to go to grad school, and come out the other side with a professorship. But I knew that's what I wanted to do. I really liked teaching and mentoring, even as an undergraduate and TA-ing and participating in that sort of thing. I wanted to be in an environment where I could do that. I had it in my head that even if I was not at a place like Caltech or MIT, I wanted to be there rather than in industry.
ZIERLER: As you mentioned, because the thesis was book-like in its intellectual cohesiveness, what was the overarching argument? What was the contribution you were making, would you say?
WIERMAN: The argument was really that it's possible to have scheduling policies, and analyses of scheduling policies, that are relevant and actionable in the operation of modern systems, and that you could do so not by analyzing a detailed specific model for a specific system but by analyzing these axiomatic classes that would apply to every system all at once, as opposed to going one-by-one and analyzing the precise details of each system. Instead, let's just say, axiomatically, we want policies to work like this. This class will let us analyze Google's systems, Facebook's systems, Microsoft's systems all at once.
ZIERLER: Where have you seen applications of this idea? Where has it gone off and achieved a life of its own?
WIERMAN: Actually, on the academic side and the industry side, I think it did very well. In industry, it's a pretty common practice now to be willing to do size-based scheduling, so you don't have to make an argument that this is the right thing to do. Most system designs have prioritization of small jobs in some way to avoid having them sit behind large jobs, and there's a recognition that that's what you have to do when you have heavy-tailed jobs. That just wasn't the case before we had that work. The idea of classes, it's fun. Actually, just a week ago I was sitting on the dissertation committee for one of Mor's recent students. He redid my thesis but better, in some sense [laugh]—
WIERMAN: —after 15 years. He has a really beautiful class that's even broader than the ones that I was able to do where you can prove just really deep results. The idea of scheduling, studying scheduling classifications instead of studying scheduling policies, has lasted at least 15 years, and people are still coming up with creative new ways to do that.
ZIERLER: I'm sure you're not being fair to yourself. It's probably more accurate to say that it's better as a result of some of the things—
ZIERLER: —you were doing 15 years ago.
WIERMAN: It was great. It's the next stepping stone, but it's a really beautiful work that he did, building on the stuff that I did then. It's fun. Really, there's a lot of parallels between each of his results and things that he extends and generalizes from what I had done at the time, 15 years ago. It was really fun sitting on the committee, and seeing him present it.
ZIERLER: Adam, I know that sustainability and resilience, that comes online for you later on when you're at Caltech.
ZIERLER: But I wonder at a more elemental level if some of the things that you were thinking about in graduate school just apply in these realms because it's efficiency, at the end of the day.
WIERMAN: At the time, energy wasn't something that I was thinking of as a motivator. My last year in the PhD, I was starting to think about what's next. It cropped up a little bit. But, yes, efficiency helps. If you're able to finish jobs quicker, then your server can sleep more, and you're more efficient. There's certainly a tie in terms of making these data centers more efficient as an important thing. But, at the time, it was really more about just getting performance better. The motivation was web servers, video streaming, all of these things. Latency and lag was still very noticeable and very frustrating for users and the system operators, and so there was a big push to just get rid of that.
ZIERLER: Tell me about the time, circa 2006–2007. Were data centers, were industries that were involved in this, were they already expressing concern about energy consumption? Was that already happening?
WIERMAN: The data centers not so much; the observers of data centers, yes. There were lots of studies coming out around that time, starting to point out for the first time how much energy data centers were using. It was crossing the 1% or 2% mark of total electricity usage, so people were starting to take notice. There were a couple of big government-sponsored projection surveys that were just being started and being talked about in 2006, and then coming out in 2007. That's where it really started to catch people's attention. It was definitely not an industry-driven recognition. It was a government- or Greenpeace-style recognition, coming from the outside in rather than from the inside out.
ZIERLER: I'll test your memory. Besides Mor, who else was on your thesis committee?
WIERMAN: Alan Scheller-Wolf was a very frequent collaborator during my PhD. He was a business school faculty member. For myself, I was not sure when I was graduating whether I wanted to go to an OR department or a business school or a CS department. I applied to all of them. I would've been happy in any of them but, at the end of the day, I was trying to decide between going to a business school and coming here to Caltech. It was not an obvious choice for me at the time, and so Alan was a good source of advice for that side of life, the OR and business school side of life. Alan was there; John Lafferty, who is a statistician; and then I think Anupam Gupta was my fourth one but I don't remember for sure. I might have to look at my CV for that. You might have it up so you can tell me if I'm right.
WIERMAN: Am I missing—am I wrong?
ZIERLER: Maggs, Scheller-Wolf, and Ward Whitt.
WIERMAN: Oh, Bruce Maggs, oh, and Ward Whitt. I'm forgetting Ward. Ward was great. I didn't have Anupam. Bruce Maggs was a very inspirational person. He was one of the founders of Akamai, which is a big content distribution network. He's a theoretician and algorithms person, but he started a company based on routing algorithms that changed the world in terms of networking. He was a big inspiration for that. Ward Whitt is one of the godfathers of scheduling and queuing theory, one of the most published, most cited people in that area, from Columbia, who I got to know because Mor did a sabbatical at Columbia, and I spent a lot of time there while she was on sabbatical.
ZIERLER: Adam, a year or two before the dot-com bubble, when things were still going really strongly in Silicon Valley, when you were thinking about academic positions, was the academic job market similarly strong? Would you see that parallel in interest from institutes of higher learning?
WIERMAN: It was, and then it tanked. While I was in grad school, the academic job market was extremely strong, and the year that I went out, it disappeared. I was pretty unlucky, in some sense, lucky in getting the job that I had—it was perfect in the way it worked out—but unlucky in the sense that the year that I was graduating, academic positions were few and far between, and were disappearing. People would post ads, and then pull them down because the position would be going away because of the crash that was going on.
ZIERLER: Oh, wow.
WIERMAN: It was a situation where a few years before, people would be getting dozens of interviews everywhere, and then the same type person the year I was out would just get a couple because there would just be 5x, 10x, a lot fewer positions available.
ZIERLER: How much of that was just about the general crash of 2008, and which was specifically tech-based?
WIERMAN: It was just that crash, and it was the same thing that happened a few years ago with the housing crash, where endowments tank, hiring positions disappear because the schools don't have the money for the start-ups. That just happened to be the case the year that I was graduating, and the year after, for the academic market.
ZIERLER: Adam, I've heard it direct from Jean-Lou Chameau about his approach to navigating during the crisis, which was, "We're hiring. We are staying strong. We're going to dip into our endowment." Did you sense, to the extent that you were interviewing other places, that Caltech's response to the crash was unique?
WIERMAN: The places that were interviewing were the places that were doing that. [laugh]
ZIERLER: Sure. [laugh]
WIERMAN: The places that were still hiring either had a need, like, people were leaving, and they just couldn't afford to not hire, or they were dipping in. It wasn't distinctive compared to places that I was interviewing, but it was definitely distinctive because they were interviewing, and I think that was important. It was clear, when I was interviewing here, that the market was crashing, but they were still wanting to grow, and they weren't going to let the market crashing impact their plan, the fact that they were going to grow.
ZIERLER: During graduate school, did you have a sense of CS at Caltech? Did that loom large in your mind at all, what was happening here?
WIERMAN: No, I didn't know Caltech when I was in graduate school. That's not completely true. I knew Caltech because a couple of my peers had been undergrads at Caltech, and they were brilliant. I had this idea that brilliant people come from Caltech.
WIERMAN: Like, everybody I had ever met from Caltech at Carnegie Mellon was just a brilliant person. But I didn't know the faculty in the department. It was just one of those schools that was on my list to apply to because Mor said it was good, and so I should apply there. I didn't know the details of the department at all, and so it was a lot of homework before I came out. Caltech was my first interview. I must say, I was a bit intimidated because two of the people here were really big names that I hadn't known personally but was aware of; big names who had been in the area that I was in, and then moved to other areas. I wasn't seeing them at conferences, but I knew about all their papers, kind of thing.
ZIERLER: This amazing contrast that you drew earlier about how CS at Carnegie Mellon is bigger than Caltech in its entirety, what did you see as some of the opportunities and challenges in that stark of a divide? In other words, Caltech is small as a whole, but if you look at geology or physics, it's really not that much smaller than peer programs. But, clearly, for CS—
WIERMAN: [laugh] Yes.
ZIERLER: —this was an enormous distinction. What was going through your mind as you were making those decisions?
WIERMAN: Actually, it was a plus for me. It was one that I didn't understand but it was exciting. I didn't really know, in retrospect, what I was walking into, walking into a small department, as small as it was. But I liked the idea of being in a place where you had intellectual room. I think one thing that the interview made clear and that all my intern visits made clear was that there was just not going to be pressure to be a traditional networking person or a traditional algorithms person or a traditional whatever here.
ZIERLER: Meaning you just couldn't be siloed? It's just too small?
WIERMAN: I couldn't be siloed. The big departments especially in CS where I was looking, the advice that you were hearing about tenure was make yourself known in your area, whatever that is, and then do that thing because that's how you're going to be evaluated for tenure, and that's how you're going to recruit students. That was the approach that all the junior faculty at those schools were doing. I didn't like that. As mentioned, I liked being in OR. I like doing stuff that overlaps with business. I like doing stuff that overlaps with applied math. I like not being constrained in how I think about problems, and where I publish my work, and who I work with in that way. I got the sense, rightfully so, that at Caltech, they would give me flexibility, so while I was going to be in a CS department at the time, it was not a traditional CS [laugh] department by any means. I could really be free to run any research style and research agenda that I wanted under that umbrella, and no one would look, and no one would care. Everybody would just cheer me on, kind of thing.
ZIERLER: The general theme that we talked about in our previous conversation at Caltech's insistence on supporting junior faculty, was that something that was apparent to you, even in the job talk process part, even before you joined the faculty?
WIERMAN: It was apparent to me, I think, in the second visit. The first visit for CS interviews is really focused on explaining what you do. The first visit, I gave my job talk. What was clear to me from the first visit was everybody wanted to know details. At many schools, you'll show up, and, especially with people outside of your area, you'll shoot the breeze. You'll talk very informally about stuff. Here—
ZIERLER: This was technical?
WIERMAN: This was technical. I still remember my first meeting because I got up in the morning super early in Pittsburgh. A blizzard had happened the night before, and I spent half an hour scraping off my car window, getting the ice off, and then driving to the airport. Then I show up around lunchtime on campus to 90-degree weather in January. It's sunny.
WIERMAN: I go check in at the Ath, and I check into the Einstein suite.
WIERMAN: I walk up and I just see all this Einstein memorabilia, and get very intimidated. Like, what is this place? [laugh]
WIERMAN: This is the world I'm walking in. I go down to the Ath, and I meet Steven Low for lunch. He says, "Hi." Then the next words out of his mouth are, "I'm going to miss your talk, so here's some paper. Can you show me your main theorem and how to prove it?" [laugh]
WIERMAN: I was like, "OK." [laugh] That was it. Basically, it was technical the rest of the day. That was just a very different experience than any place that I went in the level that everybody wanted to understand exactly what I had done, and how I had done it. It was exciting and fun to talk about the research in that way.
ZIERLER: This had a positive impact on you? It spoke well of culture here?
WIERMAN: Yeah. First of all, everybody cared about what I did, and wanted to understand it, because if you don't care, you just ask superficial things. You tell people about the university, and you let them go. Everybody really cared to understand what I was doing, and everybody was trying to find connections to the tools they were using, and asking me interesting questions about the math that I had been using. It was a really positive experience.
ZIERLER: Whether as a selling point or just because that's how you saw it yourself, did you emphasize the relevance to industry of this work, or was that not something to talk about at that first meeting?
WIERMAN: I talked a lot about theory and practice in my job talk and in my meetings. It was not about industry specifically, although I mentioned industry taking it up. It was a lot about the back and forth between working with people who are doing system design and system implementation, and working on theory, and trying to make sure that the theory I proved leads to a new system design in six months, in a year, that being the research vision. That was really what I tried to convey in my spiel about myself at the time. That was what I do. That was how I approach research.
ZIERLER: Did you know you'd be coming back for a second visit when you left for the first?
WIERMAN: I got a really good feeling. My sense was that I had done a good job, and that they liked the style of research that I was doing. The Caltech offer process takes a long time. But I was getting very positive signals very quickly after my visit. I expected that, which was good because it was my first interview. It was nice to walk into the other ones feeling like this first one went very well.
ZIERLER: Now, was it only CS departments that you were considering, given the nature of the job market at that point? Given the relevance of what you were doing, were there other places where you could've been a faculty member?
WIERMAN: I applied to EE, OR, business school, and CS departments. I had interviews in each of those. When it came down to finally making a decision, I was deciding between a business school and Caltech as the places where I felt like I had the best fit.
ZIERLER: In that road not traveled, what would a business school professorship, what would that have looked like for you?
WIERMAN: It's very interesting. There's trade-offs in both directions. I'm very happy that I made the choice I made. I think it's been the right thing for me and my style. I made it for a good reason, and it was the right decision. But a business school model is a little different. In business schools, you don't have students supported by your grants in the same way that you do in engineering departments like CS. The students are supported by the teaching they do for MBAs. Students are basically, to a large extent, fully supported by the departments, and you, as a faculty member, can then cut off all of the grant-writing part if you want. You can still, of course, write grants to do things if you want, but it's a much, much smaller part of your time commitment. Then, because the students are free-form in that way, with their own funding, there's often a much looser match between students and faculty. The students work with faculty, but they don't necessarily have the one-to-one type relationship that is often the case elsewhere.
That's something that I like. Even the way I run my group now, I co-advise a lot of students with a lot of other faculty. I like that my students work with more than one person, and that my lab is not isolated in a little silo from everything else. Business schools have that culture. Like, the students have their own funding, so they're not tied to someone because that person is paying them out of a grant. You can work on lots of projects. You can work with somebody for a few years, and then decide that now you want to work with someone else, kind of thing. But then also, because of that, groups are much smaller because students have to be supported by the department, so the department decides how many students there are. You don't really decide how many students there are in your group, kind of thing. Then teaching-wise, you teach MBA classes, which is a different thing than teaching undergrads.
ZIERLER: Oh, yeah.
WIERMAN: Then, let's see, what else? Your industry collaboration is fairly similar. The publication process is a lot different. You publish primarily in journals rather than conferences like in the CS model, and so you have longer papers that are maybe deeper on the individual paper level, but fewer and farther between compared to the CS publication slam.
ZIERLER: On the second visit, was that more of the interpersonal focus to see if you'd be a right fit for the department, especially given how small it is?
WIERMAN: That's right. The second visit, the goal was to understand the culture of the department; how junior faculty were treated; what it would be like to live in Pasadena since I had never lived on the West Coast. My wife came along at the time for that one to obviously see what she could do, and look at job choices for her as well. But the big thing was, like, this is a small department. Would I really have enough collaboration here? Do the benefits of being in a small place in terms of space outweigh the fact that you wouldn't have as many people in your area around to bounce ideas off? Would there be enough to make that happen?
ZIERLER: What were some of the takeaways that you got in terms of that second visit, learning about the culture of CS at Caltech?
WIERMAN: For me, Mani and Steven, who I think I mentioned in the last group of mentors for me, like, they were the big things. Mani and Steven, it was clear that they would be good collaborators for me in terms of having space around me but also being there to co-advise, and give advice, and help me along the way. That was, I think, the biggest thing in terms of getting me over the fear of going to a much smaller place. It wouldn't be just me on an island. There would be a few people close enough that I would have that community among the faculty. Then, also, just talking to other junior faculty about the experience, and hearing how protected and supported junior faculty are in terms of funding and administrative load, and those sorts of things.
ZIERLER: What kind of takeaway did you get on the topic of collaboration where it's a small department, what you might need to do beyond Caltech? Was that encouraged? Was that expected? Was that just a natural outgrowth to simply how small the department was?
WIERMAN: The pattern that I saw was, basically, most people had a couple people locally, and then had a collaboration network that was broad.
ZIERLER: Locally, meaning UCLA, USC?
WIERMAN: No, locally at Caltech, and then a collaboration network with collaborators broadly across other universities in the US and beyond. That was good enough for me, so I wasn't worried about that. I just wanted to make sure there would be some local, enough local, and then I felt like I had plenty of collaborators that I could work with everywhere else as needed, as desired, and that that wouldn't be a problem.
ZIERLER: Given how important Steven and Mani were in your decision-making, what were they each working on at that point?
WIERMAN: Steven, at that point, was doing networking. He was finishing up with his FAST TCP work.
ZIERLER: He had just come back from his venture at that point? Was that the timing?
WIERMAN: I think he was still part-time for the first year, still partially gone and partially there. But it was clear that he was coming back, and he was ramping his research group back up, which was important for me. Then Mani was working on distributed systems generally, and this was just before he was starting to do the earthquake monitoring, the Community Seismic Network work. It was a lot about how do you manage peer-to-peer, data-oriented distributed systems.
ZIERLER: I wonder, between both of them, you got the immediate sense of just how broadly conceived a research agenda can be at Caltech.
WIERMAN: Exactly. It was very different than talking to the networking researchers at CMU. For them, at CMU, it was often like, "OK, in video streaming, there is this bottleneck, and we're going to fix that bottleneck." For Steven and Mani, it was really societal, like, "Here's a big societal problem. This is how my style of distributed systems work can make a difference." That was exciting.
ZIERLER: Was Steven starting to think about sustainability and EVs and that kind of stuff at that point?
WIERMAN: Not EVs, but he also, like me, was bridging into sustainability through the networks world. He was starting to think about how to make the internet more sustainable. Are there congestion control algorithms or changes you can make to routers to make the lines go on and off, and manage that sort of thing in a dynamic way? He was a like mind in that sense of basically viewing that as a promising direction but not having done a ton there.
ZIERLER: Adam, this question that we pursued previously about the narrative of undergraduates voting with their feet, from physics to where we are now with CS, when you joined the faculty, where was that transition at that point?
WIERMAN: It had not begun yet.
ZIERLER: Wow. Really?
WIERMAN: When I joined, there were—I'm going to get the number a little bit off—but on the order of 10 undergraduate CS majors.
ZIERLER: Wow. See, I would've thought, given just where tech was, you would think that it would be bigger than that. What's the takeaway? What do you make of that?
WIERMAN: There was not an undergraduate major in CS at Caltech until, was it, 2003? I was arriving in late 2007, so it had only been about four years. I think there had been some people moving into it, but it just wasn't a big major yet. If you were applying to universities, and you looked at Caltech, you saw just a few CS people. If you were interested in CS, you went to other places or you prioritized other places, I think. Pretty much every year for the first few years, it was doubling or going up by 50% or something, at least. It was major growth the entire time I've been here.
ZIERLER: But at the point where you made the decision to join the faculty, there was no indication that that trendline was going to come?
WIERMAN: No. When you're starting as—at least for me—when I was starting as a supervisor, I wasn't really thinking that much about undergrad life. When you're talking about schools, you're looking at the research environment. You're looking at the graduate program. I wasn't deciding which school to go to based on the undergrad program, I guess.
ZIERLER: The happy decision for you, you never got far enough along in other options where you thought maybe the job market is such that I'll have to consider industry? The timing worked out for you?
WIERMAN: Yeah, it worked out. I had interviews. I probably was a little bit naively confident that given the interviews I had, I would make something work. But I also figured that if it didn't work, I would just get a postdoc, and try again the next year, kind of thing.
ZIERLER: Was Steven's venture, was that appealing to you just to know that, at that point in Caltech's history, it was no longer so ivory tower where that was something that you could do if you wanted?
WIERMAN: It was definitely nice to know that a faculty member was doing that, and it was viewed as a plus by his colleagues, not as a negative.
ZIERLER: Set the stage for me. You join the faculty, and it's always that mix of what do you extrapolate from your graduate research? Where do you slot in in terms of what's happening right now? Then the unique Caltech question is, how do you maximize this nurturing environment where you're built up and supported for success on the way to tenure?
WIERMAN: That's right. I started right away. I showed up very unsure of [laugh] myself in terms of what I needed to do when. This is a [laugh] statement about Caltech's administrators sometimes. I showed up, and I walked to my office door the first day, and it didn't have my name on it. It had the name of a previous professor who had been denied tenure in the spring. [laugh]
ZIERLER: [laugh] Oh, no.
WIERMAN: [laugh] There was this like moment of, OK. [laugh] Who is this guy? Oh. [laugh]
ZIERLER: Maybe it was intentional, maybe it wasn't, leaving it up. [laugh]
WIERMAN: [laugh] I don't think it was intentional, but it was one of these oversights that was like, OK, this is not the warm and fuzzy feeling that I was hoping for when I walked into my office the first day. Then the thing that struck me, like, you're always a little nervous your first day, but I was showing up in August. At Carnegie Mellon, classes had started already, and so the place was buzzing, and it's giant. Of course, at Caltech, classes don't start until the end of September, and most faculty are gone in August. I showed up, and there were very few people around, very few students, so it was a huge contrast in scale and activity. That was a little nerve-wracking at first because, since I didn't do a postdoc, I was showing up not having graduate students, not having postdocs, and so it was just me in an office. I hung around with Steven and Mani, and went to their group meetings, and met some of their students. But it was another month and a half or so before the department felt busy after I arrived. Coming from a giant place, it was one of these, OK, this is small. [laugh]
ZIERLER: [laugh] I did want to ask—you mentioned not doing a postdoc—was that common at that point to leapfrog the postdoc, or were you a unique case?
WIERMAN: It's not like a huge amount. I'd say 60–70% of people were doing postdocs, so there were in fact some people who weren't. In my case, my wife and I wanted to move once, and she was graduating with her master's from Carnegie Mellon that year. If I had gone to do a postdoc for something, she would've had to have a gap year or sort of a strange movement. We decided we'd just come, and that way she would be able to get started on her career right away, without that gap. I always advise my students to do that postdoc year because it's a much different feeling when you show up already having your students and postdocs that first year because you can just hit the ground running. The first year without them, you're carrying the load a little bit, and so you have to really find people to work with in the environment rather than having your team come in with you on day one.
The first year, for me, was a lot of collaboration. Because I didn't have my own students or my own postdocs, it was a lot of continuing projects that I had worked on on my own or that I was working on with collaborators outside of Caltech while I got to know people in Steven's group, while I got to know people in Mani's group, and so on. I still have a strong affinity for postdocs because of the model that really helped me at Caltech in that first year, where we have these free-floating postdocs that come in. They have a mentor but, at the same time, they are getting a lot of departmental funding or funding from centers or whatever it is, and so they have the ability to work with anybody in the department. There were two postdocs that first year that I really bonded with. Basically, we were the same age. They had just graduated. We collaborated. We hung out. It really helped in terms of getting the research off the ground to have people like that around, free-floating, who were also looking for that kick-off in a new direction during their postdoc, just like I was, starting as a faculty member.
ZIERLER: Adam, on that point, starting up your own research agenda, how much of your thesis did you want to leave on the proverbial shelf? Even if there was more to do, what did you self-consciously want to set aside to go into new projects, and what did you want to continue because, obviously, there are unanswered questions? There's still work to do.
WIERMAN: I was looking to make a hard right. I was very conscious that there were papers I could write quickly that would be important to write, and that it would take a year, a year and a half, to write those and get them out. I had a plan for, I think, three or four papers that were thesis-driven that I would finish up and get out the door and get published. My view was that that was my off-ramp, and that let me still publish at a consistent rate as I was ramping up new things in other directions. I was spending most of my intellectual capital in new directions but enough intellectual capital to go down the off-ramp for the thesis work.
ZIERLER: Once that month and a half had elapsed, and the department was starting to come to life, what do you recall? What were some of the ideas that were animating the faculty at that point?
WIERMAN: I think I mentioned in our last call that Mani introduced me to John Ledyard very early on, who was on the economics faculty. Mani and Steven were both very interested in the interaction of economics with networking, and I was too. I started to spend a lot of time with the economists at the Rath on Friday evenings, and just hear what they were thinking about and the way they thought about problems, and talk to them about the way CS people were doing economics, and hear them rant about how it was the wrong way for reason X and reason Y [laugh], and how we didn't understand the right way to do economics, and bounce ideas back and forth, and all this sort of thing. That was really good, and Mani really pushed me to make that introduction to John, and make those connections, which was really helpful of him. Then one of the postdocs that I really worked a lot with was Jason Marden, who's now faculty at UC Santa Barbara; I adopted him as my postdoc. We spent a lot of time working together. He was at the intersection of control theory and economics, and I had never learned control theory before. I started to learn control theory from him, and we both learned economics together. Jason and John and I taught a course in the winter on the interaction of CS and econ, where I taught the CS view of economics, and then John came in and taught the economics view. [laugh] We went back and forth each week with a CS lecture and an econ lecture. People like Kim Border and Charlie Plott and Mani and Steven were in the class, along with the graduate students, and so I was lecturing to them about the recent papers coming out in this space. It was a really exciting way to learn a new area.
ZIERLER: Adam, if we can reverse-engineer where CS is now, and just the overall appreciation that around the Institute, it's computational everything. It's embedded all over the campus. Going back to when you first got your bearings as to how CS was connected or not within Caltech more generally, was it pretty much a fifty-fifty two-way street in terms of CS faculty seeking collaborations elsewhere, and other faculty coming to CS with specific needs? What was your sense of that dynamic?
WIERMAN: At the beginning, it was definitely outward. When I arrived, there was some connection with other departments, but it was just at the beginning of this, and Caltech was not yet a place where there was recognition elsewhere on campus of the need for CS. There were ties. There were examples of connections. Sometimes, those were CS-driven. There were exceptions. The quantum computing part was going strong at that point, and so there it was very much a two-way street. CS econ had started and, with me and Mani kicking it into gear, it really grew in activity at that point. Then molecular programming was another really strong two-way street at that point with Richard and John Doyle and Shuki and others. But the connections to chemistry and astronomy and physics more broadly than quantum weren't really there yet. There were a lot of CS faculty getting interested in these areas and trying to make the connections at that point. Then the balance has, each year, shifted more and more in the other direction where, at this point, there's just a huge external push for, "Can you help me with this? Can you teach my students X? I have these problems."
ZIERLER: The way you're describing it, it seems very counterintuitive circa 2022. You would think that it would be the astronomers and the economists and the biologists who had all of these computational needs to go to CS. But it's very fascinating. What then were the needs that CS faculty had where they were really outwardly driven in their research questions?
WIERMAN: It was people becoming excited about these +X areas. If you're working in machine learning, and you're seeing that the tools can be applied to nonconvex problems of particular types, then you're searching for applications, and you're reading papers, and you're finding that there are people doing this in area X. Then you're going to seek those people out to see if they're actually interested in the new algorithms and designs. It was a lot of that kind of computational outreach, like, "I've made this new advance. I think it should be useful for you. Will it actually be useful?" as opposed to the other way, where I think Caltech faculty are used to solving their own problems [laugh] in other areas, and there wasn't yet the recognition that there might be these people over in CS that can help solve your problems even better with these learning tools. Machine learning was not on the pedestal it is now at that point. It was three or four more years before machine learning started taking over in the way it has, getting towards what it is today.
ZIERLER: Adam, last question for today, and maybe it'll be a bit of a cliff-hanger for our next discussion.
WIERMAN: Oh, that went quick. We're already at the end now?
ZIERLER: Yeah, there you go. As you were seeing more senior CS faculty making those inroads into other departments, when you felt like it was right for you to start looking for those kinds of collaborations, what's the origin point where you're making the connections between network systems and sustainability? How did that come together specifically from the vantage point of you realizing all of the benefits that could come from going beyond CS but staying within Caltech?
WIERMAN: It came early on because economics—it was just very clear to me early on that if I was working with people like John Ledyard and Federico, then I had a huge advantage in making progress and understanding the right questions to ask compared to my CS peers at CMU, who were doing it on an island without ever talking to the economists about what they were doing. That first year, with Jason and John and Federico, we already jumped in. By the end of that year, we were writing papers together that involved economics and control and CS. That happened almost day one, to the point where, with the first students I was recruiting, I was talking about collaborations with economists as a reason to come to Caltech. On the sustainability side, it took a little longer. Steven and I, I think, worked on an island ourselves in terms of energy and networking and distributed systems for a year or two before we started to make broader connections around campus in sustainability. The Resnick Institute didn't exist yet; its initial formation was still a few years in the future. Even then, the initial formation was more focused on the technologies around solar or batteries than on system development. It took a little bit for that to connect more broadly across campus. But the econ connection was there almost immediately.
ZIERLER: Adam, for next time, we'll delve further into the econ connection. We'll see what happens next.
[End of Recording]
ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It's Monday, August 22nd, 2022. It's great to be back with Professor Adam Wierman. Adam, once again, great to be with you. Thank you so much.
WIERMAN: Thank you.
ZIERLER: Adam, today, we're going to pick up in continuing the story where you recognized early on in your tenure at Caltech that there was tremendous opportunity to take collaborations even beyond CS. I framed my previous question broadly, institute-wide, in asking where you saw the most fruitful areas of collaboration, and you immediately went to economics with your answer. Before we get to the human dimension, the people that you were meeting, and the ideas that you were considering, and what you found relevant, if we could zoom out a bit: what was it generally about economics that might have pulled you in that direction, even decoupled from the individuals that you were interested in working with?
WIERMAN: That's a great question. I think for me particularly, economics came into focus because in grad school, I was working on distributed systems, networking, these sorts of interacting systems. This is an area where, when you're taught it as an undergrad, you learn about protocols like TCP and UDP and IP, and all the alphabet soup of these networking protocols. They are these algorithms that, in a distributed way, figure out shortest paths, how to route packets, how to set things up in terms of the communication structure. But then when you learn a little bit more, you realize that they don't actually work in reality the way that you learn them in classes, and the reason is always strategic interactions between the entities involved. When you think of an ISP routing traffic, in some sense, it should be finding the shortest path. It should be routing things directly according to these protocols.
But, in reality, it ends up often doing things that are in shorthand referred to as hot-potato routing where you don't take the shortest path; you take the shortest path to get it out of your network because then you don't have to maintain it anymore, and it's a load on somebody else's network instead of yours. If you can get everything out of your network very quickly, then you don't have to build up the infrastructure in your network. You don't have to have that expense, so it's cheaper for you. These incentives of the agents, the companies, the ISPs involved in networking really meant that everything you learned in an undergrad networking course was really not the way [laugh] the network was working in practice, and people didn't have a good idea. They were often surprised by this. In reality, this would mean that you invested in the wrong parts of your network to improve them because you're trying to remove the bottleneck. But the bottleneck wasn't actually the bottleneck because of the strategic behavior, or it meant that you expected performance to be much better, and you ended up with some failure in some part of the network that shouldn't even have been used but was part of this hot-potato exchange. It just was a huge indication that all the theory, all the algorithms were designed with the wrong assumptions in mind in networking. To figure out the right ones, you needed to study economics.
ZIERLER: Adam, embedded in the question, the idea that the questions were not framed in the right way, was there a deeper history of collaboration between CS and economists that you were aware of, or was this really more of an innate realization on your part that economists needed to be brought in on these issues in a way that they weren't previously?
WIERMAN: There had been a growth, even at Caltech, which was one of the leaders in this area. Starting in about 2003–2005, there had been a lot of growth and investment at Caltech and, even before then, there had been a lot of realization that economics and CS, broadly speaking, could learn from each other. One of the, I guess, canonical examples of that is computational advertising, so ad auctions. Around the turn of the century was when ad auctions were coming to fruition. Search engines went from a situation where companies paid for a month to have their ad shown to everybody for every search, to keyword-targeted search ads that were starting to be allocated with auctions. Some of the folks at Caltech were involved in the start-ups that made this a reality. But this was a very particular form of interaction between economics and computer science, around auctions and making money with very fast auctions in a computational world. That wasn't necessarily the place where I wanted to be. That was one place where economics and CS had hit. Then another was around network science. This one was closer in spirit to me. This was understanding the structure of social networks. This is when the internet and the web were growing, and people had these large, complex networks, much more massive than they could study before, that somehow conveyed interaction of ideas or at least virtual interactions between people.
There was huge excitement, where now we can very easily measure and understand what these networks look like in terms of the structure and the connections. That was a really hot area around the turn of the century too, and one of the leaders in that area, Jon Kleinberg, did a sabbatical at CMU while I was there. I got to talking with him. That was one area where it was, I'd say, led by CS but still involved the social network side of economics a bit. There had been a bunch of these. But in my world, in networking and distributed systems, it was really just coming out. I wasn't the only one to recognize it. But there was definitely a feeling that we needed to rethink our networking protocols with an idea that strategic interactions were important to how operation would actually happen.
ZIERLER: Adam, what were the big points of collaboration in terms of both efficiency and human behavior? In other words, what were the kinds of perspectives that economics or economists would bring in both of those areas as they related to the kinds of questions that you were framing, and seeing where there needed to be additional expertise?
WIERMAN: In mechanism design, auctions, there's this idea of incentive compatibility, which means that you have to design your marketplace in a way that the users are incentivized to tell you their actual beliefs, their actual values for the goods that are being exchanged. If you don't have incentive compatibility, it's much harder to design an efficient system because you don't really know what people are valuing in terms of the goods they're buying or, in the networking sense, in terms of the paths in the network that they're actually trying to send traffic on. In economic lingo, the problem with the networking protocols was that they weren't incentive compatible, and so people kind of lied about where they were trying to send traffic in order to get the traffic out of their network as soon as possible. Then it would correct after the fact to its true destination. This idea of how to design allocation rules, auction designs, prices that were incentive compatible was really what the networking community needed, and it hadn't been thinking about that at all. You have to respect incentives and, if you're designing a system that's not consistent with people's incentives, they're just going to manipulate it. That's what was happening with the routing protocols.
ZIERLER: A binary we've discussed previously in terms of research motivations, there's always the fundamental science, and then there's the ideas about possible applications. Where were you on that spectrum when you first realized the value of starting to collaborate with economists at Caltech?
WIERMAN: This was my shift towards sustainability, and a big issue in networking. This was a warm-up because it wasn't sustainability focused, but it was still the idea of how do you get the incentives correct for the entities in the network that are making these decisions about where to send their jobs, how to schedule their jobs, how to route them through the network. The incentive incompatibility gets even larger when you think about carbon as the [laugh] metric you care about. Once you're thinking about carbon as part of the objective, the incentives become way harder to align, and you just have to take them on head-first in terms of designing pricing structures and contract structures in addition to designing the engineering, the resource allocation, and the scheduling. That was the big picture. The markets need to be redesigned as well as the systems if you're going to be successful in this direction.
ZIERLER: What would be the envisioned products, the outcome of this research? Who would it be good for? What's the timeline in terms of seeing it out there in the world?
WIERMAN: There are many steps along the way. Looking back now, it's easy to say, oh, this was the big picture the whole time. Little steps along the way led to the big picture. But there was always the idea that a data center could, first of all, run on massive amounts of renewable energy that were local, and be adaptive to the amount of renewable energy. In the case of the partner we had at the time, HP, which does a lot of video rendering for Hollywood-style movies, the jobs don't have to be finished at 12:01 p.m., one minute after you submit them. These are jobs that are going to take days and days and days to run, and you can be flexible about when you run them and when you burst them and when you put them to sleep during their lifetime. That gives you a lot of flexibility to adapt to the carbon mix of your generation footprint at any given point as a data center. We wanted to be able to do that. Knowing that you're never going to get fully renewable on site with a data center, you want to make sure that they can be value-added for the grid. Earlier on, we talked about this in terms of virtual storage. They should be a virtual storage facility for the grid, where if the grid is running hot and needs to shed some load, the data center can help it to do that by deferring for an hour some of these batch loads that don't have time constraints, requiring less generation from the grid, and making operation easier, playing that role for the grid. It's that piece where the market design really was crucial in the vision. You needed to be able to design a way of pricing the flexibility that the data center can provide so that it's encouraged to be a good servant of the grid operation instead of just locally trying to maximize its performance for its jobs.
ZIERLER: Adam, I want to get to the clients and the systems that you're specifically thinking of out in the real world. But just so that we have our names clear on the collaborations, this is John Ledyard and Federico—help me on the pronunciation—Echenique?
ZIERLER: Echenique, and Jason Marden, who was a postdoc at the time?
WIERMAN: That's right. Those were the first three on the economic side.
ZIERLER: Let me make sure. Are the three of them working together, and you jumped in on that, or are you more one-to-one with each of them?
WIERMAN: More one-to-one with each of them. We were all part of the center at Caltech at the time, SISL, as it was named then, the Social and Information Sciences Laboratory. We were all part of a group that would meet together every Friday for seminars. Mani Chandy should also be in that group. He was also very active in SISL at the time. But then with each one, I had different collaborations or different research problems. Jason and I were mainly working on something a little different than what I've described but still in the same line, which is how do you quantify how much inefficiency comes when the incentives are misaligned? How bad can things get if there's incentive misalignment? That was like how much can you control if you do it right versus if you do it wrong?
ZIERLER: I'm glad that you mentioned that Mani was part of this because you emphasized previously, from Mani and Steven Low, both of them were really crucial to helping you get your bearings at Caltech. Is that to say that Mani had already established inroads with economists by the time you joined the faculty and were thinking about these things?
WIERMAN: Yeah, definitely. Mani already had connections with John especially. I'm not 100% sure, but I think they had worked together even in the early 2000s around the Enron-style crisis as well. They had a history of just knowing each other at Caltech. I don't know how much they'd done research together, but they were certainly very well aware of each other.
ZIERLER: Because all of these individuals are so significant, let's just go one-by-one. What was John Ledyard working on that convinced you that there was fruitful opportunity there for you?
WIERMAN: John does mechanism design, so auction design. John was really fun to interact with around that space because I learned a lot of important concepts in auction design. I always characterize him as a near-Nobel winner, in that the Nobel Prize in mechanism design went to a lot of things that he contributed to; he just wasn't named. For him, we taught together, and we were educating each other. He wanted to learn about the CS, and I wanted to learn about the economics, and so we co-taught a class where he lectured on traditional econ game theory, and I lectured on the stuff that computer scientists were calling algorithmic game theory. It was this interesting interaction where he would give a lecture, we would learn a lot, I would then give the CS view, and the economics faculty in the audience would all be saying, "This is what economists did in the '60s." "No, we did this one in the '80s."
WIERMAN: "No, we did this [laugh], or this is the wrong question because…" [laugh] It was one of these things where it was good that it wasn't my work because I was presenting a lot of things that the field had done, and the economists were really enjoying picking it apart, but also pointing out nice things and different things about the way that computer scientists were approaching it too. Whenever there are two fields interacting, there's always an education process before you can contribute in ways that both recognize as valuable. There was a lot of learning of language and learning of what questions were interesting to each community in that class.
ZIERLER: I'll just point out I love how you emphasize that John wanted to learn CS, and you wanted to learn economics, and so the natural result was, "Let's teach a class on it because there's no better way of learning than teaching it."
WIERMAN: Yeah, exactly. It was really valuable and, by the end of it, we had a group of grad students and postdocs and faculty that knew a lot about both areas. For John in particular and me, most of our collaboration was around energy markets. I had a few students that were really interested in that area, and John worked with them very closely, worked with us very closely, and really helped us understand what sort of classical econometrics results he would think of applying, and how they would be adapted. On our end, it was a lot about why those things couldn't apply directly to a smart grid because of the physical constraints of the system and the constraints of the regulations and the markets. Then that led us both to the interesting research: how do you get around those constraints while still keeping as much as possible of the ideas from the econ literature?
ZIERLER: Where have you seen those ideas, those collaborations with John get adopted out there in the world?
WIERMAN: Market design is interesting. You can't just implement it and expect it to work. A lot of it comes through in ideas behind legislative policy. We actually worked closely with the California Board of Governors a few times, and had some input into some of the regulations that came out around deployment of electricity or energy storage, and things like that. We also worked closely with Southern California Edison, and some of the ideas ended up in some of their test market designs for demand response for commercial entities, and were evaluated on that front. Then with HP, we did some market designs around data center demand response specifically that were tested out in some utilities in Colorado. Those things then take on a life of their own. But with market designs, it's not like an algorithm that a company just takes and implements. It's how does this idea then percolate through the regulations that come out, and shape what the market designs look like in the future? Then the other place that we really tried to have a big impact on—and I think did—was around the recognition of exploitation of market power in electricity markets, especially by emerging aggregators. In the modern world, you have these home solar aggregators that might have 100,000 solar arrays, many with storage, on people's homes.
Each individual location is tiny, and can't impact market prices. But if the whole of them works together, they can coordinatedly impact market prices by creating congestion on particular lines. Prices are geographically located, and so they can drive up prices in particular regions that they can then exploit. This is a very different style of market power exploitation than electricity markets usually used to look at, which is just to look at which generators are very large, and then monitor them very closely. You can't monitor an aggregator of 100,000 rooftop solar arrays closely. Small changes in particular regions, we showed, can drive up prices dramatically, and then lead to a nice profit for the company that does that. We were giving that example to anyone who would listen, which then played a role in how the market structures for aggregators ended up in terms of the legislation.
ZIERLER: Just chronologically, what were the most intensive years when you were collaborating with John?
WIERMAN: I don't know.
ZIERLER: Was it just like with you, a short burst?
WIERMAN: Probably the early 2010s. It was probably the 2009 to 2012 period, so a little ways after I got there, because it took a while to get to know the language and to build up those connections, and then we really hit the ground running as the smart grid work really started to kick off with Mani and Steven and John. The four of us were the lead core group there.
ZIERLER: Let's move on to Federico now. What was his research when you connected?
WIERMAN: Federico is a theorist's theorist, and I really loved working with him. But the work there was pure theory. Federico, in general, looks at matching markets, which are things like kidney exchange or ride-sharing, where you have entities on two sides, and the goal of the marketplace is to put them together, and match the right person on one side with the right person on the other side. But a lot of his work is theoretically motivated and structural. For him, many of our collaborations were motivated from that class that John and I taught and, in particular, afterwards, we had a bunch of visitors come in. A very hot topic on the computer science side was the computational complexity of markets. How hard is it to find an equilibrium, for example? Is this something that is a computationally easy task that we can do quickly with an algorithm, or something that would be computationally hard—NP-complete or something like that, in the CS lingo? There were a bunch of very celebrated results on the CS side arguing that computing equilibria was computationally hard.
ZIERLER: Which means what in this context? What makes it hard?
WIERMAN: What makes it hard is that even if I gave you a large market and a supercomputer, you wouldn't be able to compute the equilibrium prices of it, or there would be some hard example of such a market whose prices you couldn't compute. The CS people were using these results as a bit of a hammer to criticize well-established econ equilibrium concepts. For example, if I can't write a computer program to compute what an equilibrium price is for a market, how can you expect the market to arrive at an equilibrium price? It can't do something that a computer can't. [laugh] Is that a critique of these markets? These critiques never sat very well with the economists, and it was very hard for computer scientists to understand why because, in their mind, "Look, we showed you. Here's a complex market, but it's a market, and it's hard to compute, and I can scale it up, so there is this worst-case example." This led to a lot of conversations over the Friday beers at the Ath with Federico and John. Whenever some computer scientist would come and give a talk about that, we'd always debrief. The economists would say, "Oh, I don't like this style of result. It's just worst case, so it doesn't actually apply," things like this. Finally, Federico and I figured out how to convey the economist response to this kind of result in a way that explained why economists weren't bothered by it, and why it wasn't actually a critique of these equilibrium concepts. The idea—it's a clever idea; I still like it a lot after all these years [laugh]—is to say that the view of an economist is that these are models of the real world, not the real world. As models, we choose the model we use.
ZIERLER: Use for what?
WIERMAN: Use for a given situation. If you're using a model to try to analyze why prices in the grain market are reacting the way they are, you're fitting that model to data, and you're then using that model to make a prediction. In that sense, worst case doesn't always mean much, because you're choosing the model. The result we were able to prove is that, basically, given any data set, there is a model that fits it perfectly and in which it is computationally easy to find an actual equilibrium. Even if there are hard market models out there, the data can never force you to use one. If you're observing data about a real system, there's always a model of that system in which it's easy to compute an equilibrium, and so you don't need to use these hard models to predict what's going on. In some sense, the fact that there is a worst-case example is irrelevant to using these models in practice as predictive tools. This was a counterpoint to the way computer scientists talked about these computational complexity results. It really had an impact. We got some attention among the CS bigwigs who were looking at computational complexity, saying, like, oh, this is a really interesting perspective, and now there's a line of work in that direction that followed up on that style of analysis.
ZIERLER: Obviously, there are clear applications of matching-market theory but, as you called him, Federico's a theorist's theorist. What were some of the pleasures in not being focused on applications, and just drilling down into the theory of it? What was useful for you in that regard?
WIERMAN: It was really great because I think a lot of my development—what I know about economics—came from John and Federico. With John, it was the co-teaching. With Federico, it was that any time in a meeting where I didn't understand some analysis or the reason why economists use some definition, he could just stop for a second, and give me a 10-minute lecture on it at the whiteboard. I understood it after that. The collaboration there was extremely fruitful, both in terms of the papers we wrote and in terms of just learning from each other around these areas. I think we both walked away from it feeling much more confident in the other person's field [laugh] than when we started out, which is an ideal output, in education but also in papers. Then we did do some more applied work together. One of the projects in recent years that was outreach-oriented was applying ideas from matching-market theory to the problem of school lotteries, which is a major application area for applied people working in that space. My kids go to Pasadena public schools. As soon as I went in and participated in the lottery for kindergarten for my oldest one, I saw the way it worked, and it was not following best practices in terms of design, and it was manipulable and inefficient in various ways. Federico and I basically got together and reached out to the district, gave them some ideas, and then, over the next nine months, worked with them on their data, evaluated those ideas, made a proposal, took it to the school board and city council, and got it approved. Now, our ideas are running the school choice lotteries in Pasadena for every student at all levels.
ZIERLER: Oh, wow. That's a local example. Where do we see some of the output of this research more broadly in industry and in society?
WIERMAN: More broadly in industry and society, I haven't pushed that as much. But Federico—I'm just speaking for him—Federico and Laura Doval, who was here, also worked on kidney exchange, so matching markets in general, and had some impact there as well. But for me, the big push was just the school choice problem. The ideas there are now, hopefully, spreading to other districts. There's the local provider that Pasadena used that had to implement all our ideas. Now, they're selling their product to as many other districts as they can. [laugh] I don't know the names, but there's a bunch of other districts now that are using the same sort of algorithm and structure that Pasadena adopted because, now, it's the default for this company.
ZIERLER: Then, finally, Jason Marden, where does he slot in in all of this?
WIERMAN: Jason was a postdoc, so he was here for the first two years I was here. He's now faculty at UC Santa Barbara, still working on game-theoretic interactions with control. Since I was fresh out of grad school, and he also was, we were both starting at the same time. Even though it was faculty and postdoc, we were just peers working on problems together. Like I said, it was more related to the initial motivation of incentive issues in networks leading to inefficiency. We were looking at problems like that: how much inefficiency do you give up because of incentive structures? Do you give up enormous amounts? Do you give up a little? Are there ways, through tolls and through additional structures like that, where you can put pricing on particular links that will limit the inefficiency that comes from the misaligned incentives of the agents in networks?
ZIERLER: Adam, just a generational question. With John Ledyard being a more senior faculty member, and Jason being a postdoc, what view did that give you in terms of just like some of the more traditional ways of approaching problems versus somebody fresh out of a dissertation thinking about these things?
WIERMAN: There's a huge difference. John was a very senior faculty member at that point. It was great to see someone who had been around for as long as he had, and seen as much as he had, still be excited about new areas emerging, and wanting to jump in. At that stage in a career, it's very easy to just say, "Yeah, I know what I'm doing. I'm an expert in this. I'm going to keep doing it. I'm going to push forward." John is just not that sort of researcher. He was jumping head-on into computer science and algorithms [laugh] and these sorts of tools at that late stage of his career. There are few people that have that kind of gutsiness, so it was a good signal that Caltech faculty are willing to [laugh] go out of their comfort zone at any stage of their career, which I think was exciting. Jason was the opposite. He was fresh. It was just, here are exciting problems. What do we need to learn? Let's go at it. That's the same stage I was at, where the world is your oyster. There's nothing stopping you from jumping into some new path or some new direction.
ZIERLER: Have you maintained collaborations with Jason over the years?
WIERMAN: Yeah. We haven't written papers in a little while, but we keep in touch all the time. He was just out here a couple weeks ago, and we were chatting about work. He's my first advisee that became a faculty member, so he always will have that special check-in.
ZIERLER: Adam, a general question with funding. We talked in our first discussion just the broader narrative of, nowadays, it's computation everything. CS is part of the equation with everything. Since this is early on in the game, both in terms of a field and your career, in the collaborations with the economists, did that provide a new window into funding agencies and opportunities or, just more generally, the kinds of people who would be supportive of this work that might not be so obvious if you were more siloed within CS?
WIERMAN: It was an interesting time, actually. Even at that stage, funding for CS was growing, prior to enrollments growing, I think, at Caltech. NSF-style funding was on its way up in CS. In modern economics, economists often don't have the grant-chasing model because their students are often funded through TA-ships and things like that. In doing this stuff, it was interesting to see that it was a new approach for the economists to go after NSF grants. [laugh] When I was saying, "Let's apply for a grant on this," they were like, "Oh, I don't do that but, sure, I'll see if I can help out." Even though there were senior people—Federico and John—much more senior than me, it was me leading the grant-writing process. Because of that, there wasn't a natural grant-funding side on the econ end for this work, so all the funding came from the CS side.
One of the lucky things, or timely things, was that NSF kicked off a new program during the first three years I was a faculty member that was focused specifically on growing CS and economics interactions, and so we hit that each year. It was a nice bonus in terms of having funding opportunities open to me because of being in that area. That definitely magnified our ability to move quickly in that direction; the incentives were aligned, and incentivized us to move in that direction. Then I was very shocked when it went away, and it became actually very hard to get funding in that direction, because computer scientists, for the most part, didn't have much econ background, and econ didn't have a great funding pool of its own. When that kick-off program wound down, it became much more tricky to find the right sweet spot between the areas in CS where people would appreciate—actually, "appreciate" is the wrong word—understand the motivation for the econ problems, and understand what interesting questions were in econ versus what a CS question would be that maybe wouldn't be interesting to the econ side.
ZIERLER: Adam, a very specific question based only on a hunch. I have no idea if you have visibility on this. You'll let me know. After the 2008 crisis, there was discussion at Caltech about maybe having a business school because that would be great for revenue.
WIERMAN: Funding, yeah.
ZIERLER: We would do it the Caltech way; all of these things. My question is, in working with the economists, I can imagine discussions where, if Caltech does a business school, it's going to be a Caltech business school and, by definition, that would mean very unique approaches that might include a CS perspective like you're not getting elsewhere. Were you involved in those discussions? Were you aware of how that all played out?
WIERMAN: Only a little; I was too junior to be seriously involved. Almost like you described, there were a few conversations like that—like, if we did it, it would be an unusual one, a theoretical one, cross-disciplinary, involving CS and playing to SISL's strengths—but never more than that. That was not something I wanted to be engaged with [laugh] as a new junior faculty member, so I was happy to let the senior people talk about that, and just hear what I heard.
ZIERLER: The bigger question there, it's always the perennial debate at Caltech, like, do we stay small? Do we get bigger? How do we get bigger? Is the business school idea, does it crop up? Is it dead and buried? Where is that nowadays?
WIERMAN: I haven't heard anything serious about it in a long time. I think the part that is alive as a potential idea is growing finance, and understanding what a Caltech finance program would look like—finance and entrepreneurship. But I think those are the nearer-term ones. If the Linde Institute grows finance in a way where we have a strong finance wing, and we have a little entrepreneurship wing, now you have legs that maybe you could build a business school around, and you could have a discussion like that seriously; but, right now, you couldn't. We wouldn't have a curricular structure for it that would be tied into research, so it would be something different. I can't see it being [laugh] something that would come out in the near term. Maybe I'm wrong.
ZIERLER: As not so junior of a faculty member anymore, whether you have the bandwidth for it or not, is that a compelling idea? Is that something that you'd be interested in pursuing from a CS perspective?
WIERMAN: I think the part of that world that is compelling to me intellectually for Caltech is operations research. Operations research is a big part of business schools, and that's the piece of the brand that I think would make sense at Caltech. If you had enough of some of the other things to make it well-rounded, but then you built your strength around OR, that would be compelling at Caltech, because then you're bringing in optimization, you're bringing in stochastics, you're bringing in data science and statistics, these sorts of tools. I think you could build something around that that would be strong and that would benefit the rest of Caltech, and not just be its own separate island, and you'd basically need to combine those foundational tools with some entrepreneurship and finance to have a program. I do think that could be something that, say, HSS could build some strength around, but it would have to be a commitment, and size is always a challenge because you need faculty to commit to having meaningful engagement there. But on the research side, OR fits in very well with the brand of CMS, and the goal of connecting the computational side of the department with the applied math side of the department. That overlap is basically operations research—data science operations research.
ZIERLER: Beyond the obvious benefits that this would confer within the institute for that future alumni pitch, here's why we should have a program—
—what are the obvious needs out in the world for which a Caltech focus on OR would really be impactful?
WIERMAN: For the OR piece, I think the argument is very much the same as why you need data science and machine learning. If you're trying to build a modern-day scientist, they need their statistical tools, they need machine learning tools, but they also need to know how to make decisions using those tools, and that's often what we miss in the current Caltech programs—that kind of business-oriented decision-making on top of the statistical foundations. We can teach you all the statistical tools, all the machine learning tools. We can teach you why they work. You can know how to prove them correct and design new ones. But the piece of then making that operational in an entrepreneurship, business, or finance environment, rather than in a technological development environment, is not something that Caltech has focused on educationally.
ZIERLER: In identifying this area that Caltech is not doing, is the response there more we should be doing that, or is the response more if that's what you're looking for then you should just go to Stanford?
WIERMAN: It depends on who you talk to. I think, increasingly, it's shifting towards "we need to do that," because if you want these tools to have an impact on society, yes, there's one path which is straight through research. But even for research, so many Caltech faculty and grad students go and take their work, their research ideas, out into the world in a company or towards an impact on regulation or things like this. To have some training around that decision-making process, I think, would be very beneficial for Caltech's impact—the impact of ideas coming out of Caltech—beyond academia.
ZIERLER: Just walk me through some Caltech administrative culture.
WIERMAN: [laugh] Oh, no.
ZIERLER: If this is something that you would want to pursue, how do you do that? Do you talk to the division chair? Do you bring in outside funding? How would you get that off the ground if you wanted to?
WIERMAN: This is a curious one. [laugh] This is not the conversation I was expecting, but let me think. We've done this in CMS. I like to create new programs, and so we have created PhD programs. We've created undergrad majors and minors and things like this. The difference for a master's-oriented program at Caltech, let's say philosophically, is that if you're in a master's program, an MBA or a master's, your next step is 99% not a research step. That's just different philosophically than the next step for nearly all of Caltech's programs today. For nearly all of our programs today, there's a reasonably large percentage of the people that do them that go on to a research path. Things are optimized for that approach. Having a program that, by its definition, has a next step that is not that is different. I think that's the biggest struggle here. Could the administration—division chairs, provost, etc.—concretely say, "Yes, we want to train specifically towards that career path," rather than training towards a research path and then saying, "You know, if you go out into a company with your idea, that's great, and you're going to be smart enough to figure it out"? But the structure would be: get faculty buy-in, get people to sign off, and then take it up the chain, and see. Be ready to argue for it, argue the pluses and minuses, and give the case for how it would benefit Caltech. This would be a tougher one. Like I said, it would be a long-term thing. But an OR master's program or an OR-oriented academic program is not so different from the CMS PhD program we created. In a different school, that might've been called an operations research PhD program, because it does involve optimization, stochastics, and such from the applied math side, as well as data science and machine learning from the CS side.
ZIERLER: Have you supervised students who've expressed an interest in like a terminal master's kind of—
—let me learn OR, and I want to go out into industry? What's your response to that kind of interest?
WIERMAN: We are thinking about one—not a business-oriented one, but we're thinking about a couple of terminal master's programs in CS. Faculty bandwidth has been one of the biggest challenges to making them happen.
ZIERLER: Not necessarily that they're not interested; they just haven't had the time to think about it deeply, you're saying?
WIERMAN: Yeah, and to get it through. Actually, not quite that. It's more, do we have enough faculty to operate it on top of what we're already doing? The one that I think has the most legs is a terminal research master's program for Caltech undergraduates, and I hope that we can make this happen, where the Caltech undergraduates who want to spend an extra year doing research at Caltech can get a master's, and then can take that research either towards a start-up or towards a PhD program at the end of that fifth year. That's a big win for everybody involved. Caltech faculty get researchers that they've begun to work with, and they get extra time with them. The undergrads get that experience before they go to a PhD program or out to a start-up, and take their ideas with them. Everybody's in favor of it. It's just a question of framing it in a way where, with our current faculty size and our current ratio of faculty to students, we're not creating even more load for our classes and our faculty time in a way that hurts other educational programs that we're running.
ZIERLER: Looking at peer institutions, have these trendlines already been developed elsewhere? Would Caltech be playing catch-up to some degree?
WIERMAN: Yeah, the big schools have many master's and different undergrad programs because they're big, and they can do that. We certainly wouldn't be leading the way with a 4+1 master's program. There are many of them around. But our students often do more research in their four-year undergrad program than other schools' students do in their 4+1 master's programs. I don't think we're behind educationally; it's just that we don't have a program specifically called that. It would be interesting. But I think the OR branding or b-school branding is definitely something that is potentially beneficial but a hard path to fruition at the Caltech institutional level.
ZIERLER: Adam, I want to pick up on a very interesting thing you said about your early collaborations with Steven Low, when Caltech was just starting to think about sustainability and the Resnick Institute, and things like that. You mentioned that the emphasis institutionally was on hardware before systems, meaning that the kinds of things that you were working on with Steven were at the periphery, not the core, of sustainability research circa 2010–2011. Let me just ask generally, why would that be the case? Why would it be hardware before systems? It seems to me a little counterintuitive.
It's the systems that are the foundation, and the hardware follows from them, so I wonder if you could just explain those distinctions—why it would happen that way.
WIERMAN: I actually struggled with this at the time, because the way it came up was often in students applying for fellowships or funding internally, and the evaluation process tending to come back with comments that clearly placed less of an emphasis on putting things together as a whole than on the pieces that make up the parts. It's a very natural thing. It's easy, I think, for people to understand: if I can make a battery that's 100 times more efficient, then I can store solar for days, and it makes everything better; or if I can build PV that is 10 times more efficient, then everything gets a lot better immediately. I think it's that one-sentence statement of the goal that is often very easy and compelling. Then even if you're working on a very esoteric theoretical piece of the chemical reaction that may one day lead to a battery, you can talk about that very concrete, easy-to-understand piece of impact that your research is building towards. Whereas on the system level, it's necessarily complicated and convoluted to understand why what you're doing is going to have an impact or is needed or whatever. Often, the benefit or the impact or how you combine things depends on the devices. If someone believes that they're going to be able to make the next battery, maybe they think that this piece of the systems work is less important, or whatever. It becomes just a little bit more entangled to explain it. Even though everybody, I think, thought it was necessary at the time, it somehow never rose to the same level of priority as doing the whole thing. Part of that was on us, not being able to provide that simple line of "if you can do X, then this will happen." Often, that line is a negative one for the system, like, if you don't figure out how to [laugh] integrate solar efficiently, plugging it in will just lead to more blackouts.
There's this negative piece that somehow doesn't have the same positive resonance [laugh] as building the next-generation battery. The data center work, I think, often had a little bit more resonance than the grid system level work, because you could just say, look, if we don't do something for data centers, then these hundreds of buildings are going to be 10% of the electricity emissions in the US, and if we can change it, then we can keep them at 1%. That became a very, like, easy positive message for people outside of CS to understand, even if they didn't understand how we could do it or why data centers use so much energy.
ZIERLER: Before we get to how you and Steven slotted into what was already happening institutionally with sustainability, just on the one-to-one basis, what was he working on that was so obviously interesting for you at that point?
WIERMAN: When I joined, you mean?
When I joined, Steven was just coming back from a start-up—his FastSoft start-up, which was TCP-protocol-based stuff. He was looking for the next thing, and he was dabbling in energy. I was expecting him to be a hard-core networking person when he got back, and he said, "No, I'm not going to do that anymore."
"I'm going to do energy. I just need to figure out what in energy I'm going to do." [laugh] It was an interesting year, year and a half, where, yes, he was still churning out some research in networking, because he still had some students interested in that area, and there's always that backlog. But everything that he was reading, everything that we were talking about in group meetings, was to try and figure out where you could have impact with our style of work in the sustainability space. I had started working on data center stuff, and he had dabbled a little bit in that with me, and he also dabbled in the extension of his start-up to energy—so not just looking at the endpoints like the data centers but the actual routers and devices in the network. Could you have TCP-like protocols that made them more energy efficient? But then, at the end of the day, he really hit on the smart grid and the optimization of protocols—protocols, they don't call them that [laugh]—in the smart grid as a really impactful place to go.
At the time, I think, he, me, Mani—most of the people in the field—were a bit naïve, because there was this story that, for the internet, we went through this transition where we packetized everything, and had this architectural separation, and this gave us all this power to innovate. We'll be able to do the same thing in power. That optimism of "we'll just be able to bring the IP architecture over to power" lasted maybe two or three years, until people realized, no, the complications of Kirchhoff's laws and the complexity and non-convexity of the system, and all these things, make it so that you can't have the same architectural separation in the power system that you had in the internet. I think, at that point, there were a lot of networking people who retreated back to networking because they realized their same tools wouldn't apply. But Steven and I [laugh] and many others stayed and persisted. I think he found some big hits in terms of how to do optimization of these protocols—really these systems, these distributed control policies—using some of the same tools, even though the architectural separation and the architectural design were very different.
ZIERLER: Just to clarify the timing, you were already thinking about data centers before the collaborations with Steven started?
WIERMAN: Before, I guess, somewhere in there, yeah, probably a little bit before. It's hard to remember precisely. But that was my baby, and Steven joined in and helped and thought about it. He was thinking about energy and networking and distributed systems. But the data center piece specifically was something that I drove from a very early time in the group.
ZIERLER: Was that at all related, your interest in data centers, with what the economists were doing, or was that really your own thing?
WIERMAN: That was my own thing. There was some overlap. With the economists, it was still the same theme: distributed agents having their own control, and the inefficiencies that come from that. In the data center world, each server is operating in a distributed way, making its own scheduling decisions, making its own resource allocation decisions: whether to go to sleep, whether to save power, or whether to run at full load. Again, there were the same sorts of questions around how to design the policies in a way that they were incentive compatible for the individual servers. There was collaboration on that front with them, and there was collaboration with Steven on the policy, scheduling, and networking side. But I was somehow at the center of all the data center discussions.
ZIERLER: Because this was coming from you, do you have a clear memory of why data centers? Did you read something in the Times about energy consumption, or what was it?
WIERMAN: Yeah. [laugh] Data centers—it was the studies—I think I mentioned this in one of our chats earlier—these studies that, I think it was NSF, but some government agency, put out with forecasts. Before those studies became public, I was aware enough of them coming out, and of the people involved in them, that it was very clear what they were going to say and what the growth trends were going to look like. Given my interest in scheduling and resource allocation from my PhD, and doing that in these distributed system and cloud environments, this just felt like a perfect fit for me, where my expertise in scheduling and resource allocation could be put towards a big societal problem that needed to be solved quickly to prevent this growth from just compounding year over year.
ZIERLER: I think, as you explained previously, researchers and the NSF recognized the problem, these energy trendlines. But the industry—the Microsofts, the Amazons—initially, they were not proactive about this.
WIERMAN: They were not proactive at all, initially. It took quite a while for them to respond, and they weren't even measuring enough to really understand how bad it was. This is where, actually, a Greenpeace report served as a good slap on the wrist for companies, and woke them up a little bit, at least to do superficial things so that they didn't get an F on the Greenpeace energy efficiency scorecard [laugh] that they put out yearly. That was one way to get industry involved. Thinking back to the initial conferences where all this stuff was presented, the typical feedback you would get would be huge interest from academics, and then huge complaints from industry folks basically saying, "This isn't realistic. You're never going to be able to do this, or the costs are going to be too high, or it's going to impact reliability." Issue after issue was raised as to why this was not something that would be feasible or viable. It took a few years for the industry response to become something like, "Maybe we could make this work, or maybe it would work but be more expensive to operate." It took a while to get to that point.
ZIERLER: Obviously, there's no regulatory framework? There's no pressure coming from government about, like, cars in California, we have to reduce emissions by X amount? There was none of that happening?
WIERMAN: None of that, no.
ZIERLER: Did you get involved at all because of the obvious efficacy of regulation as a way to get industry to where you want them to go? Did you become involved at all in that from a technical perspective, saying, I think we can get here, and there need to be mandates in order to make sure there's compliance?
WIERMAN: Not in a huge way in data centers, specifically, but definitely in the discussions. I was definitely in the room for a number of these discussions. I think it's the same thing that's going on in fairness and privacy right now in the algorithmic world: one of the roles of the cutting-edge researchers in that space was to just prove that this was feasible. You had to demonstrate the technology, that this could be done, before there's any door open for regulation. It took a good four years to get to that point where you could really viably say this is feasible algorithmically. This is not going to destroy the systems involved. You can demonstrate this in large-scale settings. It was a long haul just to get to that point.
ZIERLER: The sequencing at Caltech institutionally, where it was hardware before systems, how did you and Steven put that to your advantage in terms of building on what was already there, and demonstrating that you had things to share that would be quite useful?
WIERMAN: That's a good question. Maybe the best example of that was Steven's EV-charging test bed, which I had some hand in, but he was the leader, and he really carried the ball on all that. This lets you test battery management, and lets you plug into grid systems, and do dynamic adaptation of response, while also demonstrating the sophistication of the EV system itself. That deployment really was a way of bringing in some of the hardware that already existed from the development that happened on campus, and integrating it into a system that was adaptively serving the whole campus. It was, I think, a valuable research test bed in its own right but also a good sort of message to campus of why this stuff was important and the role it could play.
ZIERLER: Just to fast-forward to the present, where is that balance now in terms of hardware and systems? Is it about where you would want it to be?
WIERMAN: I always like more systems. [laugh]
WIERMAN: I think the systems, people always underestimate how important they are. The negative message is not as upbeat as we might like. But the best battery technology in the world is not going to help us if we don't have a system that can dynamically adapt loads and demands to use that battery. We see it over and over again. The Texas examples of the last few years have been really amazing in terms of market failures, system failures, despite huge investment in very sophisticated storage and wind and solar technologies throughout the grid. There are some shining examples where some of these power walls really serve well as distributed energy resources, and protect the grid. But, in general, the system failures have been widespread in ways that they don't need to be because of underinvestment in the system architecture, and our understanding there. I think that stuff is going to continue, and become more and more dramatic unless we can get ahead on the system side. [laugh] I remember being asked a question on a panel around what is the smart grid going to look like if everything is successful? The realistic answer is, not that different. Maybe you'll have some smart light switches going on and off, but the human experience isn't that different. It's the system experience that will just be more reliable, more efficient, more green, all of these things. Unless you are bought into the importance of green energy and the efficiency of getting to a 100% renewable grid, it's not as exciting to see that number creep up as it is to have a more tangible, salient outcome. We need, like, the smart grid product that [laugh] makes it really obvious to the consumer what they're getting from all these investments.
ZIERLER: To return to this overview discussion about the growth among undergraduates in CS, and interest in computer science, this amazing fact that you shared with me in our last discussion that it was really a tiny program for undergraduates when you joined. To understand its massive growth in the last 15-some-odd years, from your specific vantage point of branching out into economics, and your perspective from CS going into sustainability, how did both of those areas contribute to what I assume must've been obvious among undergraduates about just the possibilities of CS as an undergraduate major? How would you root your particular interest with this overall trendline and growth of the major?
WIERMAN: The growth has been so robust that it's not that any one area can be tagged to particular sorts of growth, except in recent years maybe machine learning, being a huge, overwhelming motivation for a lot of the undergraduates. But I think, in general, it's more the breadth and wealth of applications that's appealing. One of the biggest changes we made in the undergrad program early on was to get really nice pizza courses where, freshman year, when the students join, they see 10 faculty talk about 10 different research areas, often overlapping. We tell them the story of CS+X, and the impact that has. They're taking that pizza course combined with CS1, where each project is focused on a different area, whether it be biology or chemistry or astronomy or graphics within CS. They're seeing 20 different applications of CS to wildly different fields by the time they've done their first term at Caltech.
That's just in huge, stark contrast to the introductory experience in other majors. To just see the breadth, I think, is one of the most compelling things. It's not that everybody wants to jump in and do sustainability as their application. But they know they can have impact on sustainability, they can have impact on astronomy, they can have an impact on all of these and, wherever they go, this tool set is going to help them with that. I think that's the message that many undergrads find compelling: whatever I do, whatever application area or research area I want to focus on, I'm going to need this stuff. This stuff will help me, and give me a competitive advantage, and so let me do it. Then I'll add on a double major in some other area if I get excited about some other area specifically later on. But I know I want to do CS.
ZIERLER: Administratively, how did CS deal with these numbers? Was it gradual? Was there an avalanche one year of interest?
WIERMAN: It's been gradual. It's been gradual. It's been 10%, 20% a year every year, except for a few years. There've been a few times where we say, "Oh, it flattened this year. Maybe we've hit our peak," and then the next year—
WIERMAN: It's been medium and steady; not slow and steady. A lot of the growth happened while I was Department Chair, or Option Rep before Department Chair. We did spend a lot of time thinking about how to respond within the constraints that Caltech has for you, and it became a multifaceted thing. Whereas the answer many other schools had was hire in proportion to undergrad growth, and just keep X% of your faculty in CS, that's not the answer at Caltech, because if you wanted to hire in proportion to undergrad growth, you'd have 50% of your faculty in CS, and that doesn't make for an intellectually diverse research institution, and you can't get there in any reasonable time. It's been moderated growth in CS in the core. We have our target of growth there, which is to get to around 25 FTEs. Then we had targeted growth of CS+X faculty. This happened through the Bren chairs program, where we hired six or seven faculty in a very short period of time, actually, senior faculty just post-tenure, that were all CS plus something else. These are people like Aaron Ames and Soon-Jo Chung and Lior Pachter, and some very influential senior names now on campus that helped make those bridges.
This also served, on the undergraduate side, to support and provide a variety of new classes that focused on impact outside of CS, but also, on the faculty level, to create bridges where people are in the CS and CMS curriculum and faculty meetings but are also going and communicating with biology or astronomy, or wherever their other half is, to help build those connections, both educationally and research-wise. There were those two focuses on faculty growth. There was also the realization that we just will have to have teaching faculty to a small extent in CS to help cover that. We pushed for the creation of the role of teaching faculty at Caltech, and Ravi finally was able to be successful in making that happen. Now we have official teaching faculty within CMS, and I think we have nearly all of them on campus, but a couple of other departments also have some teaching faculty. That gave us the faculty-level components. Then we also started to create the data science undergrad program with the idea that there's a lot of people doing CS specifically but what they really want is to use data in the service of something else. If we can make a data science program, that'll actually ease the load on CS a little bit because there will be this other avenue, and data science faculty are many across the institute; there are a lot of people that would be happy to advise somebody in data science who might not feel as comfortable advising them in core CS areas.
The data science thing was intellectually very motivated and appealing for students but also served as an escape valve to have some students do that instead of CS, and opened up a new door to a new set of faculty that can support that program educationally. All those things were on the educational front. But then the other piece, on the research front, was supporting the intellectual connections that come out, given that we have a small core in CS, and there the approach was fundraising and postdocs. Fundraising provides seed funds so that these faculty could spend less time getting grants, and more time on these collaborations. Then, also, on the postdoc side, Jason Marden is an example that we talked about today, but postdocs often serve as really nice bridges between faculty. We treat them as faculty without the [laugh] requirements of teaching and administrative load. They have more time, and they have that skill set. We have about 20 or 25 at a time in the department, and they always bridge faculty, and create connections and collaborations that weren't there before. They also take SURFs, and advise grad students, and all these sorts of things, teach classes occasionally, so they are very good magnifiers of the faculty effort and faculty time in terms of education and research, both.
ZIERLER: Adam, last question for today, going back to our first discussion talking about the way Caltech really supports its junior faculty. There's an impetus to success; that it's really success for all involved. Contrasting that with your new office that you just got from your predecessor who did not get tenure, when it came time for—
ZIERLER: —your tenure talk, what were your feelings, and what did you want to emphasize in terms of making the case of the significance of your research?
WIERMAN: We don't really have tenure talks in CMS in the same way that some places do. But, at the same time, there's some idea that you go out, and you do a tenure tour, and you give talks at a bunch of places where you haven't seen people in a little while so people are aware of your work. Maybe naïvely, but I think tenure was never a big stressor for me. I always had that feeling, like, I'm doing good work. Caltech, I think, can recognize it. If they can't, I'll get a job at another place that will, and it'll be fine. But I did go out and give my tenure tour. For me, it was really fun to go out and talk about things at a broader level rather than give any narrow per-paper talk: here was the vision, here is how the theory and the practice intertwine, we were able to get things deployed, and look at where we are today. My research had hit at the right times where, at that point, I could talk about both the mathematical algorithmic work and the deployment with HP, and could point to real empirical results in industry. It was a very nice [laugh] time to be talking about the whole full life cycle of a research direction in the data center work. At that point, I didn't talk much about my economics work or the smart grid work because those were a little bit more in their infancy than the data center work, which had really hit. But I felt like that was enough to hang my hat on, so I didn't need to worry too much. For me, the biggest question around tenure was that I had been asked to go up early, and I was not sure whether that was really a good idea. I did go up early and, in retrospect, I wish I hadn't. [laugh]
ZIERLER: Why not? What's the difference?
WIERMAN: [laugh] The moment of getting tenure, especially. It was a time at Caltech where we had just gotten rid of the associate professor step for most people, so it was straight from assistant to full, and it meant that, all of a sudden, I was asked to do all these tasks—
WIERMAN: —in my research community that only full professors were asked to do [laugh], and I was no longer eligible for any of these junior faculty awards or grants that were easier to get. There was just this rude awakening of extra duties and harder funding overnight that I could've put off for a year or two if I wanted.
ZIERLER: [laugh] That's great. It's a mirage of a full professorship that you see in retrospect.
WIERMAN: Exactly. [laugh]
ZIERLER: Adam, that's great. Next time, we'll pick up post-2012. We'll take the story right up to the present.
[End of Recording]
ZIERLER: This is David Zierler, Director of the Caltech Heritage Project. It's Wednesday, November 30th, 2022. It is great to be back with Professor Adam Wierman. Adam, once again, thank you so much for joining me. Great to be with you.
WIERMAN: My pleasure.
ZIERLER: Adam, today we're going to pick up on the other side of tenure when you become full professor in 2012. I want to go back to one thing you said that was interesting. It deserves sussing out a little bit. You mentioned how in your tenure talks and at that point, you really weren't emphasizing so much the sustainability aspects of your research, and there could be two ways of looking at that. One is for you personally, you didn't feel like your research was well developed enough to talk about it. On the other side, institutionally for Caltech, for CS, maybe the field or the university wasn't really ready to connect these things in the way that you were doing. I wonder if you could explain a little more what the strategy was at that point in terms of what you wanted to talk about.
WIERMAN: It's been a little while since we chatted, so let me get myself back in the vein here. But I think there were a couple things going on. One was on the company side, which we talked about a bit: the messaging for companies. They were much more interested in hearing about how you could save them cost rather than how you could reduce their carbon footprint. Even though they were interested in sustainability, cost was the key metric. That was one piece of that aspect. Another piece was that, I think, in CS as a field, sustainability was a new thing. The idea of somebody doing power systems as a computer scientist was still not [laugh] normal, and not what people talk about. You want to speak to people where they are, and so in going out and talking about the work, you want to talk about data center design, systems, all these sorts of things. There's an important aspect of sustainability that you're trying to introduce people to. But the contributions, especially when you're talking about a tenure talk, the contributions are algorithmic. The contributions have to be understood as CS contributions in that way. Then within Caltech, I think, there wasn't a big, huge presence of sustainability yet. The expertise in terms of how to evaluate work in that area just was not there as much as it is now, for example.
ZIERLER: Adam, on the industry side, what's the chronology? As you made the distinction, people weren't really thinking about reduced carbon footprints. They were talking about reduced cost. When does that happen when companies that are operating these enormous systems start thinking about the carbon costs as well as the financial costs? When does that happen?
WIERMAN: Realistically, very recently. [laugh] I think there was a lot of change with the Greenpeace series of yearly reports that really focused on the carbon footprint of companies. For the whole decade of the 2010s, the way companies responded to that was just to buy solar power or buy carbon credits or things like this. In terms of operations, especially of data centers, the focus was on reducing cost, and maybe cost included how many carbon credits you had to buy. But it was not in terms of reducing the carbon footprint of your actual operations as a motivator. It's really only been in the last four or five years that the carbon itself of the operations was something that was a first priority. It was still a second- or third-level priority during that decade. But, often, the way it was viewed was if you can reduce carbon and reduce operating costs or reduce our power bill, then this is a great thing, and we'll see it in our bottom line, as opposed to the first-order, let's reduce carbon because it's good for the environment and for the world.
ZIERLER: Generally, is the correlation as simple as that, saving carbon is money savings, at the end of the day?
WIERMAN: At the end of the day, that's the case we were making back in the early 2010s: if you can do this, your electricity bill, which is skyrocketing, can be curbed, and even more so by providing services to the grid in terms of demand response and flexibility, where you'll be able to have a revenue stream from these market programs that are emerging that you're getting $0 from now. In combination, you'll be doing something good in terms of your carbon footprint, but you'll also be reducing costs and generating new revenue streams by participating in these demand response programs.
ZIERLER: Adam, in previous conversations, we've covered well how the kinds of ideas that you were developing do have a strong history at Caltech with some of your colleagues. What about more generally in the field, in CS beyond Caltech? Were other people thinking about these things? How much of an innovator was Caltech institutionally?
WIERMAN: At Caltech, Steven and I were some of the first people to be going after these energy-efficient, sustainable data centers. I think there was a push that was a little bit prior to us that was on the building efficiency of data centers, just reducing how much is lost to inefficiency in cooling and things like this. But a sustainability-first, carbon-first design of these data centers was something that we really brought focus on, and brought a lot of people into the field with. There were a number of workshops that we organized at major conferences, and a number of programs that we organized that brought people of like minds together, and kicked off how to think about algorithms for carbon-centric computing.
ZIERLER: Were there companies in particular, or even corporate leaders, who understood the message who were ahead of the curve, and you could use them to get the message out more broadly?
WIERMAN: Yeah, that's why we ended up working with HP. I think I said a little bit in our last call about the story of my students going off to companies, and trying to take some of these ideas out, and running into situations where, at many companies, they didn't even instrument their power yet, and so they had no idea how much money they could save or how much money they were spending on these sorts of inefficiencies. But HP had been doing that monitoring, and one of the leaders at the time, Chandrakant Patel, was a really good partner there. He really saw the value in this stuff, and thought about it himself in creative ways. He was a good partner for actually deploying some of the test beds of these things.
ZIERLER: I wonder if you could walk me through what exactly that looked like at HP. He gets the message. You had the algorithms. What does that look like?
WIERMAN: We had him come to Caltech a couple times. My student went out there and, really, there was a team of about a dozen people that he worked with. Over a span of about a year and a half, we worked with them on modeling. The hardest part of taking the algorithms to HP was actually modeling what they cared about in terms of the objective function for the scheduler within the data center. What cost did they care about modeling? What cost did they want to minimize within the scheduling rather than some other? How did they want to put a cost on performance delays, lags, switching costs in the data center, reliability of the data center, all these sorts of things? Figuring out how to turn that into an objective function that we could schedule and optimize for took quite a lot of back and forth and conversations with their team, and Chandrakant was involved. Then once we had that, we could take our algorithms, and actually deploy and implement them. There was a test bed data center in the Bay Area that they were able to run. They branded it the first net-zero data center. It wasn't really net-zero. It was close. But we ran it for a couple weeks, using these algorithms, and it had a big solar array on the top. Our algorithms reduced the draw from the grid by about, if I remember correctly, 80–85% over that period of time. That was a huge success.
ZIERLER: In terms of upfront costs, both on the hardware side and the software side, what does the investment look like from the corporate perspective? They understand long-term the savings are going to be there. But what does it take to commit upfront?
WIERMAN: On their side, the initial commitment was just to test the algorithms in a data center that was not user-facing. It didn't require extra hardware or extra software. It was just the team time and the development for the algorithm that was deployed. But their long-term vision, and what they did, was then turn the software into a product that they called the EcoPOD. The EcoPOD was a plug-in data center rack that they could sell, and you could just buy a bunch of them, plug them into your data center, and they would adapt to these external signals about energy price and energy availability. These were sold on the market for a while. They won some Computerworld Laureate Award for HP that we were a part of. Then they were bought by major tech companies. A lot of Apple data centers at the time were running these EcoPODs to try to manage and add flexibility to their compute loads in a way that they could be responsive. Now, they may or may not have been responding to carbon. They probably weren't. But they were responding to prices, availability, these sorts of things in a flexible way, more flexible than they could be without these sorts of designs.
ZIERLER: Adam, getting the story out, from HP being a pioneer in this regard, is the message basically that any company that has a massive data center, this is relevant, these algorithms work, or is it a specific kind of data center, the kind of space that HP is operating in that limits it to some degree?
WIERMAN: That's a great question. The key differentiator for when it makes sense versus when it doesn't is how much flexibility you have in your workload, and this is flexibility to delay it over time. What HP had was a lot of video-rendering tasks; they're going to take days. Delaying them a little bit, or making them run a little heavier some parts of the day and lighter other parts of the day but still finishing them within a week, is fine. When you have workloads like that that are delay-tolerant—that's the phrase—then you can take advantage of that flexibility to adapt to carbon awareness or solar awareness, wind awareness, just energy prices in general. That's the key thing. If you think of a company like Google, you're not going to do this for serving YouTube videos or serving Gmail or things like this, but you can do it for your backend archival work, updating your database, updating your search algorithms. All the experimentation you do to do that can be done in a much more flexible way with some delay-tolerance and adaptability built in. It really is—
ZIERLER: In the way that—
WIERMAN: I'm sorry, go ahead.
ZIERLER: I was going to say, in the way that flexibility is a limiting factor, have the algorithms evolved so that they themselves are more flexible so that they're relevant in a wider range of uses?
WIERMAN: That's a great question. The algorithms themselves don't take much compute, and so how flexible they are isn't as important. But I think the key thing is to understand how much gain you have from doing this as a company means understanding the flexibility of your workloads, the ability to either delay them in time, temporally, or move them spatially across regions. If you think of Facebook and companies like that that have mirrors in lots of different locations, do you have mirrors that can serve the same information or are your mirrors only mirroring the local information? If you can really serve from different locations, now you can adapt how you serve that content to the energy availability, to the energy mix of the locations.
ZIERLER: Just a very ballpark figure, once the system is up and running and we have this near net-zero facility, what is it saving HP at the end of the day? What are the figures that you can tout?
WIERMAN: I'm not going to remember them off the top of my head for back then.
ZIERLER: But they're significant enough that it makes sense?
WIERMAN: Let me try to remember. Ping me if you need the exact number for this for writing something down.
WIERMAN: But a Google quarterly electricity bill is in the hundreds of millions of dollars. If you can cut even a small fraction of that out, you're making a huge amount in terms of bottom-line impact on just the operation costs of data centers, purely that. Then if you're talking about carbon footprint, of course, these things have enormous carbon footprints that you're pulling out, and so the win for carbon is even higher than the win for the budgetary impact.
ZIERLER: It's more obvious, the amounts of carbon you're saving than the amount of money you're saving?
WIERMAN: Yeah. But even the money you're saving is a huge amount of dollars and cents. [laugh]
ZIERLER: Given the obvious economic value that you're bringing, what is your relationship in partnering with these companies? Do they bring you on as a consultant? Are you doing this through a start-up? Are you just a Caltech professor looking to help the world? What is the affiliation that you have in these relations?
WIERMAN: In the 2010s, I was just a Caltech professor looking for tenure and impact for my work. Our students went. We consulted. We wrote papers with them. But there was not a financial agreement other than gifts for research to Caltech. These days, I'm more likely to be in advisory roles for companies that are adapting these sorts of ideas. I am playing that role with a couple of companies right now in terms of helping them think about how they do their carbon awareness in their data center design.
ZIERLER: Is there a startup or an entrepreneurial pursuit that would make sense for you, or that's really not the model?
WIERMAN: In this space, I think there is, first of all, but it's massively capital-intensive, at least the way I would envision it. I haven't gone that route yet, because to demonstrate this, you need a lot of capital to demonstrate on a few test data centers, and that's a hundreds-of-millions-of-dollars demo before you get going on any major contracts. This is not your typical model for a start-up spin-out from a faculty member, where you can start with low capital, and get it off the ground, and build up. You need to already be putting massive amounts of capital and infrastructure behind it to get going. It's, I think, something where partnering with companies and having them lead the way is the most natural path to market for these ideas.
ZIERLER: Whether it's your own company or in partnership with a company, just a broader question. For a biologist who does a start-up in biotechnology or something like that, there's a very clear distinction between fundamental research and applications, where you're specifically looking to bring a medicine to market. How would that work for you in CS in networking? What would be an intellectual distinction or not between basic science, fundamental research, and those translational applications to industry? What would that look like, in your mind?
WIERMAN: For my type of work, it's really going from the whiteboard to a big system. When you're designing algorithms on a whiteboard, improving optimality guarantees, and things like this, you're abstracting away a lot of the details that these systems need to figure out. That's the separation between the academic ideas and the reality. When you're going and you're building a massive data center, there are so many things that need to fit together. The algorithms that we're designing in the lab are just one piece of that. They're an important piece, but everything else needs to work well and be integrated too.
ZIERLER: I'll ask an even more philosophical question, pushing on the metaphor. For the biologist doing the fundamental research, looking at ribosomes under the microscope, and learning something new, when you're at the whiteboard, are you learning something new about nature or is it about logic and the human mind?
WIERMAN: I think of it more like a mathematician understanding something new about abstract structures. But, really, it's computer science. It's understanding something new about what we can do with a computer. When you're proving something about an algorithm, I often really enjoy thinking about lower bounds in this space. What a lower bound means for a carbon-aware algorithm is: what is it possible to do? What is the fundamental trade-off you have to make between performance and sustainability and safety, for example? Is there some fundamental limit where, if you're going to ensure something is carbon-neutral, this is the best performance you can possibly hope for from any system, no matter how creative you are in designing it? If you add on a safety guarantee, does that limit the amount of sustainability you can achieve, the carbon footprint? What are the fundamental limits, and how close can we get to those fundamental limits with algorithms? In some sense, it's nature, but it's the nature of computing. What are the fundamental limits of computing? For a long time, when you thought of computing, you thought of it just in terms of what can be computed, and how fast can you compute it? This is Turing completeness and NP-hardness. In the time since the early 2010s, there's now a new piece of that, which is how sustainably can you compute it? What can you compute in a carbon-neutral way, or what are the sustainability limits? What are the energy limits of a computation? Is there a fundamental trade-off between energy and time for computation, for example?
ZIERLER: Adam, back to the narrative of 2012, on the other side of tenure, as you said last time, you really didn't sweat the decision. You were doing good work. It wasn't really a thing that was stressful for you. Still, on the other side of tenure, are there now opportunities to be more adventurous in your research, unbounded from presenting your studies, and making sure that you're accepted by your fellow faculty member? What does that look like for you on the other side?
WIERMAN: I'm sure there are. It's hard. I look back. I don't think I was that constrained. But, at the same time, I certainly explore without worrying about such things now, which is nice. Maybe that's because I have tenure. [laugh] I've always liked moving around between fields, and I'm still moving around between fields quite a bit. It's great to be able to follow my interests, and follow where I think the important problems are.
ZIERLER: Are there topics that you took on circa 2012–2013 that you might link specifically to not having those conc…not concerns but not feeling bounded at all?
WIERMAN: It's hard to link it to those concerns. But I had already been thinking about economics, so I wouldn't say that. In the last few years, I've jumped heavily into learning and control, and that interface. I was never a control theorist, I was never a machine learning theorist, so it seems crazy to be doing learning and control. Maybe that falls into that category. But, at the same time, it doesn't feel like I'm leaping in a way that is crazy. It just feels like there are really important problems there, and I want to go solve them with my students. I just have that more micro attitude, rather than [laugh] thinking like I'm doing something crazy, I guess.
ZIERLER: On that point of working with your students, something we've referred to, although I'm not sure by name, RSRG, the Rigorous Systems Research Group, did that start for you at the beginning of your faculty appointment at Caltech, or that was later on?
WIERMAN: That was very early on. Actually, Mani was someone who, when I arrived, really helped me view it in the way that we formed it, and he came up with the name.
ZIERLER: I know it's research. We don't say RSRG. [laugh]
WIERMAN: Yeah, research. [laugh] We say research. When we came up with the name, it was really an idea of how do we get Mani and Steven and me and my students all to feel like one community at Caltech, and to feel like the students can work with any of us instead of being tied to only one of us. It really worked in that respect. We have a number of students that ended up with co-advisors, or even all three of us as advisors. One of our students, who's now at UIC, had I think me, Steven, Mani, and Babak as four advisors when he graduated. I think having a community like that is so great for the students, and it really helps you as a faculty member to not get stuck in your ways. You get to hear about what they're learning from other faculty. You learn from other faculty through your students. You see how the students are being advised across the different faculty, and the students develop from that, and then you develop from that too in how you advise them, because you see what they react to. It's been, both in terms of mentoring and research output, really valuable to have that group of us together.
ZIERLER: You mentioned that there was a student who was quite important with the relationship with HP. I wonder if an additional benefit of not having your own start-up is you might not need that firewall between making sure you keep your Caltech graduate students separate from the employees at the company. By not having a startup, do you have more of a free hand to involve your students in these company partnerships?
WIERMAN: I've historically tried, and am still trying, to make sure that students can work with multiple companies through their careers in different ways. We have, I think, four companies right now that we're working with, and different students are working with multiple of the companies in various ways now. I think that's really healthy for an academic. Increasingly, it's hard to do that because more and more companies are trying to hire faculty for their one-day-a-week time, the big companies like Amazon or NVIDIA or Microsoft, getting that one day a week from the faculty member. Then the faculty member has to wall things off between that company and others. Students have to decide whether they want to work with that company or not. It has all these COI implications for how groups run, and what collaborations can happen within a group across different companies. I've tried to really resist that model in my group. We work with lots of companies in many different ways, which I think is hard if you either go and, as a faculty member, take that one-day-a-week role at a company, or do a start-up, and then have that firewall that gets created between the start-up and the lab.
ZIERLER: A general question, as it relates to all of your graduate students and the things that they've gone onto, the rough proportion who go into industry versus academia, has that been stable over time, or has that shifted, particularly as industry came alive to the way that your group would be able to help them?
WIERMAN: It's been pretty stable. I think I run a group that's a very academic-oriented group, even though we work with companies, and most students end up wanting to go to an academic position or a national lab type position. I'm trying to just verify. I don't think it's changed too much over the years. I'd say it changes more based on the type of work the student's doing. Some of my students who have done more the system-building type of work, they'll be more likely to go to industry, whereas the students that are doing the algorithmic work or the market design work are more likely to go to academia. Then I've had a few students do start-ups out of their grad or postdoc period. When they graduate, they go that way with something related to the research but not as many as some groups; just a couple.
ZIERLER: On the undergraduate side, a running thread from our conversations, something I'm very interested in is just the wellspring of interest among undergraduates in CS, and when this really started to take off. Of course, you were executive officer from 2015 to 2020. What unique vantage point did that give you, first of all, administratively in how Caltech, which is small—I remember this crazy comparison you shared with me about the size of CS at Carnegie Mellon versus the size of Caltech entirely. As executive officer, just administratively to start, how did you deal with that? How did you deal with the way undergraduates were voting with their feet?
WIERMAN: As I think back on it, one of the biggest things it did is let me see the future of Caltech in a way that no one else on campus did. There was a faculty board talk I gave about the coming wave of CS way back sometime in the mid-2010s. At the time, very few people really, I think, understood what was coming, and what I was trying to say, and all of the issues that I raised there.
ZIERLER: Adam, what was the wave? What did you see that early on?
WIERMAN: What you see that early on is that there's just massive growth. At that point, I could forecast that CS was going to be over half of Caltech in just a handful of years, and that all large classes at Caltech were going to be either in the core or in CS, CMS, and that we needed to respond to this in some way in terms of how we organized our core curriculum, how other options organized majors and minors to appeal to computational students, how we thought about admissions, how we thought about faculty hiring, how we thought about all of those things. It was very hard to get people to listen because they didn't quite see it yet, and so that was a big challenge in that situation. Also, it was a question of how to allocate resources. You have a small, tiny department with a small, tiny administrative staff managing all of these things. How do you get resources for admin staff, for TAs, for teaching, for, I guess at the time, instructors? We needed to create teaching faculty, which was a hard thing for Caltech to understand.
All of these structural things were experienced by basically only 15 of us, and the rest of campus didn't see it yet because it wasn't as if students were matriculating to CS from one major. It was a small drip from every major. It takes a while before those other majors get small enough that people realize that something's fundamentally different. I think we're only now at the point where everybody suddenly realizes, "Wait, my major is now really small, and this other one is huge. What's going to happen?" It was a very big challenge. It meant as EO there were a ton of discussions about why something had changed, and why we needed X, and it was very hard to get that across [laugh], like how having this massive undergrad population meant we needed more support for this, more support for that, and more support for the other thing, because it's just not the way Caltech worked to give support to such things. Resources were just allocated the way they were.
ZIERLER: Adam, I'm sure you've heard the stories, of course, in the '50s and '60s, almost every Caltech undergraduate wanted to be a physics student. They wanted to be the next Richard Feynman. The physics department, one of their responses to this was a weeding-out process, where they're not necessarily going to get disproportionately bigger. They're just going to enjoy the crème de la crème among the undergraduate students. Was that an option? Is that no longer Caltech's culture? Is there a weeding-out process so that it doesn't eat up the entire undergraduate student body? What's your perspective on these things?
WIERMAN: I think, thankfully, we understand why that's a terrible idea now.
ZIERLER: If you could indulge me, just why? Why is that a terrible idea?
WIERMAN: For many reasons, but one of the biggest, most obvious reasons is there are huge DEI implications with this process, and you end up basically filtering for people who come into Caltech already exposed to an area, and having resources to already be ahead in that area, rather than giving strong students a chance to get excited about areas. Actually, if you look broadly at computer science when the field was first growing, it was gender-balanced, and then it hit these caps at universities, and there started to be a weed-out process. By the time that weed-out process had been implemented for a couple of years, it was 80:20 male to female. This weed-out process and the way it was done is at least correlated with, and I think there are a lot of arguments to say is the cause of, the gender imbalance in CS that we've still been fighting against for the last 20–30 years since then. I think an approach like that is just antithetical to what we want at Caltech. Also, Caltech is built on the idea that you come in, and you can get excited about your major, and you can major in it. We should be a place that allows that, and so our philosophy in CS has really been to do that. Now, that's coming into some challenges these days.
There is some discussion about how to ensure some level of option diversity in admissions, and do some filtering to make sure that things like that happen. I'm hoping that we do only a very minimal level of that, because the educational mission of Caltech is to train the best and brightest in the areas that they're passionate about, and we should be doing that still. If they're passionate about CS, there are other things we can do to get them to explore other areas. My hope is that we can go to a model where, in my ideal solution, every option at Caltech has a minor, and the CS major requires you to do a minor in some other area, so that even if we have a large fraction of students majoring in CS, they're also taking a significant educational curriculum in some other scientific area at Caltech, and we're still getting these diverse perspectives in the students at Caltech.
ZIERLER: It's a very important point you made about the DEI implications. I wonder if you can talk about that as a two-way street. In other words, Caltech owes it to its students to be inclusive, particularly in areas like CS, which, generationally, it has not been particularly. But I wonder how CS might appreciate the value of underrepresented voices, people coming from different perspectives, different cultural, religious orientation backgrounds, how that multiplicity of perspectives is actually good for CS itself.
WIERMAN: It's huge. This was one of my big pushes as Executive Officer. When I started, I believe we were somewhere around 20% female in the CS option, and about 2% or 3% URM in the CS option. Over the five years, we got up to 45% female, and up to about 22–23% URM. This was a huge push in terms of how we rethought the design of our undergraduate programs, how we redesigned our intro courses, how we added courses that could expose people to the different ways CS could impact different areas of science, different areas of life. There were huge changes that we made in terms of how we even chose TAs to give diverse role models in our introductory classes, and just a long array of stuff. It mattered a ton. It meant that the voices when we had meetings in our classes, the projects that we saw come out of our classes, the outcomes in terms of students choosing companies when they go on from Caltech, the type of research projects that the undergraduates did, the type of courses that they chose and, as a result, the courses that faculty taught and were encouraged to teach and had the opportunity to teach, really got a lot more diverse, and changed a lot as our undergraduate population changed. It's been a huge, huge blessing for the department.
ZIERLER: Another narrative that we've touched on is there is CS all over the institute now. There's a computer science application in physics, chemistry, biology. This wellspring of interest in CS, how much of it from the undergraduates was they're purely interested in CS, and that's what they want to pursue, and how much of it is, "I'm here. I want to get some skills and tools, and then bring that back to what I came to Caltech for," whether it's mitochondria or it's stars?
WIERMAN: There's definitely both. I think compared to other CS programs, we have a much, much, much higher percentage of double majors in other areas, CS plus something else. This is a function of Caltech's core, and the fact that the students are coming in broadly interested in STEM, and can find ways of applying it. There's definitely a lot of the CS-plus-X interest in the undergrads as well. But I think there is a dominant core that really likes CS for CS, and wants to pursue CS research, either in industry or grad school, in what I would say is the core of algorithms or distributed systems or AI. Machine learning these days is the big dominant one. It's a mix. I think the last time I checked, there was about a third of the undergrads double-majoring in some area outside of CS, and so a third, two-thirds in terms of CS plus X versus core as their main driver. But even within the core, quantum, for example, is a core area, and is very interdisciplinary as well.
ZIERLER: Obviously, Caltech cannot hire all of the faculty to respond to this interest. It needs to maintain some level of balance. I understand one way to square that circle is to hire instructors, adjunct professors, and things like that. How do you ensure the quality of a Caltech education when CS has to rely to a much greater degree than other options on non-tenured faculty teaching these courses?
WIERMAN: In our experience, the teaching faculty teach better courses. [laugh]
ZIERLER: That's the answer.
WIERMAN: That's a little bit joking, but it's true. Konsta Zuev and Adam Blank are the highest-rated teachers on campus, and they're teaching faculty. I think part of it is to not have adjuncts and instructors but to have a teaching faculty track that is valued as faculty on campus. This was a big push, one of the last things that Robbie and I did before we both moved on out of our administrative roles. I think it's crucial for Caltech that the role of teaching faculty becomes increasingly peer to tenure track faculty, even if it doesn't involve tenure. They're full faculty. They have all the roles. In our department, the teaching faculty are treated as voting faculty, and can be involved in hiring, be involved in every aspect of the decisions we make. I think that's very valuable. That's one thing. You're only going to get the best if you treat them like first-class citizens on campus.
WIERMAN: If you're treating them like first-class citizens, and their only role is teaching, they're going to be great teachers. I think that's one thing. But the difference between a Harvey Mudd and a Caltech, where Harvey Mudd really hires people who are amazing teachers, is that when you come to Caltech, you also expect high-quality, Nobel Prize-level research that you can engage in as an undergraduate. That is not going to come from the teaching faculty. That's going to come from the tenure track faculty, and so you really need to make sure that the upper-level classes are being taught by tenured faculty who have a top-tier research community, and are giving that research interaction to the undergraduates. I think that's the balance we've struck. A lot of our intro courses are taught by teaching faculty, but then the courses that get you involved in research are all tenure track faculty. We're still too small but, hopefully, we'll get to grow. Hopefully, we can be more than we are in terms of size, while still not taking over the whole faculty population. [laugh] There's a long way in between those two extremes. [laugh]
ZIERLER: I'm not sure what the chronology was or where you were during the process, but I understand there was some talk—I don't know how developed it got—actually of breaking off information sciences into the seventh division of Caltech. Were you involved in those discussions, or do you have a sense of how that got started?
WIERMAN: That was before my time. I actually think my personal assessment would be it was a huge mistake that it didn't happen. I think we have suffered dramatically because that vote went the way it did within engineering.
ZIERLER: Because the idea there is that if information sciences is within EAS, it's artificially constrained? Is that the basic problem?
WIERMAN: Yeah, not just artificially. It's constrained at the faculty level by the competing interests of all the areas within engineering. Actually, I've been doing my own little history. I wonder if you have these numbers? I didn't even think to email you. [laugh] But I've been looking at the FTE levels of the various divisions across the years, and engineering is still at the same percentage of the faculty population as it was in 1970, before computer science existed. At other schools, computer science is as big as the engineering program. Here, engineering is still the same fraction of Caltech, and includes this new field that didn't exist when that [laugh] fraction was set in the '70s. That puts a huge cap on what you can do in an area if the area emerges and whatever space it takes up has to be taken away from other areas of engineering, rather than growing as its own separate entity, and having a discussion about how big that entity should be. Otherwise you're pushed into making a decision between aerospace and CS as to where you want your engineering school to go, which is an impossible decision for one person to make as a division chair.
ZIERLER: It's interesting that these discussions happened before your time. I'm curious, particularly when you were dual-hatted from 2016—you were EO for CS, and then you were also the Director for Information Science and Technology—who would be really better administratively positioned to make that case for a new division? Did you try to do that? Did you talk to the powers that be about that?
WIERMAN: It's a bit of a third rail right now in the sense that there was a big push for it, and it didn't go through. The history that I've heard of this is that, within the engineering division, the south side of the division voted against it, and the north side of the division voted for it. There were more people on the south side, so it didn't reach the level needed to happen. I think, nominally, nothing has changed in that regard. It's something where the people that want to go, want to go, but the rest of the division doesn't want to give up something that's an exciting, important part of Caltech. It's a very hard thing. It's a very hard thing to happen. There was a big push. I think, at some point, probably there will be another big push, but the timing has to be right. It's a massive financial undertaking, and the earlier push happened at a time when there was a potential donor in mind for it. Right now, there's not a potential donor. This is a hundreds-of-millions-of-dollars thing, to spin up a new division.
ZIERLER: The wave that you saw, a decade in the making, the trend lines, if you can extrapolate into the future, are the numbers going to continue on that basic trajectory where the issue of creating a new division will force itself, or is there a leveling out that might suggest that there's a stasis, and that it's not perfect—but that there isn't an overwhelming case in five or 10 years to break off from EAS? What do you see those trend lines looking like?
WIERMAN: It's going to be really interesting to see. A lot of it depends on what the administration does in the next five years, because the undergrad trend lines, if left unchecked, will continue. Any moderation on the growth of CS as an undergraduate major will happen because of Caltech changing the way it does admissions. I think they might do that. Then, similarly, on the faculty and graduate student side, the graduate student population grows proportionally to the faculty. The question is whether the CMS faculty will be allowed to continue growing. Hopefully, we are. But where will that target be placed? Where will that cap be placed by the administration in terms of what we're allowed to grow to? If that's at the level of the size of a division, then the division issue will end up forcing itself. If that's lower, then there's a different question. GPS, I think, is one of the smaller divisions, at a 30-ish, 30–35 faculty target. If I had my way, CMS would be up at 30–35 faculty in five years, and then it probably makes sense for a division discussion to happen. But at 19 FTEs, like we are now, it doesn't make sense for that to be a division.
ZIERLER: I'm just thinking of all of the tech billionaires out there who might have a few hundred million dollars to spare.
WIERMAN: [laugh] Yes. There are some that might have enough that are Caltech alums. [laugh] But it takes time. I think one of the interesting things that I've struggled with in the fundraising for these areas is CS as a major at Caltech didn't exist until 2003—
ZIERLER: That's amazing.
WIERMAN: —which means our first graduates are now in their 30s.
ZIERLER: They're making good yearly salaries, but they're not quite at the endowment level yet?
WIERMAN: Yeah. Even if they've done well, they're not at the stage of life where they're considering the massive gifts for their legacy, their legacy gifts. We have a ways to go until the CS alums are at the age where that sort of thing becomes possible. Now, there's plenty of other alums that did CS-related things while they were here, and alums that maybe moved into CS areas, even if they were physicists or chemists or biologists when they were at Caltech. But the most natural giving base is your alums, and we didn't have a program until less than 20 years ago.
ZIERLER: To clarify, Adam, this would have to be a private benefactor? This is not like the NSF would come in to do something at that level?
WIERMAN: No, not at all. It would have to be a private benefactor, or Caltech making a big decision in terms of how it uses its endowment or its resources to reallocate things.
ZIERLER: We were tracing the narrative of when companies started to think about sustainability in and of itself. What about at Caltech? Clearly, with your research, you must have been alive to how Caltech institutionally approached the issue of sustainability from a science and engineering perspective. When did you start to notice that sustainability was—"buzzword" is not quite the right word—but, nowadays, you could just say "sustainability" around campus, and no one is going to look at you funny, right? When did that really start to happen around campus?
WIERMAN: The mid-2010s, really. The initial Resnick gift and set-up was a bit surprising for people, I think, and it took a little bit for people to really understand what it meant, and that it was leading a whole new wave of research across campus. But as the seed funds and the graduate fellowships and all of this spread out, it became more and more common for sustainability to be the focus of a research project on campus. Even if you were doing fundamental research in your field, to have sustainability as the motivation became more normal and more expected even. I think it was really great. But I think, as I was saying, often—or I think I said in one of the previous calls—there was this distinction between, I'd say, the way that chemists and the biologists talked about sustainability in terms of a new fuel or a new PV technology or whatever it was in that space, and the way that Steven and I and Mani were talking about sustainability in terms of redesigning the systems to deliver sustainable energy. That was a disconnect, I think, often in the early phases of Resnick where, actually, it was a number of years before any of our students or seed funds were funded by the Resnick Institute because we couldn't get across to the chemists and the biologists how this system level of thinking was so crucial for it. It just seemed more exciting to them that there was a new technology for a new battery storage or a new technology for X, even if it was very fundamental work in those areas. It took us a little while but, once we got over the hump, then I think we were happily ingratiated [laugh] into that sustainability at Caltech.
ZIERLER: I'll just note that for you and Steven and Mani, there's a duality to the word "sustainability." Obviously, the things that you're doing do contribute to the reduction of carbon emissions, but it's also sustainability in the sense of resilience, right? These systems can be sustained because of the resilience that you build into them. It's all related to climate change, at the end of the day, but it's an even more fundamental or literal definition of what sustainability actually means.
WIERMAN: That's right. The question of how you make sure that the grid doesn't fail when you're plugging in these new technologies, that's crucial for sustainability in both of those senses, but also very different from sustainability as the motivation typically used for designing a new battery storage technology. It took a little bit to get there. As with any field that is becoming interdisciplinary, it takes a little while to understand the lingo, and what the important questions are at the different interfaces of the fields.
ZIERLER: One thing I'm still trying to wrap my head around—I'm very interested in your perspective—when the Resnicks started to have these discussions with Caltech, what does that mean institutionally for Caltech that now there's this massive gift, there's all of this opportunity with sustainability broadly conceived? Does that mean, essentially, that every professor, everybody that has a research group that is nominally connected or plugged into the sustainability world, has that as a resource? Is there gatekeeping? Are there people who would like to be in, and are not? What does this mean now that the institute is in the building phase, and we can see where these things are headed? What does that look like, from your perspective?
WIERMAN: It means Jonas has a lot of work to do to make sure that it doesn't feel like gatekeeping. In the ideal world, it really is, I think, close to that first one, where anybody who has an interesting sustainability question related to their research can be drawn to work on that sustainability angle of what they're doing because, when they do, there are resources available to help them and their students. It's this gravitational pull where, if you're in the near-field space related to sustainability, you're pulled into that envelope because of the availability of funding and resources and everything that will help you be more effective. That's, I think, the goal, at least the way I would frame it. When you have something like this, there's a huge gravitational pull, and not only are you supporting the people who you knew were going to work on sustainability, but you're drawing more people in to work on sustainability, and you're magnifying their work to the outside world because of all the resources in the institute. But achieving that requires the administration of Resnick to do a great job at making funding available at different levels, at different times, through very transparent mechanisms so that everybody feels like they know how to go about getting money, and they know how to get started in this. So far, they've done, I think, a great job of this. There's lots of different calls for small-scale grants, medium-scale grants, big-scale grants. When they're allocating those, they're paying attention to trying to bring new people in. They're always reaching out to new people who they hear might be doing something related to make sure they know how to apply. They're giving feedback to people who are new, who maybe just need to adjust how they're connecting, how to make the adjustments, and encouraging them to reapply, to be engaged, and then making those connections to engage them after they're funded. I think all that's been really great, actually.
In our group, we've had a few small, and now a couple of those small have led to bigger ones that build on the small, and so it's been an effective pipeline so far.
ZIERLER: Now, because you've been involved in these things long before the Resnicks even probably thought about doing this for Caltech, did that make possible things that otherwise would not have been possible for you in your research group?
WIERMAN: I think, for me, it made things easier. It wasn't a zero-to-one, it was a 0.5-to-one kind of thing. The zero-to-one took us doing it ourselves, but the fact that there was funding internally that we could get to support this mattered, because the NSF Networking Division Directorate wasn't forward-looking enough to be supporting sustainability work early on. It meant that instead of having to shoehorn a grant onto the algorithms without talking about sustainability, we could go for smaller things in other areas that had sustainability, and supplement with internal funding to keep going quicker. That funding source did make it a lot easier to work in this area without having to shoehorn into older grant models within NSF that weren't yet looking for sustainability projects. Then also, when the sustainability projects came, we were ready. We had been doing it, and we had great stuff to demonstrate, and we could win the big sustainability grants in the distributed systems world.
ZIERLER: To return to a really interesting point you made in an earlier conversation, drawing a contrast between CS at Caltech and at Stanford, for example, for an undergraduate at Stanford, they're much more likely to go into Silicon Valley industry startup world, whereas amazingly for Caltech undergraduates, even though there are these huge salaries that can be dangled in front of them at age 21 or 22, so many of them are just so interested in the fundamentals. They go onto graduate school directly. I was thinking about the Caltech-Stanford comparisons. Stanford, of course, just got a massive gift, even larger than the Resnicks, for its own sustainability institute. I wonder if you've been tracking that, and thinking about, from your perspective, how Caltech's pursuit of sustainability might be different just because of what Caltech represents versus what Stanford represents.
WIERMAN: I think this was part of the pitch when we were writing things for the Resnicks. The Caltech model is new technologies, new fundamental science, and I think in the sustainability space, that's where the big wins come. You also need the more applied, more entrepreneurial work. But if you think about the things that are going to really be game-changing, it's a new technology to do X, or a new algorithm that lets you integrate things well, or whatever it is. It's that fundamental new scientific insight. I think Caltech's the place to do that, and that's been the focus of the Resnick piece. The Stanford vision is much more entrepreneurial and industry-targeted than the Resnick one. The Resnick Institute does not have a council of companies that it's interacting with and getting research questions driven by. It's driven by the faculty and the Caltech research-centric mindset. I think that's a strength and also a differentiator. Both are probably needed. You need both the entrepreneurial work and the fundamental science, but I think we're doing it the right way for Caltech, and I think we'll have a huge impact because of the way we're doing it.
ZIERLER: Adam, in the 2017–2018 range, when you started to get more interested in economics issues and energy markets and things like that, what was the spark for you? What got you involved in these things?
WIERMAN: For me, it was the sense in which, increasingly, you couldn't imagine implementing the algorithms for sustainable data centers or smart grid or whatever it was without also thinking about the economic impacts on the markets for these things. For example, in the data center space, if you're adding all this flexibility, and you're trying to convince a company to do it, as I was saying earlier, one way is just the reduction in their power bill. But their power bill isn't as simple as ours, where we just pay per usage. Their power bill is playing in long-term and short-term markets, playing in these different ways of purchasing energy over long-term contracts, and so unless you take that on directly, you're not actually going to be able to implement the algorithms in the world. Then as soon as you start looking at them, you realize that these markets are not designed in a way that [laugh] facilitates sustainable participants, and so you realize you have to design both. You have to design the algorithms. One reason why the markets were the way they were is because the customers couldn't be sophisticated. But you're designing the algorithms for the customers, and now they can be sophisticated. That gives you new power on the market design that you can take advantage of. If you can redesign both, then the whole thing can work a lot more efficiently. The same thing holds for electric vehicles and smart grid broadly. It really pushes you to think about the markets governing the systems in addition to the systems themselves. If you think about both, you have much more power.
ZIERLER: How much economics reading did you need to brush up on, and how much are you relying on your colleagues in HSS and elsewhere in terms of the collaboration to work on these kinds of topics?
WIERMAN: There was a lot of both. A lot of it was John and Federico giving me lessons on the whiteboard during our meetings. I sat in on two of John's courses while he was here, and that was great, and I co-taught with both John and Federico. But there was a lot of reading. I had only had a couple of undergraduate courses in economics, and so I read a lot of the textbooks in various areas, lots of papers, and sat in on courses where I could. It was a lot of fun.
ZIERLER: Did you come to appreciate social science as real science to some degree? Did you have that perspective?
WIERMAN: Of course; I didn't have far to go there. I also did a psychology minor as an undergraduate and took courses from Herb Simon, who was a founder of AI but also a Nobel Prize winner in economics, and taught psychology courses. You don't take a course from him and think that social science is soft. [laugh]
ZIERLER: [laugh] That's great. Just to bring the story full circle from what HP was doing at a very pioneering level a decade ago, where is the industry now? Are you satisfied with—I'm not sure what the right word is—saturation? Are the algorithms that you've been involved in developing, are they deployed in all of the kinds of industries and data centers that they need to be? How much work is there to do in this regard?
WIERMAN: There's still a lot of work to do. But during COVID, there was a moment where Google had a big launch of a really full-fledged system for managing workload scheduling in a carbon-first way, not just within a single data center but across multiple data centers. The group was headed by Radovanovic, who I knew as a PhD student in my area; she went on to Google and is now heading their sustainability team. It really brought together a bunch of my work with a bunch of work that they had done internally and that many other people in the space had done. It was really exciting to see an industry leader full-on say, "This is possible not just for niche workloads within our system but for our main user-centric data centers." Google has done this now; Microsoft as well. With Apple, it's less clear what they've done, but I think they've done stuff as well. A lot of the big companies have done something now. Most mid-sized companies have not. The status, in broad strokes, is that if you have massive amounts of research capital and teams, and you put a value on this, you can make it happen. That demonstration is really powerful. The next step is for some start-up to come in and say, "This is our business model. You want to do this, we can build you a data center, including the software stack, that lets you do this in a sustainable way." I'm hoping that a few companies will start to play in that space now that the big companies have shown it's feasible. The salesmanship needed from such a company should be much less now that big companies have demonstrated they're doing it in a massive way, rather than just in a few small data centers and a few small products.
ZIERLER: What role, if any, can the federal government play in the way that there are tax credits for EVs for consumers and things like that to lessen the burden of the upfront costs? Are there federal grants that can encourage these smaller companies that might not have these reserves to take this investment?
WIERMAN: Yeah, there are. There are some now already in terms of incorporating solar and these sorts of things into these designs. But that's a great point: increasing those is going to be crucial. You can't have such a credit program targeted directly at a data-center-type industry until it's been shown that it's feasible for [laugh] a major company to do it. I think we're at the stage where, hopefully, that can start to happen in a more meaningful way than it has. Hopefully, we're nearing a turning point where there's going to be very quick adoption rather than slow and steady adoption of these ideas. I'm pretty optimistic right now, actually. It's great. NSF has had another big push just in the last few years, triggered by some of this, where they've partnered with a few companies to have a couple of massive centers, one of which we're a part of, that is focused on really making that next shift possible.
ZIERLER: We're talking about data centers, of course, in an American context. This is a global issue though. We say that it's great every new EV we put on the road here, but there's three more SUVs that get on the road in China. What about, specifically in China, do we see the adoption of these for data centers in Asia and elsewhere in the developing world where this is going to make a meaningful difference at a planetary level?
WIERMAN: In China, definitely. China, being government-centric, has made this a priority. They're not doing everything in the full-fledged way that Google has yet. But when they're building data centers, they're some of the most energy-efficient data centers. The adaptivity into the grid is, I think, the next phase for them. But it's clear that there is a government-level priority on moving in this direction. I expect that to happen, not with them being the leader but with them being a follower of the technologies that are deployed here by companies like Google and Microsoft. Europe has been pretty successful too. Some of their data centers, especially on the building-efficiency side, are some of the most efficient because of the operating mix of energy they have, and the temperatures that they can place things in while still having locality in terms of serving things. Outside of those three markets, there's a long way to go.
ZIERLER: Bringing the story closer to the present, something that I'm just generally interested in, the way that academics pivoted during COVID and remote work and social isolation. Obviously, for scientists that have a physical lab, this is an extraordinarily different situation. For you in CS, how dramatic was it? The caricature could be simply you're on a computer. It doesn't matter if you're in the office or you're at home. But then the other way of looking at that is you don't have that spontaneity at the whiteboard with your colleagues. For you personally, for your students and the difficulties they must have endured, what was COVID like for you and your group?
WIERMAN: I think we were somewhere in the middle, somewhere along both of those lines. We didn't have to give up much. Everything that we're doing is either in the cloud or on the board, and so we could still meet. We could meet over Zoom. It took us a while to get used to it. But in terms of the research meetings that we had one-on-one for projects that were ongoing, we were pretty efficient. The development of students, though, was much harder. There's a lot of mentoring and development of students that is the softer side of meeting people regularly in informal environments, chatting, having visitors come through that you can meet with and talk with to pitch them your work and hear their responses and change. The lack of that was definitely felt. The lack of exposure to different ideas, to what other students were working on, was really felt by the students. In our department, and definitely in my group, your world became smaller. Your world became you and your students or, in my case, me, Steven, and our students, as opposed to the CMS department, and everybody knowing what everybody else was doing. A lot of the two-step collaborations stopped happening in the same way. But productivity-wise, on our projects in our team, we were very productive. We tried to fight it. For our group meetings, anytime somebody else published a paper on arXiv that we liked, we just invited the student or the faculty member to give a talk over our Zoom meetings, and then had some informal chatting with the students over Zoom afterwards.
We did that with the other groups in CMS too, rotating through to make sure we had somebody from each of the other groups come to our virtual group meeting, give us talks, and tell us what they were working on or what was going on in their groups. We tried to create these lifelines between the students, because faculty like me were giving virtual talks in all these virtual seminars that everybody set up, but that was only the senior faculty. The students and postdocs and junior faculty were really left out of that, and so they really needed their opportunity to have those places to get feedback.
ZIERLER: It's a really important point, the difficulties, particularly for graduate students during a really formative part of their career, and when it's so important for those interactions. Are you now seeing in your group something of a rebound, getting back together in person? Are graduate students making up for lost time, or is it more like everybody, not just at Caltech but graduate students across the board, everyone was to some degree in suspended animation for a year and a half? What does that look like for you?
WIERMAN: I think in our group, people were still moving forward research-wise. Part of it was me coming off of being EO, feeling energized from that, and working with them very closely. We really were very productive in terms of papers during the COVID time, but we weren't developing the vision, the presentation and communication skills, the interactivity, the understanding of the field as a whole, in the same way. In that respect, we're still just slowly coming out of it, because students have been slowly coming back to campus but still aren't on campus every day, maybe coming in only for meetings, and so the watercooler-type talk is still way less than it was pre-COVID. For people, it's been damaging. You're coming back from trauma, and the social impact of this is felt differently by different people. But it's felt, and it's impacted the way people interact, and what social interactions they go to or are willing to prioritize. For years, the prioritization was to stay home. Now, they're like, "Should I prioritize going to the weekly coffee? Maybe; maybe not." You're less likely to go. "Should I prioritize going to the departmental seminars, and hanging out for half an hour afterwards?" Everything's just been slow to come back in our department because it was so easy to go virtual, and now it's hard to draw everybody back into the socially well-knit community we had before.
ZIERLER: It's almost a cultural challenge, at this point?
WIERMAN: Yeah, it's a cultural challenge that we have to overcome. We haven't figured out how to do it yet, so we're experimenting with lots of different social interactions [laugh] and ways to get people back. It's been tough.
ZIERLER: Just to bring our talk right to the present, what are you currently working on? What are some of the big projects you have?
WIERMAN: These days, the big ones that I'm working on are at the intersection of learning and control, or learning and online algorithms more broadly. The big question here is: there are all these machine-learned approaches for solving problems in the area of energy, in the area of control, and in all the areas that we're talking about, where someone may just throw RL or deep learning at the problem and have an algorithm that seems to work well. But you can't deploy those things in a Google data center or in a grid without some guarantees about their safety, their reliability, their performance, these sorts of things. How do you do that?
The vision is to find a way of designing a black box that you can wrap around untrusted learning tools to give trust to them, so that whatever the learning tool is, you don't need to know how it works. You don't need the internals of it. You have this box that you can sit in front of it, that will take that untrusted advice, as we call it, as input, mix it with old-school trusted algorithms, and put out something that is as good as the machine learning when the machine learning works well but never violates the guarantees that the trusted algorithm would've given you. That's the big-picture vision. We have a few results now showing that you can really do this in a black-box way. With 20 lines of code, you can wrap something on top that provably will meet all of the worst-case adversarial safety guarantees of the design you used to use, but exploit the untrusted advice when it's good, to be as good as the deep-learning algorithm when possible. If you can push these together, all of a sudden you can use RL and deep learning in smart grid applications where you never could before, because it would be too unsafe. That's the idea, and it's really exciting.
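A toy sketch of the kind of wrapper described here, in the spirit of the "algorithms with untrusted advice" literature. This is a hypothetical illustration, not Wierman's actual algorithm: it follows the machine-learned advice while its running cost stays within a (1 + epsilon) factor of a trusted baseline's running cost, and falls back to the baseline otherwise. The function name, the switching rule, and the `cost` interface are all assumptions made for the example.

```python
# Hypothetical sketch of a "trusted box around untrusted advice":
# follow ML advice while it stays provably competitive with a trusted
# baseline, otherwise fall back to the baseline.

def wrap_untrusted(advice_actions, trusted_actions, cost, epsilon=0.1):
    """Combine an untrusted and a trusted action sequence online.

    advice_actions:  per-step actions proposed by the ML model (untrusted).
    trusted_actions: per-step actions from the provably safe algorithm.
    cost:            cost(action) -> float, revealed online.
    epsilon:         slack allowed to the advice before falling back.
    """
    combined = []
    advice_cost = 0.0   # running cost the advice would have incurred
    trusted_cost = 0.0  # running cost of the trusted baseline
    for a_ml, a_safe in zip(advice_actions, trusted_actions):
        advice_cost += cost(a_ml)
        trusted_cost += cost(a_safe)
        # Trust the advice only while it stays near the baseline's cost.
        if advice_cost <= (1 + epsilon) * trusted_cost:
            combined.append(a_ml)
        else:
            combined.append(a_safe)
    return combined
```

With good advice (low-cost actions), the wrapper tracks the ML model; with adversarially bad advice, it reverts to the trusted actions, so the combined cost stays within roughly a (1 + epsilon) factor of the baseline. The real results he refers to handle far richer settings (switching costs, constraints, closed-loop control), which this switching rule does not capture.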
ZIERLER: What are some of the technological advances, both on the software and hardware side, that make this pursuit possible now where it might not have been a decade ago?
WIERMAN: This is an algorithm, so it's not technological or software. It's pure algorithms, pure theory, that's needed to make it happen. There are theoretical ideas that we're using. But, interestingly, there's nothing too modern that you absolutely have to use to prove that these algorithms work. It's just a brand new question, motivated by a change in the way people are designing systems today. One of the companies that we're working with on this stuff is a great example. They literally had an internal research team designing deep-learning algorithms for their core problem of co-scheduling generation, for a generator-type company with wind and solar alongside old-school, more traditional generation. Then they had a team optimizing their provably safe approach to doing this, which was what was actually deployed.
Their CTO has to make a decision: when do we deploy the learning team's algorithm, or do we stick with the old one? They can never actually deploy the learning team's algorithm because they can't get any guarantees. They can show it works well in some simulations, but it also fails sometimes. To be able to come in to them and say, "We can design a black box that will merge the output of your trusted team and your untrusted team, and give you something that's trusted, and gets you 90% of the performance gains of the untrusted advice," is a huge win. That question was never asked before, and so now it's a whole new algorithmic question: in what settings can you design such a black box? What are the limitations? What are the fundamental limits of mixing untrusted and trusted advice? How much do you have to give up of the untrusted advice's performance to get trust? How does that depend on the setting that you're looking at, the algorithmic problem? It becomes a really fundamental new question for online algorithms.
ZIERLER: Adam, is this research driven by applications, or does this get you more back to graduate student days when you're really involved in fundamental stuff?
WIERMAN: This is fun. It's both because, like I described with this company, at the end of the day, we're giving them a 20-line-of-code box that they can deploy that will give them safety guarantees for RL for generation scheduling. It works out of the box, gives them amazing performance, and we can deploy it. But designing it required months at the whiteboard, figuring out how to mix ideas from machine learning, from multi-armed bandits, from formal methods in ways that can achieve this best-of-both-worlds guarantee. The question came from the application. The question came from, "Why are teams not deploying modern learning tools in these safety-critical applications, and can we fix that?" The way a lot of people go about that is, "Can we design a particular RL policy that will achieve some safety guarantee in this specific setting of voltage control?" We wanted a more general-purpose solution. We said no, because then if they change their RL, it won't work anymore. It won't give the guarantees. We want this black box that can just take arbitrary input from something you trust and something you don't trust. "What guarantees can we give you in this black-box way?" was our question.
ZIERLER: Do you see this work as an ongoing pursuit of the sustainability work, or this is a new branch of research for you?
WIERMAN: It's going to have huge applications in sustainability. But it's an algorithmic question that has impact in other places as well. Actually, some of the students that are working on it have applied it in autonomous drones and video streaming and other applications like this. But it came for us out of smart grid, because one of my postdocs was working on RL and power systems, and was continually facing this: you can never deploy RL in these systems because you won't get the guarantees. These are safety-critical systems; we can't use learning tools. It really gave us this motivation: we need to use learning tools to get the efficiency gains that are possible in these systems, so what's missing? Can we provide those guarantees that are missing, and then deploy these learning tools in these sustainability applications?
ZIERLER: Adam, now that we've worked right up to the present, for the last part of our talk, to wrap up this great series of discussions, a few retrospective questions. Then we'll end looking to the future. To go back to when you were a graduate student, how surprised would you have been if you had some sense of the future of where the research would've taken you? In other words, in all of these unexpected directions, how much of that is baked into the kind of scientist you are, the kind of computer scientist you are? How much of it is just about personality and just being attuned to interesting questions? How much of it is, no, you had this plan, and it makes sense to you in retrospect you were going to work on this, and then you were going to work on this, and then you were going to work at this?
WIERMAN: [laugh] I think my graduate student self would be shocked where I am because in graduate school, I was working on scheduling and resource allocation and a particular theory, and that is far in the rear view mirror at this point. I think by the time I was four or five years into faculty, then maybe I would see this path as something that would be natural in retrospect, not something I would've predicted for myself, but at least the pathway there would've felt a little bit more natural. But my grad student self would be shocked to see me working in these areas.
ZIERLER: What has given you most satisfaction, both on the application side and more on the fundamental side, in terms of all of the things that you've worked on in your career?
WIERMAN: I'm the third option there. I think, especially in the last few years, I've realized that the most satisfaction comes from the students and the mentoring aspect of the job. The research, I love, and I can tell you what I like best about it. But, as I look back, seeing the students' paths, and seeing how I was able to help them realize things and shape them in various ways, both the undergrads and the grads, those are the things that I look back on most happily, and get the most satisfaction from. When I go to conferences now, and I see 15 of my alums, and they're all faculty working in these various areas, and they've had amazing success, that really just warms the heart. [laugh]
ZIERLER: If I can pull you out of humble mode, and just ask you to be quantitative, in terms of carbon saved or money saved—
WIERMAN: I don't know. [laugh]
ZIERLER: —not just you in particular but the field that you represent, if you can broaden it out a little bit, at the end of the day, when we're talking about trillions of dollars or gigatons of carbon, does this work actually move the needle? Is it actually big enough where, in the grand scheme of things, it really does make a difference?
WIERMAN: Yeah, it does. It's hard to put a number on it, whether it's trillions—I'm not good at doing that sort of massive calculation. But as I mentioned to you at the beginning, there was this NSF report in the early-to-mid-2000s, 2005 or 2007 maybe, that had this curve of data center energy usage becoming 20–30% of the energy usage of the US in 10–15 years. That didn't happen, as a result of the field that we helped form around sustainable data centers and the energy usage of data centers. If nothing had been done on the algorithmic side and the input side, then all the compute that we do today would be taking tens of percent of the total US electricity usage. Instead, we've kept it around 2–5%, which is incredible, given the massive explosion of compute in the last decade.
ZIERLER: Right, because, otherwise, without this, it would just be continuing uncontrolled, essentially? To return to the theme of Caltech interdisciplinary culture, if you were to just take a walk through your publication list, what are the areas where it's obvious you've worked on these because you're at Caltech, and what are the areas where, no matter where you are, these are the things that are close to your heart, that you would've gotten to one way or another?
WIERMAN: I think maybe the most obvious one because of Caltech is control. I had never taken a control course. Control wasn't a big thing at Carnegie Mellon. Now, a large fraction of my work involves control theory, and that's because control has such a huge presence. John Doyle and Richard Murray and others are so big and larger than life here. I think control is a huge one. I think, even though I knew I liked the idea of going into game theory when coming to Caltech, the fact that I continued into economics in a significant way, and even beyond game theory to market design and matching markets and things like this, that was a function of Caltech too and, in particular, like John Ledyard and Federico and Mani.
ZIERLER: Finally, Adam, last question, looking to the future, in the way that you are alive to changes in the market, and the way that new systems, new networks, beg new research solutions, what's the frontier for you? What are the new solutions in industry going on that are going to require you and your group to go back to the whiteboard, and spend months coming up with new solutions?
WIERMAN: It's AI and machine learning. If I think of safety-critical systems, energy systems certainly are safety-critical. The big question is, can you get to a point where you can deploy these tools, which work so well in the soft applications where you don't have to worry about safety, in the ones that are hard? I think that's a huge open question. If we can do that, then you can get far beyond the efficiency limits and the sustainability limits that we imagined five years ago, before these tools took off.
ZIERLER: There's no end in sight? There's plenty to do?
WIERMAN: [laugh] There's lots to do.
ZIERLER: [laugh] Adam, this has been a terrific series of discussions. I want to thank you so much for spending this time with me. It's been great.
WIERMAN: I've enjoyed it too. Thank you so much.