@HPC Podcast Archives - OrionX.net - @HPCpodcast-86: DOE Exascale Project w Christine Chalk
Episode Date: June 26, 2024
We are delighted to be joined by Christine Chalk, physical scientist at the U.S. Department of Energy and federal program manager for the Exascale Computing Project (ECP) and the Oak Ridge Leadership Computing Facility. Christine is also responsible for budget formulation for Advanced Scientific Computing Research (ASCR) and management of the Advanced Scientific Computing Advisory Committee and the Computational Science Graduate Fellowship. Topics include the ECP project and what made it so successful, how policy turns into a budget, and the growing importance of the role of women in HPC. [audio mp3="https://orionx.net/wp-content/uploads/2024/06/086@HPCpodcast_ECP_Christine-Chalk_20240625.mp3"][/audio]
Transcript
It's just been the honor of my career to be able to be a part of this project as it's
wrapping up and delivering on all of the promises that the department made to the nation.
The main secret sauce I think is the teamwork.
The multidisciplinary teams, cross-institution teams, that's what the National Labs really
excel at. We have been very fortunate to
have bipartisan support. I think it's generally recognized that computing, advanced computing,
underpins everything. It underpins national security. It underpins economic security.
We saw during COVID that it can underpin public health.
From OrionX in association with Inside HPC, this is the @HPCpodcast.
Join Shaheen Khan and Doug Black as they discuss supercomputing technologies and the applications,
markets, and policies that shape them. Thank you for being with us.
Welcome to the @HPCpodcast. I'm Doug Black of Inside HPC with Shaheen Khan of OrionX.net. And today we have with
us Christine Chalk. She is Federal Program Manager for the Exascale Computing Project,
as well as the Oak Ridge Leadership Computing Facility, which of course is the home of
Frontier, the first exascale-class HPC system in the United States. Christine is also responsible
for budget formulation for the U.S. Department of Energy's Advanced Scientific Computing Research
Program, known as ASCR, and for management of ASCAC, the Advanced Scientific Computing Advisory Committee, and the Computational Science Graduate
Fellowship. Christine, welcome.
Hello. It's good to be here. Thank you.
Great. We've been looking forward to this for a while. Reading your bio, we know you have more
than 30 years of experience in federal science programs and project management. Let's start with
your perspective on the Exascale Computing Project, ECP, now that the program is being shuttered,
or I think it already has been shuttered. What are your views on its relative
success now as two of the first three Exascale systems have reached that performance milestone
and the third system, El Capitan, is being installed?
Well, it's such an exciting time
for computing. I've just had the privilege of participating in an event on the Hill
where some of our applications were presenting visualizations of what they were able to achieve on the Exascale systems.
And it's just so exciting because we're able to see a lot more than we thought we would.
I was looking at, in particular, Jackie Chen's simulation, where she's looking at diesel fuel inside a real engine, a concept engine.
And we really thought that was well beyond Exascale. When I
first started with Oscar, she was working on laboratory flames and not in any kind of real
world geometry. And it's super impressive what has been accomplished. I also am impressed with
the project itself as a project, as someone who's been involved in more traditional scientific construction
projects. I worked on the Advanced Photon Source and the Main Injector, which were accelerator projects. And this was really the
department's first attempt to projectize what was essentially a research project, where we were
pushing the technology and partnering with the vendors, but there was also a lot of applied math
and computer science and application development that needed to be done to deliver the performance on those systems.
And I have to say, there was a lot of skepticism about whether or not you could projectize a big software project and keep all of those moving parts together.
But this is something the National Labs are just extraordinary at.
They have their roots in the Manhattan Project.
And if you give them a really hard problem, they'll figure out how to do it and deliver
it on time.
They'll deliver it within the budget envelope you give them.
And I have to say, I will never stop being impressed by how well they performed and how
they met every challenge and invented this as they went.
And they are writing it up.
They're writing some books on both the technology accomplishments
and the management accomplishments.
And it's just been the honor of my career to be able to be a part of this project
as it's wrapping up and delivering on all of the promises
that the department made to the nation.
That's awesome.
They're not here, so we can praise them. But every time I go to one of these conferences, or to one of the national labs or lab facilities, and their personnel show up, it doesn't matter what level they are, whether they're the big bosses or working scientists, they are just so good. They are such a source of pride. You walk away saying,
that is what we need to be doing. So it's really wonderful to get your perspective on it.
Yeah, the main secret sauce, I think, is the teamwork.
The multidisciplinary teams, cross-institution teams, that's what the National Labs really
excel at.
None of us are as smart as all of us together in a real partnership.
And that, for the Exascale Computing Project, was especially true.
We brought together the best of the weapons
simulation teams and the science simulation teams, and they worked with vendors and with industry and
with universities. And yeah, they just achieved so much. And the capability is going to be bearing
fruit for many years.
Yeah, the concept of co-design: it's not just the supercomputing world that has pioneered it. It really traces back to something like the Manhattan Project. And it's increasingly the way to go, just because the applications really challenge the hardware and they can inform
the potential for performance gains that might not otherwise be obvious to the chip makers and
to the integrators.
So in ECP, we really leaned in on co-design: taking some of our applications that we knew posed challenges for the hardware and for the direction the hardware was going, creating mini apps, and working back and forth with the vendors in these feedback loops to improve both sides, to help the applications be ready for the hardware when it arrives, but also to help the hardware anticipate some of the challenges and find solutions before the teams get their hands on it.
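To give a feel for what a mini app distills, here is a toy sketch in Python, a hypothetical illustration of the pattern rather than any actual ECP proxy app: it isolates a single performance-critical kernel, a five-point stencil sweep, together with a simple figure of merit that hardware and application teams could iterate on.

```python
import time
import numpy as np

# Toy sketch of the mini-app idea (hypothetical; not an actual ECP proxy app).
# A mini app distills one performance-critical pattern from a full application,
# here a Jacobi-style 5-point stencil sweep, plus a figure of merit,
# so hardware, compiler, and application teams can iterate on it together.

def stencil_sweep(u):
    """One 5-point stencil sweep over the interior of a 2-D grid."""
    v = u.copy()
    v[1:-1, 1:-1] = 0.25 * (
        u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
    )
    return v

if __name__ == "__main__":
    n, iters = 2048, 50
    u = np.random.default_rng(0).random((n, n))
    t0 = time.perf_counter()
    for _ in range(iters):
        u = stencil_sweep(u)
    elapsed = time.perf_counter() - t0
    updates = iters * (n - 2) ** 2
    # The figure of merit is what gets tracked across hardware generations.
    print(f"{updates / elapsed / 1e6:.1f} million stencil updates/sec")
```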
I think what we need is a movie version of the ECP story.
It would be along the lines of Oppenheimer, but I don't know who you'd name it after.
You're onto something. Yeah.
I don't know who we'd name it after either. It'd have to be Exascale because
it was a team effort and so many names that were so important are popping into my head.
And the leadership evolved over time, both within the
project and at DOE. I was not the first ECP program manager. I was the last. And yeah, it was very much
a team effort. Although Hollywood is also not real good at doing computer movies. Computer stories are a little hard to tell visually, I think.
On the closing down, the end period of ECP over the last months of its existence, what was the focus there?
And also, do you have thoughts on ECP's legacy impacting the next big project, DOE's next generation of leadership supercomputing systems?
Yeah, we're winding down the project and the teams are dispersing,
but that means they're all going off and doing their research on their own now. And sometimes
it's still in teams, but I've been with the ECP project for the last three, almost three years.
And so that's really been the home stretch. That's been about delivering on all of the key
performance parameters of the project.
So that's making sure that all 25 apps achieved the performance improvement or the new capabilities that they had signed up for, and making sure that the software stack, E4S, was interoperable and fully portable across all three of those systems and also the pre-exascale systems. And actually on a lot of other hardware: we have E4S now in the cloud, on Amazon Web Services and on Azure as well.
And to make sure that capability was portable, interoperable, that it was open source, that
it was usable, that people knew how to find it, knew where to get help if they needed
it, and making sure that it was embedded in the apps so that they could get the most performance out of the system that
they could. One of our key performance parameters, on software integration, was very complicated,
and you had to demonstrate how many different packages were working together to achieve a
certain performance gain, and just documenting all of that and achieving it on Frontier and then
documenting it and then getting it reviewed by subject matter experts and then having it reviewed
by an independent review team. That's really what we've been doing for the last year or two.
And then also getting the word out, publishing papers, going to scientific conferences,
doing web podcasts, writing stories, making sure that people know
what we've accomplished and where it's available and how they can leverage that capability
themselves.
That's excellent.
And a lot of work, really.
Yes.
Especially the home stretch, that's the hardest part, isn't it?
It is.
My boss, when she asked me to take it over, said, you're just landing the plane.
And I thought, well, isn't landing the plane the scary part?
And she laughed.
And I now realize she laughed because she knew I was in good hands. This team is extraordinary. And they had
their act together. They knew what needed to be done and they did it. And it hasn't been 100%
without challenges, but it's been pretty close. It's really come in well. And it's just so
impressive. Every time I look at a new report, I can't believe what
we've accomplished.
Very nice. By the way, was your former boss, was that Barb Helland?
Yes, it was. Yes.
So looking ahead at the next generation, this whole integrated research
infrastructure idea, how do you think ECP has sort of impacted the development of that IRI
strategy? And what are some of the key challenges, maybe I'm going to guess energy consumption,
some of the key challenges with that next generation of systems?
Yeah, there are a lot of challenges with the next generation of systems.
In computing, as we approach the limits of Moore's law, it's just like the limits you learned about in math:
every step only gets you halfway there, and it gets harder and harder to approach that limit. And so
getting those performance gains costs more, and they take a bigger team, they take more effort.
And for the hardware vendors especially, but also for us and for the apps, the systems get more
complicated and it gets harder to get the performance out of them.
All of the low-hanging fruit was picked a long time ago, and each new gain is hard won.
So that's a huge challenge for us.
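(To make the limit analogy concrete, here is a small illustrative model of our own, not from the episode: suppose each new system generation closes half of the remaining gap to some physical ceiling. If the initial gap is \(d_0\), then after \(n\) generations

\[
d_n = \Bigl(\tfrac{1}{2}\Bigr)^{n} d_0,
\qquad
\Delta_n = d_{n-1} - d_n = \Bigl(\tfrac{1}{2}\Bigr)^{n} d_0,
\]

so each generation's gain \(\Delta_n\) is half the one before it: steps of roughly equal cost buy less and less as the limit approaches.)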
Because we are the Department of Energy, energy is always going to be one of our top priorities,
especially when we look around at the demand for power for computing and how it seems to be growing exponentially because of things like AI and crypto, and just more and more people doing more and more things on their computers, in the cloud, and with each other. And so we will never stop caring about bringing that energy use down as much as we can, because of the climate impacts, but also because of the community. It became very clear to me as the OLCF program
manager that even if we could afford to deploy a 100 megawatt system, we know that the universities
can't do that. We know that even big companies can't do that. They're not going to
bring new power and cooling in every time they upgrade a computer and they have very hard energy
ceilings. And if we just run away and don't pay attention to how much energy our system uses,
we're leaving them behind. And then they can't be fast followers and they can't come and use
Frontier when they have a grand challenge because the system they have at home is too different.
And so that was an important part of the Exascale computing project as well, making sure that
we kept the industry and interagency fast follower community informed every step of
the way about what we were doing and the progress that was being made and making the
software available as we were working on it, which was, I think, at times challenging for them, because
they didn't realize it wasn't all the way done and it was hard to use. But then they could see
where we were going with it and they could see what the gains were going to be and which bits
they definitely wanted to try and incorporate into their at-home applications. And it's really
paid off. Some of the first users on Frontier
were from the Industry and Interagency Council. So they were folks who were ready to use Frontier
right alongside of the Exascale Computing Project applications and make really good use of it to do
science and engineering that wouldn't have been possible otherwise.
Interesting. So it's yet another reminder
that scalability isn't just up, it's also down.
Absolutely. And that was important for the E4S, the portability. When we said portability,
it wasn't just across all three US GPUs, but it was down to someone trying to do something
on even a laptop or a university cluster. We wanted to try and make
these things as interoperable up and down the chain as possible, so that it wasn't so hard to
grow up the chain. It wasn't going to be such a big leap.
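As a toy illustration of that up-and-down portability, here is a hypothetical Python sketch; E4S itself achieves portability through portable build recipes and compiled libraries rather than this mechanism, but the spirit is the same: one kernel source that runs on a laptop CPU or a GPU, depending on what the machine at hand provides.

```python
# Toy sketch of "portable up and down the chain" (hypothetical; not how E4S
# works internally). The same kernel source runs on a laptop CPU via NumPy
# or on a GPU via CuPy, depending on what is available.
try:
    import cupy as xp   # GPU path, if CuPy and a GPU are available
    BACKEND = "GPU (CuPy)"
except ImportError:
    import numpy as xp  # CPU fallback for a laptop or university cluster
    BACKEND = "CPU (NumPy)"

def saxpy(a, x, y):
    """Single-precision a*x + y; identical source for both backends."""
    return a * x + y

n = 1_000_000
x = xp.ones(n, dtype=xp.float32)
y = xp.arange(n, dtype=xp.float32)
z = saxpy(xp.float32(2.0), x, y)
print(f"ran on {BACKEND}; z[:3] = {z[:3]}")
```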
And the great thing that happened was the emergence of AI, because AI also uses the GPUs. And so for those that embraced accelerated computing and
embraced the software that we made available, that made it easier to use the GPUs and to get
performance gains, they're also now ready to incorporate AI, which is where some of the
performance gains are going to be as compute performance gains get harder.
That's right. That's right. So one thing about these big systems,
and we've talked about it a little bit with some of your colleagues, is really the process of going
from policy to resources to implementation. And of course, a big part of that is really
the budget formulation that you have personal and intimate understanding of. So to the extent
that you can share with us, would you take us through
that chain, that value chain of how this stuff happens?
Yeah, well, everything runs on money,
right? And you can't deliver things if you don't get the funding. And the president's budget request
is the mechanism by which we communicate with Congress and with the administration about where
it is we want to go and what we think it's going to take to get
there in terms of funding. And in building that, we need to find the convergence between the big policy statements that the administration puts forward, through something called the R&D Priorities Memo, which comes out every summer to inform all of the research agencies' budget formulation, and the many community-driven workshop reports, where the community identifies the opportunities and challenges in applied math or in AI or even in subfields like uncertainty quantification. Connecting those dots, showing why that's important to the administration's goals, is what happens in agency budget formulation. And then it gets caught up in a lot of very boring budget details, but connecting those dots at the beginning is the important thing. It's a critical communication mechanism. And I realize that
they're not the easiest documents to read and to find what you're looking for. But there's a lot of information in the federal
budget request about what agencies think is important and why and where there's likely to be
new funding. You know, I think it's very interesting that DOE and what the labs are doing
and what ASCR is doing is one of those rare and refreshing areas of bipartisan
support in Washington. And in that sense, it must be a pleasure for you dealing with budgets and so
forth, where so much of the country is behind what you're doing. And it's not a matter of rancor.
What does ASCR stand for?
Advanced Scientific Computing Research.
Yeah. And Christine, I think you mentioned you do have the fiscal year 25 budget request for that. What is the budget coming up?
It's a little over a billion dollars for ASCR. And that's the FY25 request. And that has been sent
to Congress. And there's the four corners, we call them. So there's the authorizing committees
and the appropriations committees. And they are marking up legislation
to fund the government for FY25. And they have a lot of priorities that they have to balance.
And it's a tough job. I don't envy them the job that they have because it's a huge budget,
but there's so much that it supports, and all of it's important, and every trade-off is painful
for them. And we have been very fortunate to have bipartisan support. I think it's generally recognized that computing, advanced computing,
underpins everything. It underpins national security. It underpins economic security.
We saw during COVID that it can underpin public health and our ability to respond in an emergency. And we are fortunate that previous administrations saw how important this transition to the exascale was going to be, and how important folding in big data and emerging AI technologies was going to be. That we built hardware that was ready for all of that, and that we have this ecosystem that's ready for all of that, is really a great accomplishment of the federal government. And you can see other nations now running to catch up, embracing some of the technologies that we've developed and buying U.S. products to take good advantage of them. So it's paying off, but there's more to be done. And it's a very fast-paced industry.
And as I've said, as you approach some of these limits, it gets harder. So getting
as much out of the systems that we've deployed as we possibly can, while still continuing to be
ready for what's next, is challenging without a lot of budget growth, because there isn't any budget
growth ahead. So that's a big challenge for our program. And we very much appreciate the support
that we've gotten. And I'm thrilled that Congress was able to pass the CHIPS and Science Act, which is
very focused on the future and on positioning the United States to continue to be a leader in this strategic area.
The authorization numbers for ASCR were really good in that bill, and it would be wonderful if
that turned into appropriations, but it is hard. There's a lot of competing interests that the
appropriators have to deal with, but I'm optimistic. And I also know we have really good people that
are going to figure out how to get
as much as they can for their applications. So we'll make gains with or without increases,
but it's challenging. There's a lot in the federal budget that needs doing, and it's hard to say
you're more important than somebody else.
Yeah. How much of the focus is on access to computing power, and how much on the know-how to build them, operate them, run them, et cetera?
Yeah, that's always a challenge because especially at the exascale, anyone who's been paying
attention must realize that it's not easy.
It's really not easy to deploy a cutting edge system.
It's very much a partnership with the chip vendors and the integrators. These systems are too big
to be built in the factory and shipped. They're built for the first time on our floor, and we shake
them out together and we get them working together. The second and third ones are easier, but the first
one is really hard. But everyone uses computing now. Even wet lab biology is embracing AI and big compute, because you have to. There's
just a lot of data and there's a lot that you can tap into that's got predictive power for your
field. So you can't afford to ignore it anymore. And that means capacity
matters a lot. And a lot of agencies have leaned in on cloud to get themselves out of the business
of providing the compute. But the
Department of Energy's challenges are very bleeding edge. And that means bleeding edge
computing as well. And so I don't think we're going to go that route. We're going to continue
to do the hard job of standing it up and making sure that people are able to use it while expanding
capacity as much as we can. We're very much aware of the fact that we have the only exascale
systems in the world, and that there's a lot of people who want to use them. And we're making room where we
can for as many applications as we can. Right now, Frontier has been very stable. And so we are
allocating it the way that we would normally allocate NERSC, which is our capacity system,
our production system for the entire Office of Science. And we schedule that for as
many hours as we can. And normally the leadership systems are expected to have a little more downtime. But that's not what we've been doing for Frontier: it's at over 90% utilization, and we're scheduling it for as many hours as we can get.
And we're getting a lot of good science out of it.
Wow. That's excellent. That's excellent.
Another thing I know you've been involved in, and we talked a little bit in our pre-call,
is the graduate fellowship program.
It'd be great to hear about that and how it has made progress and how people can go about
participating in it, et cetera.
Yes, the Computational Science Graduate Fellowship is very near and dear to my heart.
It was entrusted to me, is how I say it.
I'm the program manager for it,
but it was a trust. And it gets to the question you asked about how hard it can be to use these
high-performance computing systems. It's always been hard. And fortunately, more than 30 years
ago, some very smart people decided that they needed a special program. They needed to grow
their workforce. And so they established this fellowship
and it is now a partnership between ASCR and the National Nuclear Security Administration.
We were also partners on ECP. The fellowship challenges some really bright students who want
to go to graduate school to add extra classes in applied math and computer science and to do
a practicum at one of the national labs where they
partner with working computational scientists at our labs and work on something that's outside
their thesis area, to help them see that challenges in one field can help
a completely different field get through a bottleneck that they didn't realize someone
else had a solution for. So cross-training computational scientists is one of the really valuable things
about that program. And also the community has grown this wonderful cohort of over 500 professional
computational scientists that work in academia and industry and the national labs and government.
And they're very much a community and they learn from each other
and they help each other out.
And it's a wonderful program.
And a lot of the leaders that we count on
in the Department of Energy's national labs
came from the Computational Science Graduate Fellowship.
And so I'm so happy to be entrusted with this fellowship.
And I love going to the annual program review
because it gives me hope
for the future. The applicants and the students that we pick, they're so smart and they're so
hardworking and they've got such wonderful visions for the future that it really just
recharges my batteries every year.
What are the eligibility criteria and the selection process?
It is open to seniors who want to go to graduate school
in any science or engineering field.
And it's also open to first-year graduate students.
It is open to U.S. citizens and legal permanent residents.
And the main criterion is that you recognize the importance
of computational approaches to advancing the science
or engineering that you want to do,
and that you're open to taking on extra coursework and taking on extra practicum experiences to
enable you to do that at the very highest end of advanced computing. We get, I want to say, about 400 to 500 applications each year, and we select about 30 students. We've been growing it. The workforce
challenges have been obvious, and the department has been putting more money into workforce programs,
and the CSGF has also benefited from that. So we added a track in applied math and computer science
to address the needs of advanced computing up to the exascale. So that's applied mathematicians
that want to focus on
algorithms for high performance computing, right? And computer scientists that want to work on
programming languages and compilers for advanced computing. So it's a very specialized fellowship
and it has been very much handcrafted by the Krell Institute over the last 30 years. And the
steering committee does a wonderful job of keeping it at the cutting edge
and keeping it relevant to the department and to the nation's most important challenges. And yeah,
it's like I said, for me, it's a feel good part of my job because the kids always point to where
the future is and they're always enthusiastic about it.
Sure is, sure is. Christine, could I ask about, you had mentioned AI, and this whole HPC-AI thing where one relies on the other. Has that been an issue to manage, a balance to be kept? What are your views on this whole HPC-AI phenomenon?
One view we've heard is that really AI is just melting into everything we do computationally.
It's going to be less of a thing on its own and more of how is it realized within a particular
type of computing.
Anyway, I'm just interested in your thoughts in that area.
Yes, I agree. AI wouldn't be AI without computing. And the fact that the more compute and the more
data you throw at it, the better your results, only causes it to grow more and more. And AI is many things: there's the AI on my phone, there's little AI, but I'm really interested in big AI. I'm really interested in foundation models that a community can share as a resource, because there's too much literature to keep up with and too much other work to keep up with, and I think AI is a way for people to keep up better and in a more informed way. But I'm also seeing it as an accelerator, a new accelerator technology. As the performance gains are getting harder to realize in the hardware, it's algorithms and software, and we've already made a lot of progress there. But these AIs are something new, and they can help in much the way that adaptive mesh refinement helped in a lot of areas of high performance computing applications: not everything that you're simulating is equally important, and you really want to focus your compute power where the details matter. I think AI is a great way to do the other stuff as good enough, the good-enough parts of your simulation, so that you can save the double precision accuracy for the pieces of your simulation that really matter. For fusion, for example, it's the edge that matters the most, so maybe the AI can simulate the whole plasma and you focus the details on the edge.
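As a minimal sketch of that division of labor, hypothetical and not drawn from any particular ECP application, the snippet below applies an expensive double-precision update only where gradients are steep, like a plasma edge, while a crude smoother stands in for a trained AI surrogate over the quiescent bulk.

```python
import numpy as np

# Toy sketch of the "good enough plus high fidelity" split (hypothetical;
# the surrogate here is a crude smoother standing in for a trained AI model).
# The expensive double-precision update runs only where gradients are steep.

def expensive_update(u, dx, dt, nu=0.1):
    """High-fidelity step: explicit diffusion in double precision."""
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    return u + dt * nu * lap

def surrogate_update(u):
    """Stand-in for an AI surrogate: good enough for the quiet regions."""
    return (np.roll(u, -1) + u + np.roll(u, 1)) / 3.0

def hybrid_step(u, dx, dt, threshold=1.0):
    grad = np.abs(np.gradient(u, dx))
    important = grad > threshold                   # where the details matter
    u_next = surrogate_update(u)                   # cheap everywhere
    u_next[important] = expensive_update(u, dx, dt)[important]
    return u_next

x = np.linspace(0.0, 1.0, 1000)
u = np.exp(-(((x - 0.5) / 0.02) ** 2))             # sharp feature, like an edge
dx, dt = x[1] - x[0], 1e-7
for _ in range(100):
    u = hybrid_step(u, dx, dt)
frac = np.mean(np.abs(np.gradient(u, dx)) > 1.0)
print(f"cells needing full fidelity: {frac:.1%}")
```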
And there are lots of simulations like that, where a lot is happening: you could throw your computer at every single atom, but not every single atom is doing something interesting. If the AI can give you a general sense of what the other atoms are doing, then you can focus your compute power on the bits that really are interesting, and that's a way to gain performance on the current hardware. And lots of the teams in the national labs and in the Exascale Computing Project are
already coming up with just amazingly innovative ways of using these AIs to get even more performance
out of Frontier and Aurora and El Capitan. So I see the creativity that's happening. I see it in the
Computational Science Graduate Fellowship, where some of the ways the students want to apply the AI are even more innovative.
And it's exciting because it is a new tool and new tools tend to be game changers. And it's
coming at just the right time, when Moore's law and Dennard scaling are making things hard.
And the AIs might be a way through that.
Yeah, brilliant. By the way, is there an application area with Frontier that has
really impressed you? Some research work being done on Frontier that's really blown you away?
Oh, all of it. I mean, every time I get a new accomplishment from my facility, it's exciting.
There was one recently where they were able to simulate a trillion water atoms.
And that's getting us to the point where we could simulate an entire cell, right?
And if you can simulate a basic cell, it opens up all kinds of experiments that can be done
in silico, with the kind of predictive power that you can't get otherwise.
And that was super exciting for me. Some of the digital twin work
that folks are doing for human health: Amanda Randles has turned her cardiac work and her
blood flow work to look at metastasis, at how a single cancer cell can move through your
system and spread the cancer throughout your body. And it's exciting, all of that. The work that industry is doing to try and get even more energy gains from aircraft
engines, right? And the competition between the two major US companies in that business,
they're doing such amazing work. I mean, they both presented visualizations at that Hill
event that I mentioned, and it was just mind blowing. The
level of detail and the amount of information they can glean, it's huge. It's game changing.
And some of them are engines that are going to be too big to put into a wind
tunnel. So they're not going to be able to put a full-scale engine into the wind tunnel. They have
to use the computers. And there's a lot of areas like that where they have to use the computers.
And it's like NASA's work on the Mars lander.
The atmosphere and the gravity on Mars is nothing like Earth.
So experiments that they do on Earth are an approximation at best, right?
But in the computer, they can simulate how they will actually land people on Mars.
And that's not going to be with a parachute, because of the weight of people and all their stuff; it's going to be these retro jets. And the physics of
that is hard, especially when you're dealing with a different atmosphere and a different
gravity, and trying to control it from Earth or through an AI. It's just exciting.
It makes things seem more possible. If I can change the topic a little bit,
because I was again, looking at your background and I noticed a year
at Oxford. And if you're in the mood to talk about it, I'd love to know how you enjoyed it and how it
came about and what influence it had on your career. It had a huge impact on me as a person
and on my career, actually. It was a program that my college had in my sophomore year.
Their education style is very different, and it is very focused on communication: a lot of reading, but also a lot of writing, and writing in a more succinct way. It was always challenging you to write something where you've made all of your points in fewer pages. And that is an extremely useful skill no matter what you do in your career, but especially in policy and staff positions.
I did do a fellowship as well on the Hill in 2001.
And being able to summarize something complicated and technical as succinctly and clearly as you can is just a really critical skill.
And to do it quickly, too. At Oxford, they made us do it: we had to turn up with an essay every
week that answered a question, and a question was all you got. And you had to come with
an essay with the right bibliography and that succinctly summarized the critical points and
took a position. They required you to take a position on the question. And that's just an
incredibly useful skill that I didn't realize I was going to need, and that I still use today all
the time when I'm writing the budget document or when I'm trying to push a new idea forward.
Being able to summarize what's going on in this complex space of high performance computing
succinctly and to put it in a way that resonates with what
the administration wants to accomplish. I really feel like the roots of that skill came from the
year that I spent at Oxford.
That's excellent.
And I also highly recommend to all young
people that they study abroad, anywhere abroad, just to see a different system and a different
culture and to broaden your horizons.
Well, I always say I learned more
about the United States in my semester abroad than any other time. Because as you say, the contrast
is so interesting. And what you're talking about, succinct summary, succinct encapsulation,
sounds like the journalism classes that I took. But also, it's a real case where less is more,
because if you go on and on with
non-succinct writing, you're going to lose your audience really fast and you lose your impact.
Yeah, I agree. I took a science journalism class at University of Maryland when I did my physics
degree, with Bob Park, who was a delight to work with. And for anyone who can't go to Oxford for a
year, I highly recommend adding a couple of journalism classes to your program of study because it is similar skills.
Like the perfect article two days after your deadline isn't very helpful.
So it forces you to meet deadlines and to be succinct and to be accurate, to make sure you've got all of your facts lined up before you start writing.
Those are useful skills, no matter what job you end up with.
And start with the most important point, whereas many people go in the opposite direction: they
start with the background. Anyway, don't get me started on that.
No, I agree. You have to know what you're selling.
If you don't know what the punchline is, then don't bother writing it.
Yeah. Well, Shaheen, I wanted to ask Christine about her experience in the industry and the role of women.
Please do, yes. You would like to ask her.
I've been very fortunate in that I spent my career in the federal government. And
the federal government is a wonderful place for a woman to have a career, if only because they
absolutely follow all of the federal laws.
You don't have to worry about any of the discrimination or harassment that unfortunately
still exists in some areas. But I also came to it as a scientist who's interested in all of science, and who is interested in the impact that science has on our country, on our national security, and on our economic competitiveness. I started out in economics and I switched to physics. I'm very interested in all of that, in the impact that science has, and in the full breadth of science, and especially in how sciences build off of each other: when very distinct fields converge, it can create a huge breakthrough. And that's one of the things in the fellowship as well: learning from other areas and leaning in to help them solve
their problem, because that could feed back to help you solve your problem. And my daughter,
I only have one, asked me when she was in second grade, what's a good career
for a woman? And apparently I said applied math. I didn't remember saying it. But then when the
weatherman came to her school, she said she wanted to be an applied mathematician. And it caused a
stir. But I genuinely think that's a wonderful field for women. And we have a lot of women in
the national labs. We have a lot of women in ASCR because it's a team effort. High performance
computational science and applied math is a team effort that
builds on communication skills and on wanting to work with others to help others. And it resonates
with me and I see a lot of other women in the field. So I think it resonates with women. And
the national labs are a wonderful place for a career as well. The Computational Science
Graduate Fellowship does
a longitudinal study every five years and interviews all of the former fellows about
where they are and how their career is going, but also their perceptions of things. And a lot of the
feedback that we got was about their experience in the national labs: how everyone was listened to and respected, the teamwork
and the camaraderie, and just valuing every person on the team, whether it's the electrician
who is fixing something at the back of Frontier or the computational scientist who's getting
their work done. The whole team is part of the accomplishment.
And I think that's important for a lot of women. I know
it is for me: it's not about me, it's about my team and what we're doing together.
So I think it's a very good area, unknown to many people, where you can have a huge impact
and you can feel like you're contributing and feel valued and respected.
So I highly recommend folks, especially in school, finding out more about the Department
of Energy's National Labs and about government jobs.
That is really important, good advice.
And also it explains why the program has been so successful, doesn't it?
I like to think so.
And it's certainly been impactful.
I'm with you, really.
Yeah, yeah.
Excellent.
Okay, well, great.
It's been a delight to be in this conversation with you, Christine.
Thank you.
Thank you.
For taking the time and sharing your insights and experiences.
It's really highly valuable.
Thank you.
It's been fun.
And I enjoy your podcast.
So I'm happy to finally be on it.
Thank you so much.
Thanks so much for your time.
Thank you.
That's it for this episode of the @HPCpodcast.
Every episode is featured on InsideHPC.com
and posted on OrionX.net.
Use the comment section or tweet us with any questions
or to propose topics of discussion.
If you like the show, rate and review it on Apple Podcasts
or wherever you listen.
The @HPCpodcast is a
production of OrionX in association with Inside HPC. Thank you for listening.