Behind The Tech with Kevin Scott - Ashley Llorens: Artificial intelligence and robotics
Episode Date: March 30, 2021. Join Kevin and this new distinguished scientist at Microsoft, Ashley Llorens, as they explore the future promise of artificial intelligence, robotics and autonomous systems. Ashley is also a hip-hop artist known as SoulStice. Listen to their discussion about the parallels between careers in science and music.
Transcript
Can we start to set some audacious goals around enabling as many people as possible on the planet to live a long, healthy life, creating an atmosphere of shared prosperity?
And what is the role of AI in doing that? To me, these big societal narratives should be at the top level of abstraction in terms of what we're talking about.
And then everything else is derived from that.
Hi everyone, welcome to Behind the Tech. I'm your host Kevin Scott, Chief Technology Officer for Microsoft. In this podcast we're going to get behind the tech. We'll talk with some of the
people who made our modern tech world possible and understand what motivated them to create what they did. So join me to maybe learn a little bit about the history of computing
and get a few behind-the-scenes insights into what's happening today. Stick around.
Hello and welcome to Behind the Tech. I'm Christina Warren, Senior Cloud Advocate at
Microsoft. And I'm Kevin Scott.
Today on the show, we're joined by Ashley Llorens. Ashley is a scientist and engineer
with a 20-year career in research and development of AI technologies at Johns Hopkins Applied
Physics Laboratory, and he recently joined Microsoft Research. Ashley is also a hip-hop
artist known as SoulStice, and one of his songs was actually featured in the Oscar-nominated film The Blind Side.
So I know there are a lot of theories out there about why so many scientists are awesome musicians, and there's this whole part about music being mathematical and scientists being good at recognizing the rules of music composition, but I'm curious what you think, Kevin.
Well, you know, I think it's one of those mysterious things because you are absolutely right. There are a lot of programmers and computer scientists who seem to have serious interests in
music. But I don't know many of them who are so serious about their music that they have a real recording career.
And I think that is one of the things that makes, one of many things that makes Ashley special.
No, without a doubt.
And I can't wait to hear you both talk about his various areas of expertise and kind of these dueling careers.
So let's chat with Ashley.
Our guest today is Ashley Llorens. Ashley is a scientist, engineer, and hip-hop artist.
He worked for two decades at the Johns Hopkins Applied Physics Laboratory, developing novel AI technologies, and served as founding chief of the lab's Intelligent Systems Center.
He was recently nominated by the White House Office of Science and Technology Policy to serve as an AI expert for the Global Partnership on AI.
Besides his career in engineering, Ashley actually began his career as a hip-hop artist and serves as a voting member of the Recording Academy for the Grammy Awards.
About a month ago, Ashley joined Microsoft as a vice president, distinguished scientist, and managing director for Microsoft Research.
Welcome to the show, Ashley, and to Microsoft.
Thanks so much, Kevin. Great to be here.
So, so awesome to have you here at Microsoft. So we always start these podcasts by talking a little bit about where you got started.
So I know you grew up in Chicago.
How did you get interested in, like, I guess, either music or technology?
Yeah, yeah. So maybe we'll go in order,
and I can just kind of create two contrasting scenes for you.
So, you know, my interest in music, you know,
growing up in Chicago, south side of Chicago, south suburbs,
just really immersed in music, you know, throughout my childhood.
And hip-hop in particular was always fascinating to me.
Just the degree of storytelling, the beats, the sounds, growing up in the kind of East
Coast vibes, listening to artists like Nas, West Coast artists as well.
And it grew to be something that, you know,
I went from being a fan of to something I really wanted to contribute to, especially as I was sort
of coming of age and wanting to express myself. So, you know, the challenge with the, you know,
where we grew up and everything was we didn't really have many outlets for that energy at first, you know, and so we did what we could.
You know, just to kind of put a visual in your head. We would go to, you know, like the, what
was it, Circuit City at the time or what have you. We had like a boom box with two tape decks,
and that was the recording studio. And we were so proud of ourselves because we figured out
how to do multi-layer vocals with
a $20 mic and the two tape decks. You know, you record on the one tape and then you play it back
and record over yourself, you know, over some instrumental, and you get the multi-track vocal. So that
was like our, you know, $75 studio setup. And it just kind of grew from
there. And, you know, the contrasting scene, of course, we can come back to like the career trajectories and things.
But the contrasting picture that I'd love to paint is just around, you know, kind of the intellectual curiosity that was really a family value for us.
And, you know, really, both of my parents were teachers.
My dad introduced me to two things early in life that really shaped my curiosity, one being theoretical physics. You know, I was quite young reading books by like Michio Kaku and not really understanding what string theory was, but really just being fascinated by it. And then Marvel Comics was the other thing. And, you know, just really, you know,
the Infinity Gauntlet, something that's been a topic of conversation within our family for a few
decades. So it was great to see it on the screen. And so, you know, I just was really driven by a
curiosity of understanding how the world works. Again, not as much of an outlet for that curiosity. And that
brings me to a story about outlets. You know, I actually ran one of my first electrical engineering
experiments by peeling the paper off of a twist tie and then sticking it into an electrical outlet
just to see what would happen, you know. So really you had these two scenes kind of unfolding.
And I'd say sharing some of the same fundamental motivations, you know, driven by curiosity,
a passion for understanding people and technology, which really grew into, you know, a passion for
having an impact on the world in these two different ways. Yeah. It's so interesting that so many of the people that we
chat with who have these large creative appetites,
they're creative across a whole bunch of different dimensions,
have that grounded in curiosity.
Just this sort of
voracious curiosity about how the world works and why things
are the way that they are.
I mean, it's really funny, this electrical outlet story.
I did exactly the same thing when I was four years old.
I don't think it was a twist tie, though.
And I forget what I jabbed in it, but I remember my mother screaming when there was this loud crack and her child was crying.
And that sort of fearlessness that accompanies the curiosity, I think, is – curiosity strikes me as something that you can certainly cultivate, but it's hard to create
when it's not there. Whereas I do think fearlessness, you can sort of work really
hard to become more fearless. And part of how fearless you can be is your environment.
Yeah, it's really interesting. And I'm always careful not to take too much credit for a lot of just the
things that come naturally. I feel really fortunate, you know, to have these kind of
fundamental drives and everything. I think as children, you know, we're sort of naturally
driven by our curiosity to explore.
And the outlet story is a great kind of illustration of that.
I think as adults, a lot of times, if you're not careful, the world will cause you to unlearn that. So part of what I've learned is to just allow myself to be driven by my curiosity, to behave fearlessly in that way, to ask questions, and not be afraid to fail. And I think part of what I'm grateful for is just that I've managed not to unlearn those things as an adult, to kind of just behave childishly in that respect at this point
in my life. And so was your dad a physics professor? He was a math teacher, just with
a passion for math and science. But he was a high school math teacher and a track coach.
That's awesome. So, you know, I'm just sort of curious, what happened when you stuck that in there and got that little mini explosion and the smoke?
An overwhelming sense of guilt that I kind of carried.
And, you know, I don't know if they're listening to this podcast at some point, they may be surprised to learn that this was an experiment that I conducted.
That's awesome. Yeah, I mean, it's really, you know,
we were talking about this on the last episode of the podcast
with Jacob Collier, who's also a musician
who uses technology in interesting ways.
And, you know, one of the things that we were talking about
is how you preserve the fearlessness when people are making themselves vulnerable by exploring. Do you need a set of colleagues or peers or your company or your institution that helps you,
you know, that doesn't give you, you know, just sort of uncritical positive feedback. Like that's
almost as bad as, you know, just overwhelmingly negative feedback, but they can figure out how
to strike that line between, you know, sort of saying, wow, this is awesome that you're exploring this curiosity.
But, you know, like here's some things that can be better.
But like, man, you should be really proud of yourself for pushing in this way.
Like, is that something that you got as a child or like something that you had to learn over time or.
It's a great question.
I think a lot about that as a parent, you know, of my own children, but also in terms of leadership in a science and technology organization.
But yeah, as a kid, I mentioned this kind of intellectual curiosity as a family value. I never really got the sense that I was asking too many questions or that stupid questions were a bad thing.
I thought my parents did a really good job of creating that kind of environment for us.
I try to do the same for my kids. You know, I just get really used to saying, that's a great question, you know, and just encouraging the asking of questions and entertaining the questions.
And it's hard because, you know, there is such a thing as boundaries that you have to try to enforce as well.
There is such a thing as, you know, what I say is going to go now, even though I've entertained, you know, your thoughts and things.
So, you know, it's striking the right balance there.
But, you know, just like, you know, I try to do for my kids, I try to create that environment in a professional setting as well,
always leading with, man, that's a fantastic question. Let's take some time to explore that.
Let's make sure we hear everybody's conflicting viewpoints before we go forward. But I'd say, in a
similar way, you do have to kind of set a direction and go eventually. So it's that, you know, striking
that balance. Yeah. Well, and it's sort of like the two stages, I think, of creativity in a group,
like making sure that you hear
everyone's voice and everyone feels free to
be bold in their thinking even when they're vulnerable.
Super important, but then you have to make a decision.
We called it at LinkedIn, disagree and commit.
It's perfectly fine to disagree, but at some point
you have to make a choice about what you're going to do, and then everybody needs to commit
to doing that.
I like that.
You have all of this stuff that you are
curious about as a kid. What did your school look like?
Did you have a strong music program, strong science program? Like who is helping you explore
these things that you're super interested in? Yeah, it's interesting. I try not to get too
much into like local politics, but the way that school systems, school districts work in Illinois, where I'm from, is that one set of schools is supported by the tax base around it.
And it's not necessarily like a shared set of resources across schools.
So you have big inequities, you know, in terms of the level of resources. And so
I went to a school on the lower end of that spectrum, just from a resourcing standpoint.
So that was always a challenge. Like as I was leaving the school, it's better now,
but as I was leaving the school, it was in the process of like losing its sports programs,
you know, to debts and things like that. So that was a tough environment. I would say I was
still really fortunate to have some great role models. My math teacher, Mr. Amaro, I'm going to
go ahead and call out the name, was really just a great champion and would push us hard. You know,
he was this gentleman of Cuban descent that was, you know, from Florida and
spoke with an accent, but was always positive. He would wake up, he would get to school at like
6:30 in the morning and would expect you to be there at that time, you know, if you were
participating on his math team, which I did. So, you know, there was my Spanish teacher and the
principal that, you know, I got a chance to work with at student
government. So, you know, I would say maybe even in an environment that was sort of resource
constrained like that, I think it's possible to have, you know, some fantastic role models. So,
I feel fortunate there. And so, when you were thinking about going to college and in college,
like, how did you choose what to do since you
had so many things you were interested in? Yeah, yeah. You know, it's funny because
when I think back to high school and this, because you're not really choosing a career
at that point, you're choosing a major. And so you find yourself in the guidance counselor's office
and it's like, hey, if you're smart, you do math or science.
You know, I'm not sure that that's the right answer necessarily, but that's definitely
a prevailing wind, I felt. So it felt somehow more pragmatic to me to pursue
that aspect just from a career standpoint. So I wound up, and it also jibed with a lot of the kind of
intellectual curiosity that I mentioned, but I sort of have always stubbornly refused to give
up the creative aspects as well. And so I took narrative writing courses and, you know, a lot
of courses on the creativity side too. I probably would have had a minor in those things
if the engineering school had allowed it. And I'd say that kind of duality tracked me, you know,
into my career. So if you fast forward a bit to 2003, I was kind of graduating from graduate school
and starting a record label at the same time. And my plan was to just move into science and technology
long enough to fund a record label
and then leave and go do music full time.
Because even having advanced to that point,
you know, after having gone through undergrad
and kind of gotten my master's and all,
I still didn't really believe that a career in engineering
was for me. Like I didn't have very many role models that were professional engineers and
scientists. So I wasn't really sure if it was a place that I could kind of be myself, you know,
be intellectually curious and be entrepreneurial and kind of have an impact on my own terms. Those things were very important to me.
So I kind of set off.
And what I found was, first of all, the opportunity to do both.
So I kind of hung on to those things for as long as I could.
For a good 10 years, I was doing both things.
And on the music side, just figuring out how to press records, vinyl records, and how to get my records into
places like clubs in Tokyo, and then how to get myself into places like clubs in Tokyo to, you
know, to do performances. And it kind of took me around the world and many adventures. But, you
know, on a parallel path, I was really figuring out how to chart my own course within science and technology,
how to be myself in doing those things, and really discovering a kind of really cool career path
right at the intersection of science and engineering, and having opportunities to be a
principal investigator, even as a fairly young scientist, you know, for the Office of Naval Research and other sponsors.
So publishing, you know, so if you think science, you know, engineering and music,
so publishing papers and academic conferences, going to, flying overseas to do those conferences,
but then leaving the poster session to go do a show at the club,
you know, presenting research in Prague and then doing a show in Prague, you know, and then eventually, you know, being able to turn a lot of those scientific advancements
into real world technologies for the Navy and other sponsors. So just a tremendous set of
adventures as I kind of reflect on it. And what did you major in in college?
Yeah.
So my undergrad was computer engineering.
ECE is kind of a joint college anyway, but my undergrad was computer engineering.
And then my master's research focused on electrical engineering, digital signal processing. And then when I entered the professional environment,
I immediately discovered machine learning,
which eventually led me to artificial intelligence.
So it's interesting how, you know, 20 years,
20 some odd years ago, you could go through a whole degree
without ever bumping into machine learning.
I didn't discover machine learning until I got into my professional environment, but then I was kind of immediately hooked.
And early in my career, I wouldn't have dared say I was into artificial intelligence. That's
not what you said. You definitely didn't talk about neural networks at all if you wanted to
get your research funded. But as you move from machine learning for human systems, you know, kind of as decision support tools for humans, to more robotics and autonomous systems, it naturally leads you to grander thoughts about artificial intelligence.
You're creating an agent that's going to go out in the world and perceive and understand its environment and act on the basis of its own perception.
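To make that perceive-decide-act loop concrete, here is a minimal sketch of an agent of the kind Ashley describes. It is purely illustrative, assuming Python and a toy one-dimensional world; none of the names here come from any real robotics framework mentioned in the conversation.

import random
from dataclasses import dataclass

@dataclass
class Observation:
    distance_to_goal: float  # what the agent can sense about the world

class ToyOcean:
    """A toy one-dimensional 'ocean': the agent starts some distance from a goal."""
    def __init__(self, start: float = 10.0):
        self.position = start
    def sense(self) -> Observation:
        # Real sensors are noisy, so add a little noise to the reading.
        return Observation(self.position + random.gauss(0.0, 0.2))
    def act(self, step: float) -> None:
        self.position -= step  # moving toward the goal shrinks the distance

class Agent:
    """Perceive, decide, act -- repeated until the goal is (believed to be) reached."""
    def decide(self, obs: Observation) -> float:
        # Simple policy: take a big step when far away, a small one when close.
        return 1.0 if obs.distance_to_goal > 2.0 else 0.25

env, agent = ToyOcean(), Agent()
for t in range(50):
    obs = env.sense()                 # perceive the environment
    if obs.distance_to_goal <= 0.3:   # goal reached, as far as the agent can tell
        print(f"reached goal after {t} steps")
        break
    env.act(agent.decide(obs))        # act on the basis of its own perception

In the open ocean the sensing is far noisier and the decision-making far richer, but the loop itself has the same shape.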
And in my case, you know, I spend a lot of time developing technologies for underwater robots,
autonomous underwater systems. And so, you know, this is not a clean laboratory setting. This is the ocean. So you're creating something that has to go out there and fend for itself in an open
environment. And it naturally leads you to
grander questions about more robust and generalizable forms of intelligence, which
it's been amazing to kind of have an opportunity to think more and more about as I advance in my
career. Yeah, you and I probably were in school at the same time.
I think I'm a little bit older than you are,
but we were in school in the 90s, I'm guessing.
Yeah, you're totally right.
When I was working on my PhD,
I dropped out, much to my advisor's chagrin.
You are absolutely right.
You just didn't mention neural networks if you wanted to
graduate and get your papers published. So it's really amazing to see how much has changed just
over the past 20 years and even accelerating over the past 10. So I do want to go back for a minute and sort of chat about this interplay that must have existed between your professional career and your music career.
Did they benefit each other?
Yeah, that's a great question.
And it was an interesting journey to kind of discover the interplay.
I would have said something like not so much at the very beginning of that journey,
but I think the intersections kind of have presented themselves to me over time
and have been really satisfying to see.
On the music side, you know, especially as an independent artist with a very low budget, you had to really be convincing to convince people to work with you for little or no money, or at least up front.
And so you had to develop the skill of creating a pitch for yourself, a story that people would sort of buy into, a vision for
where you were going.
And then as the executive producer of an album, you're a project manager.
It needs a budget, it needs milestones, you're going to have roles and responsibilities if
you're ever going to get anything out into the world.
And so it's interesting how aspects of those things obviously transcend both. And then from a
technology standpoint, you know, if you think about audio recording and, you know, audio
engineering, it's engineering, you know, it's frequency selective signal processing, it's
filters, it's gains and amplification and all kinds of things. And so
I definitely think from a technology standpoint, from a project management standpoint,
and I'd add another dimension too, just communications and storytelling really
transcend both. And so over time, again, you know, a lot of these intersections have sort of
presented themselves. And I realized even even without knowing it at the time,
that each was benefiting from the other.
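As a concrete illustration of the audio-engineering point above, the filters, gains, and frequency-selective signal processing, here is a minimal sketch of a gain stage plus a low-pass filter. It assumes Python with NumPy and SciPy and uses a synthetic test tone standing in for a recorded vocal; it is not drawn from any particular setup described in the conversation.

import numpy as np
from scipy.signal import butter, lfilter

SAMPLE_RATE = 44_100  # samples per second (CD-quality audio)

def apply_gain(signal: np.ndarray, gain_db: float) -> np.ndarray:
    """Amplify or attenuate a signal by a gain expressed in decibels."""
    return signal * (10.0 ** (gain_db / 20.0))

def low_pass(signal: np.ndarray, cutoff_hz: float, order: int = 4) -> np.ndarray:
    """Attenuate frequencies above cutoff_hz with a Butterworth filter."""
    b, a = butter(order, cutoff_hz / (SAMPLE_RATE / 2), btype="low")
    return lfilter(b, a, signal)

# A one-second test tone: 440 Hz plus some 8 kHz "hiss" to filter out.
t = np.linspace(0.0, 1.0, SAMPLE_RATE, endpoint=False)
track = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 8000 * t)

processed = apply_gain(low_pass(track, cutoff_hz=2000.0), gain_db=-3.0)
print(processed.shape)  # (44100,): same length, high end rolled off, 3 dB quieter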
One of the things that's been fun about my music career
is sort of figuring out how things happen and how things get done.
So at one point, it was just a curiosity,
like, how do I get things into music and TV?
Just another challenge, right?
So what I discovered is that there's these agents, right?
And they bring opportunities. And
so what you have is you have movie editors, video editors, a lot of times they put these,
they get to the end of their production, they have these reference songs in there.
And so, you know, at some point, in The Blind Side, they got to the end and there was a scene
where they had 50 Cent's "In Da Club" and they're like, well, we ain't paying to license this song.
So then they put out the call to all the agents.
So my agent comes to me and they said,
we need a replacement for 50 Cent's "In Da Club."
So I call my guy in Ohio and I'm like,
okay, let's put something together.
So I made two songs for them.
One was that thing.
And it's not really my voice.
It's me replacing that song.
It's my physical voice, but not my voice as an artist. But it's still fun. And that one wound up in The Blind Side. The second one I made for that spot to replace the 50 Cent song was one that actually wound up
in 50 Cent's show Power a few years later. So it was the two things I made to replace 50 Cent in the
reference soundtrack.
That sounds right.
Yeah. So it was kind of a cool story, but no, I didn't meet 50 Cent or anything like that.
It's fascinating that you said that. Like, one of the things I try to explain to people is that I
love to learn about stuff. And the process of learning for me is almost better than the end product that I end up making.
And so that whole idea about figuring out how all of this craziness works behind the scenes,
that sounds so appealing, more appealing than the song itself.
That's exactly right.
That's the fun of it for sure.
It's sort of interesting. Do you ever
feel that it's easier for
people to
engage with and maybe even have
more curiosity about the
music work that you've done
versus the
machine learning work that you're doing?
Given that
everybody sort of
understands music, right? Like, it's just something innate in us that like, we are musical, we
appreciate music, you know, to the degree of like being a fan of musical art forms and like fans of
the people who create it. But like, you don't really have that with technology in the same way.
Yeah, it's interesting.
So this was actually something I really worried about at the beginning of my career.
And I have to confess, I actively tried to keep my music and my work separate.
And I was not very forthcoming about the music side of what I was doing at work.
Because I wanted to be taken seriously.
You know, as a technologist, I was already someone who was from an underrepresented demographic,
you know, from that standpoint, you know, being a Black person, a Black male in, you
know, science and technology.
So to add that dimension too, I was like, okay, let me just focus on, you know, the science and the tech when I'm when I'm at work.
But but even that over time, you know, I grew to realize that I could kind of bring my whole self and present both sides.
But it was it was easier as I had a track record of scientific and technological accomplishments to back it up.
It was hard when I was a blank slate coming out of college.
So, you know, for better or worse, I'm not sure what the right and wrong of that was. But I did wind up,
for example, if you go on YouTube, you'll find a clip of me doing a hip hop duo with the director
of the Applied Physics Laboratory at one of the all hands addresses. And so that was like the big
coming out party at work for the hip hop side. You know,
on the technology side, though, 20 years ago or, you know, 15 years ago, you say you're doing
machine learning. People are like, oh, what's that? That sounds curious. Machines learn? I never
heard of that. Now it's a little different. You know, people know what AI is. When you say you
do AI machine learning, they're like,
oh, tell me more. I hear about this stuff all the time. And that plus the fact that technology
is forming such a profound substrate of our whole human experience now. I think people are
just more naturally engaged and curious about technology because it's such a part of our lives.
And so, whereas it started more so, you know, the kind of arts and music was something that
people relate to, now I don't necessarily see a difference, you know, in terms of the level to
which people are engaged in these two things. So, you know, I think this whole notion of
narrative, the thing that you were talking about a minute ago,
was really important.
You learned in your musical career that you are telling stories,
and you understood as a young professional that you had to tell a story about yourself.
I think with technologies like machine learning, narrative is also really important because they're complicated.
And if you go all the way to the bottom, just in terms of how they're implemented, the complexity of these things is very high. The narrative is really important because it's such an important technology, and it is having such a profound impact on what the future is looking like every day as it unfolds,
that people need to be able to understand how to engage with it, to sort of ask, what do I think about this technology?
What do I think about policy about this technology?
What do I think about, you know,
like my hopes and my fears
for the future of this technology?
So how, you know, have you thought much about like,
you know, the story of AI?
Yeah, absolutely.
And maybe there's a couple of sides,
but there's many sides,
but maybe two sides I'll pick to explore there.
One is absolutely the idea that AI is taking us in a bold new direction as a society.
And I think it's more important than ever that we can engage around these policy questions and really around the directions of AI, definitely outside of computer science and across disciplines.
And so we do need to create narratives.
Even more than that, I think we need to create directions that we agree on, that we want to take this technology.
A lot of times I think people are discussing AI as something separate from human beings and human intelligence.
And I think we need to be thinking of these two things as complementary.
So what are our goals for these things? Can we start to set some audacious goals
around enabling as many people as possible on the planet to live a long, healthy life,
creating an atmosphere of shared prosperity? And what is the role of AI in doing that? To me,
these big societal narratives should be at the top level of abstraction in terms of what we're
talking about. And then everything else is derived from that. I think if we're going to just
let a thousand flowers bloom and see where we land on this thing, I think we could wind up with some
really unintended consequences from that. Yeah, I really, really agree. And I think,
too, if you have the wrong narrative, you could have unintended consequences as well.
Like one of the things that I have been telling people over and over again over the past handful of years, just sort of a useful device for thinking about the future of AI, is that AI, and especially its embodiment in machine learning, is a tool.
Just like any other tool that people have invented, it's a human-made thing.
Humans use it to accomplish a whole wide variety of tasks.
The tool is going to be as good or as bad as the uses to which we put it. And, you know, it's just very, very important, I think,
for us to like have a set of hopeful things
that we're thinking about for, you know,
those uses of AI as, you know, we have our anxieties.
And both are important.
Like you have to, you know,
it will certainly be used for bad things.
But, you know, as with any technology, the hope is that there will be orders of magnitude more positive things and good things that people will attempt. And part of what tips that balance of good versus bad is the stories that we're telling ourselves right now about what it's capable of and what to be wary about.
I think that's right on point.
And, you know, we can even ask ourselves, you know, what does it mean to behave intelligently as a species?
I actually think we're getting to the point where
we can start asking ourselves and holding ourselves to, you know, to some standard there.
You know, if you just think about artificial intelligence at a low level, you know, from an
agent standpoint, you know, I think intelligence itself is the ability to achieve goals, to set
and achieve goals. And then what do you have to do? You have to be able to have some understanding
of the world around you, you know, through some mechanisms of perception, whether that's kind of our human
modalities or other kinds of modalities. You have to decide on a course of action, you know, that
best achieves your goals, and then you have to carry it out. Like, these are the things you do
to be intelligent. So when you extrapolate that to us as a species, because one of the hallmarks
of human intelligence is our social intelligence, our ability to, you know, collectively set and
pursue goals and things like that. So I think, and as you can see, I'm sort of
cursed now to see everything through the lens of intelligence and, you know, artificial intelligence.
This is just my lens on the world.
But I think it's helpful. I think it's useful. I think in order to behave intelligently as a
species, we have to do some of these things that you're talking about, setting some bold
visions and directions and figuring out how to organize around those. Yeah. When I was a kid,
the thing that really inspired me, I think, to become a scientist were the stories that I was reading.
And, you know, I grew up in the 70s and 80s, and you had a whole mix of things.
Like you had a bunch of science fiction, or I read a bunch of science fiction that was sort of techno-utopian. You know, it was these future worlds that had a bunch of technology,
some of which now exist, some of which will probably never exist.
And, you know, sort of people living these crazy, interesting lives,
you know, like full of drama in these futuristic worlds.
And then you had some dystopian things as well.
You know, I always sort of think about
like these two different portrayals of AI.
There's Commander Data from Star Trek, The Next Generation,
and then there's the Terminator,
you know, from the Terminator movies.
You know, the latter is sort of the, you know,
the anxious depiction of AI.
Like what happens if you build machines that,
you know, get out of control.
And the Commander Data version of AI, I think, is really, really interesting
because I don't know whether you've watched the Star Trek Picard series,
which is the recent thing with Patrick Stewart. And Data, again, played this very, very important role
in the story that they were telling.
And the interesting thing about Data is,
even though he was an android, he was an artificial intelligence,
he was always the thing that the writers in Star Trek used
to shine a spotlight on humanity.
So he, in some sense,
was the most human character in the show.
And they sort of used his artificiality
as this plot device to sort of explore
what our human nature is.
And I think that sort of gets to
what you were just talking about.
I think AI may tell us an awful lot about who we are.
Yeah, I think that's, I think that's right. And I want to seize on the theme of dystopian thinking,
because you and I were, you know, talking previously about kind of the utopian thinking,
you know, setting goals. And I do think it's even,
you know, increasingly important to do that dystopian thinking, to do it in a way that's
constructive, you know, coming from kind of the federal science and technology space and advising,
you know, within the Department of Defense and those kinds of circles, it's easy to get focused on really kind of hyperbolic types of
concerns, like the Terminator and Skynet and all those things. So I'd like us to do more dystopian
thinking, but I do like us to also ask the right questions, to take that mirror, you know, and kind
of reflect like you were saying with Commander
Data. And if you think about the mirror, there's a series on Netflix that I love called Black Mirror.
And so this is the reflection. I think these are the kinds of concerns we should really be
thinking seriously about. Ways in which people will use technologies, you know, in a way that
hurts people, you know, whether through positive intent or negative intent.
I love the kind of Twilight Zone-esque ways
in which the best intentions
kind of turn into unintended consequences
through things like the Killer Bees episode
as kind of a cautionary tale on autonomous systems
or the episode where the brain-computer interface
that the gentleman is using kind of, you know,
imprisons him in his own mind.
I think these are, those particular questions
may not be the right ones, but that way of thinking,
you know, the dystopian thinking,
but really asking the right questions,
I think is important for us as we move forward.
Yeah. It's an interesting ethical dilemma,
I think, thinking about where the line is between
inaction and safeguarded action.
The safest thing to do in life is to sort of stay at home and not come into contact with anyone else.
And you can sort of surround yourself in this bubble of safety
where you really can't do much.
And you can look at almost any substantial technology that we've
ever built. And like, if you let your imagination run wild, you are going to be able to imagine
like a huge number of bad things that you can do with the technology. And if you let that
imagination paralyze you and convince you not to build a technology at all, not
to leave your house at the beginning of the day to engage with the rest of humankind,
you miss out on these incredible things that help us become more human, I think, and then
solve really important problems like,
you know, that help us be healthier and live longer and like have, you know, much higher
quality of life and, you know, that supports a larger population on the planet, you know,
like, and if you just stripped away even the past 50 years' worth of
technological development, like the world would be in a terrific amount of trouble,
which is something we often forget. So it's finding that framework that you use to decide how you
balance positive action versus inaction and not get paralyzed in one way or the other.
I think it's right. I think you
articulated the trade space there very well.
I don't know the right answer.
Neither do I.
This is a trade space to be aware of and always conscious of.
I also think we need the right dose of
humility about our understanding of the world and ourselves.
We still have so much to learn, even about the ways that our own bodies and minds work,
much less the very rich, interconnected ecological systems that we live inside of.
I was actually moderating a panel at the National Academy of Sciences about science communication and the communication around uncertainty in science.
And, you know, the discussion, I'm not an expert in this area, but the discussion was about
genetic manipulation and doing things like putting genetic manipulations out into populations of,
say, insects or something to cause some
population level change. And the idea of making a change like that in an ecological system that's
so interconnected, you know, and the repercussions and the unintended consequences that could happen,
I'm not saying it's necessarily the right or wrong thing to do, but certainly I think we
need to approach it with the right dose of humility and to be humble as we explore that trade space that you just laid out there.
Yeah, look, I think that notion of humility is just that we need to make sure that we're not drinking our own Kool-Aid, so to speak, that we're not overconfident about what it is we think all of our technology and all of our science is telling us. One of the things that I've tried to push back on a lot over the past 12 months is
folks who are trying to take all of their incredible IQ and all of their incredible
energy and apply it to helping with the pandemic.
And it's this horrible thing.
And we've got this tremendous sense of urgency.
And even though we have that urgency,
I've tried really hard to help us or to encourage people to not throw away
our scientific process. So like the scientific process is like so
valuable because its only allegiance is to truth. And like that process of finding the truth is
like incredibly messy. Um, and like, even when you think you found truth, like you probably haven't.
And, you know, just sort of constantly reassessing like what you think you believe and like how you arrived at those beliefs is integral to the way science works.
And it's sort of unforgiving, right? Like if you insist dogmatically that this thing is true and you
just haven't used all of this apparatus of scientific rigor that we built up over the
centuries, things will spectacularly fail. So anyway, I think we often believe that we understand more than we do.
And we certainly shouldn't be sort of insisting that we've found truths we really haven't proved out.
Yeah, yeah.
It's so important.
And maybe this is a great segue into aspects around diversity and inclusion, because I do think that can really be a superpower for us if we leverage it.
Because like you said, it's so messy, you know, getting from our subjective observations, even of experimental data, much less, you know, a world of
experience. And everyone, even when you try to be unbiased, you bring all that inherent worldview
into everything that you do. And in some of these cases, I think our only hope is to integrate over
enough of those conflicting worldviews to try to triangulate on what is objectively true,
or at least the closest we can get there.
And that's, you know, there's so many aspects of diversity to really be focused on,
you know, demographic diversity, et cetera. But you definitely need people that think differently, that have a different lived experience, et cetera. How do we create pipelines of those
folks into these fields? How do we create safe spaces for people to
think differently and pursue their careers differently? There's just so many aspects.
But I think if we get it wrong, we're dooming ourselves, I think, to the kinds of groupthink
that happen when you have a lot of folks that, when you're not integrating, so to speak, over
a lot of these differing worldviews.
Yeah, I could not agree with you more.
I think the things that I've seen be most successful over a couple of decades as a professional in technology have been places where you have exactly what you said,
that diversity of experiences, that diversity
of perspectives, that diversity of backgrounds, and a real diversity of thinking. Who knows
how many wonderful things you can dream up or how many things you can inspire in others because you are both a musician and an engineer.
And having more of that, I think, is just so valuable.
There are infinite wonders in the world that we will never discover,
which is maybe one of those optimistic slash depressing things to say.
And we won't even get to the interesting part of that infinity unless everybody is
bringing the best that they have to figuring out what the future is.
Yeah, I think that's right. And, you know, even as our technology is advanced,
et cetera, I think the kinds of global challenges that we have to step up to are getting greater
and greater. And if you take this, you know, this simulation that we're all kind of involved in,
and you fast forward enough years, the statistically improbable things that
we sort of are afraid of, they come to fruition. And the pandemic is an illustration. So I do think
we have to figure this out and get to the point where we're sort of activating our collective
intelligence in the best possible way. Yeah. And I think that if you think about some of these things
like preventing the next pandemic or making the impact of the next pandemic
less than the ones that came before it or climate change
or dealing with the demographic inversions
that are happening in the industrialized world.
Population growth is going to slow down in the US and Europe and China, Japan, and Korea.
It will start to slow down in India, and then it's going to start booming in Africa. And so, you know, the thing that we know is that, you know,
population growth is the thing that, you know, implies like the growth of ingenuity and creativity
and whatnot. And so, you know, like how you can start investing now in Africa so that as that
population explodes over the coming decades, like all of those energetic
young people are going to be equipped to help all of us old folks figure out
how to solve some of the big, big challenges that we've got facing us. So yeah, I think that's
absolutely right. And, you know, even again, I'm not an expert in everything that I'm curious about, but it also seems to me that a lot of our economic systems are sort of predicated on this notion of an expanding population.
Yep.
You know, as population trends start to invert, like you're saying, we need to do some serious thinking about how our economic markets and things allocate resources and all that.
So there's some really substantive and fundamental things I think we're going to need to be rethinking here. You've mentioned a couple of times actually, like one of the things that we should with
a tremendous amount of urgency be focusing on is like how you can use
machine learning to help with healthcare and aging.
I read this article last year in the New York Times about
the elder care crisis in Maine, where there just aren't enough people
to help take care of the aging population there.
And so it's not a wage thing or anything.
For no amount of money
can you hire enough people to take care of all of the elderly.
And this is going to play out everywhere soon.
I think in Japan, they already are seriously thinking about the
technology that has to be built to help make sure
that you can have the elderly living a
dignified life in their later years
and you still are able to pick up the slack in productivity in the population
because you have fewer workers than you've ever had before.
And you preserve the ability of the workers that you do have to be able to do their jobs
and do the creative things that they're doing
to build the future of those societies
without being completely consumed
with taking care of their parents and older relatives.
And like the only thing that sorts that out is technology.
Otherwise you'd just have a collapse in like the quality of life
because there just aren't enough people
to do all of the work.
Yeah, yeah, absolutely.
And then there, I think we have a really amazing opportunity in automation and autonomous systems where AI and robotics meet,
you know, the ability to ingest data and do analysis, make decisions, but also to carry those decisions out,
whether they're in, you know, we have a lot of automation emerging
in factory settings and controlled environments.
But, you know, I think a lot of the challenge,
you know, comes in these more open environments,
these less structured environments,
environments with people,
and really making robotics and automation work
in those kinds of settings.
And I just love to see more and more people,
more and more of our intellectual capacity thinking about some of those challenges in a human-centered way.
Yeah, I could not agree more. So we're almost up on time here. And so I think the last thing I
like to ask everyone is what you do for fun or in your spare time. It's sort of a weird
question because I think almost everyone I chat with thinks that their work
is fun and they don't have much spare time, but I ask the question anyway.
Well, I'm definitely one of those people
that thinks my work is fun, but I love being a father. I love hanging out with my kids. And, you know, in pandemic times, it's been all about video games, and virtual social experiences through video game platforms have been just the thing that, you know, has kept the social fabric together, you know, among our kids and their friends circles and everything. So I'm someone who grew up on video games and, you know, had a Nintendo, you know, and an Atari and all
those things. So I love games. And then just kind of getting into whatever my kids are into,
you know, in general, which right now is these kind of virtual games, which I don't,
I'll be honest with you that those kinds of platforms I'm not so crazy about, but I love
spending time with my kids.
So I kind of just, again, make a point to kind of get into whatever they're into.
So what are your kids' favorites?
Oh, man, they're into Roblox right now.
Yeah. You know, and Adopt Me is like a game in Roblox and Piggy.
I don't know if you've seen this, you know, so it's those things.
And it's like, okay, well, I guess I have to get into these things to spend time with them sometimes.
Like, I can't come into their room, I have to come into the virtual server to see them. But I
sometimes try to pull them back towards more console-oriented games too, because that's kind
of my sweet spot. Yeah, it's really fascinating. Like, I've got a 10-year-old and a 12-year-old,
and the 10-year-old is like a legit Roblox tycoon.
You know, he's learning all sorts of stuff about economics,
and, you know, like, he has this facility with virtual worlds
that will go with him the rest of his life.
It's really fascinating to hear that your kids are into that too.
It's an interesting shift.
Yeah, and a sign of the times for sure.
Well, thank you so much,
Ashley, for taking time to chat with us today.
I'm so glad that you're here at Microsoft and just can't wait to be able to work more with you over the coming years.
Likewise, Kevin. I really appreciate the opportunity. I had a lot of fun today.
Awesome.
Well, that was Kevin's conversation with Ashley Llorens.
And I was so fascinated by everything you talked about. First of all,
he is brilliant. And as you were kind of saying at the top of the show, it's really rare that
you see someone who has achieved kind of parallel careers in two very different fields. Not that
there might not be, as you said, kind of a mysterious connection between music and
computer science, but on the face of it, how they work is very, very different. I think
that's just so stunning that someone like Ashley exists, honestly. Yeah, it's so great to see him
have the success that he's had in both of these things that he has passion about.
I think a lot of us get pushed into one direction or the other fairly early in our career.
I know you, for instance, you're a computer programmer and you are a writer and a journalist.
And those are also two things that look very different from
one another on the surface. And I'm guessing that through your career, and I'm guessing Ashley had
this as well. And like, I have my own experience with it. Like you get encouraged to like, oh,
you got to focus, you got to focus, you got to pick one thing. And like, I always love seeing
people like Ashley, where it's like, no, I actually don't have to pick one thing. I'm going to do both.
No, I totally agree. And you're exactly right. I think most of us do have one thing that we
either have to pick or have to focus more on. And I love that he's had these parallel
careers. But it was really also interesting to me hearing him talk about how at the beginning,
he tried to keep, you know keep his music life separate from his
technology work because he wanted to be taken seriously. And I'm glad he doesn't have to do
that anymore, that he can share his full self. But it also makes me think, okay, in technology,
people really need to be a lot more open-minded about the different backgrounds and interests that people have.
Oh, I could not agree with that more. And like when he said that, it really, really,
really resonated with me because, you know, a little bit of it I'm guessing is imposter syndrome
and, you know, you sort of feeling very uncomfortable early in your career about
whether or not you belong in the place
that you've chosen to be. And like part of it is like legitimately these
professions have these notions of, you know, this is what it means to be a
blah, and it's, you know, whether it's a medical doctor, a lawyer, a computer
scientist, an academic, a programmer at a tech company.
And the reality is, like, if we were just much more open about things and encourage people to
be their authentic selves early on and, like, help them understand that, like, we all feel
like imposters at some point or the other, that maybe we'd have more people doing more interesting things.
For sure.
And I think it also, and this is really evident in the work that Ashley does,
it's so important in AI to have different perspectives.
And I think that's why it's amazing that we have someone like him
who is an expert on that area who also has different perspectives than other people might, you know, who've come into this because of his curiosity that he was talking about that he's had since he was a kid.
I have a feeling that you would need to be curious to be able to do the things that he does.
You need to have that sense of asking why and wanting to learn more. And I
love that. Yeah, for sure. And I agree. It is critically important in technology in general,
and particularly with AI. These things that are going to have a high degree of influence over
what the future looks like, you just really need a diverse set of people helping you build those things.
Just because the technology itself has so much impact, like you want it to be in the
hands of as many people as humanly possible.
For sure.
For sure.
Well, I loved the conversation and I can't wait to see what Ashley does now that he's at Microsoft.
Yeah, me too.
Okay, that's it for today's episode.
Thank you so much to Ashley for joining us today.
And send us a message anytime at BehindTheTech at Microsoft.com.
And be sure to keep listening.
See you next time.