Behind The Tech with Kevin Scott - Rana el Kaliouby, Founder of Affectiva, Deputy CEO of Smart Eye, and Emotion AI Pioneer
Episode Date: February 14, 2023. Empathy and emotions are at the center of how we connect and communicate as humans; in fact, 93 percent of how we communicate is accomplished through non-verbal means. Emotion AI (a subset of AI that measures, understands, simulates, and reacts to human emotions) pioneer Rana el Kaliouby is on a mission to bring more emotional intelligence to the digital world. In this episode, Kevin talks with Rana about her upbringing and how she got into tech, her experience and challenges co-founding Affectiva and serving as Deputy CEO of Smart Eye, and the importance of infusing IQ and EQ into technology.
Transcript
Empathy and emotions are at the center of
how we connect and communicate as humans.
Our emotional experiences drive our decision-making,
whether it's a big decision or a small decision,
it drives our learning, our memory,
it drives this human connection.
Hi, everyone. Welcome to Behind the Tech.
I'm your host, Kevin Scott,
Chief Technology Officer for Microsoft.
In this podcast, we're gonna get behind the tech.
We'll talk with some of the people
who have made our modern tech world possible
and understand what motivated them to create what they did.
So join me to maybe learn a little bit
about the history of computing
and get a few behind the scenes insights
into what's happening today.
Stick around.
Hello, and welcome to Behind the Tech. I'm co-host Christina Warren,
Senior Developer Advocate at GitHub. And I'm Kevin Scott.
And today we have a super exciting guest with us, Rana el Kaliouby, who is the co-founder of Affectiva. She's the deputy CEO of Smart Eye and an emotion
AI pioneer, among so many other things. I am just thrilled to be speaking with Rana today.
You know, we've talked so much about AI on the show, but the convergence of AI and human emotion
is an angle I don't think we've ever really covered before, certainly not to the extent
that I'm hoping to get into with Rana.
I can't wait to listen to your conversation with Rana. So let's dive in.
Rana el Kaliouby is an Egyptian-American scientist, entrepreneur, author, and an AI
thought leader who is on a mission to bring emotional intelligence to our digital world.
She was the co-founder and CEO of Affectiva, which was acquired by Smart Eye in 2021,
where she now serves as the deputy CEO.
Rana is also a general partner at AI Operators Fund, an early-stage AI-focused venture fund,
as well as an executive fellow at the Harvard Business School.
She's the author of Girl Decoded, a memoir following her personal journey as a woman in technology.
Rana, thank you so much for being here with us today.
Thank you for having me. I look forward to it.
So we always kick these conversations off by talking about how you first got interested
in science or technology. Like, was it when you were a little kid? Tell us a little bit about that.
Yeah, so I am originally Egyptian. I was born in Cairo, and my parents, kind of a cute story,
my parents met in a programming class. My dad taught COBOL, and my mom in the 70s decided to learn programming. She must have been one of the very first Egyptian women to get into tech,
and they met during that class, got married and moved to Kuwait.
So they both had careers in tech.
And so, I have two little sisters, and we grew up surrounded by technology.
And I remember like playing Atari video games.
And my dad had these like early huge camcorders.
And I was always fascinated by how technology brought our family together.
It connected us.
And this theme of how technology connects people has been really kind of a uniting thread across my career.
That's sort of very interesting.
And, like, I'd love to dig into that a little bit, because for some people the early fascination with technology is about video games.
And for some people,
it's about, you know, just being able to communicate with other people. So like,
was there a particular thing? Or was it just that technology was this gravitational pull that
everyone around you was fascinated with? Yeah, I definitely think it was the whole environment. I remember that we had this little blue plastic chair. It probably cost, I don't know, like a dollar or something, but my dad used to plop me on it, and I think that's when I gave my first speeches ever. He would sit there with the camcorder and record me, you know, rambling. And so that stuck with me. I remember the first program I wrote was a BASIC program. I'm Muslim by background, but I wrote a program that kind of created a Christmas tree. And I remember that, right? So I just think it's various things. It's just being surrounded by technology and being fascinated by technology.
I think exposure is important, too. Yeah, I could not agree more. I'm convinced that, well, it's certainly an advantage, right? Like one of the big things that I
think folks struggle with is if you get interested in technology or programming
as a little kid, you just sort of have years and years of experience with it before you get to
college and you have to declare a major. And like what is easy for you in college might be very hard
for someone who didn't have that exposure. And it has nothing to do with capability. It's just, you know, experience.
So, like, I mean, it's wonderful that you had that environment where you, I mean, it sounds like it wasn't just computers. It was a whole bunch of technological things.
Your dad and mom must have been very enthusiastic nerds, which is great.
Yeah.
And I also think it takes away that fear, right?
A lot of people have fear of technology. And when you were in high school or middle school, how did you sort of pursue this interest as a kid?
I did a lot of programming in middle school and high school. Well, not a lot, actually; some.
And then I decided to study computer science at the American University in Cairo. So I went back to Egypt and kind of declared majoring in computer science and a minor in electrical engineering.
But again, I remember clearly, like, I just kept getting drawn to this human-computer interaction space, right?
That touch point between humans and machines.
I was less interested in the operating system and the computer architecture classes.
And I just became really fascinated: what is that touch point going to look like?
And how can we create it and architect it in a thoughtful way
that adds value to us as people?
And so I had a very clear plan.
I said, okay, I graduated top of my class.
I said, okay, I'm going to go abroad, get a PhD,
and come back and teach at the American
University in Cairo, AUC.
Inspired by actually a role model.
So there weren't a lot of women faculty in the department.
There was only one woman, Dr. Hoda.
And I was like, oh my gosh, she's so cool.
Like she's so smart, but fashionable and, you know, kind.
And I was like, I want to be like her. So she kind of
became my hashtag goals. And I went abroad to get my PhD thinking I'd come back and teach.
And then I always like to say I outgrew my dreams. My dreams evolved.
That's awesome. And did you feel supported in the environment that you were in? So like you had your mother obviously
as a role model and your dad was a programmer. You had this amazing computer science professor
who was a role model. But was there struggle? And like all of us have struggle, right? But, you know, like, did you feel like you were sort of supported?
I think by and large, I feel very blessed and lucky in that my parents, you know, had three daughters, right?
And in a culture where often there are expectations on being a woman that are different than the expectations for being a man.
But I do feel like my parents were pretty, in a way, they were very traditional, but they were also kind of very supportive of our education.
It's the biggest investment they've made.
They put all their money into our education.
So I feel very lucky that way.
And I can almost draw a straight line from their investment to where I am today. We traveled a lot as kids, they invested in our learning, we went to international schools. So we were very exposed to kind of a Western way of teaching and thinking. You know, after undergrad, I got married very young. I was 19. And then, when I decided to go abroad to Cambridge University to get my PhD, both my parents and my in-laws basically said, you're married now, you've got to, you know, stick with your husband, and you can't just leave and go pursue your career. And that was the beginning of this cultural tension between what I really wanted to do, my passion, and the cultural norms and what society expects from you. And so there's definitely a clash there.
Yeah. So that is, I mean, it's super interesting. I remember, for me, I grew up in rural central Virginia. My mom had gone, in the 60s, to what at the time they called secretarial school, and she, you know, learned a bunch of administrative skills and was a bank teller before she got pregnant with me and my brother. And my dad and my entire lineage were all construction workers. And so I just remember being very tentative about moving away to go pursue my career. It was super hard for me, and I didn't do it until after undergrad, to leave where I was raised to go do something that I was passionate about.
It sounds like you not only did that, but like you also had all of this other stuff
going on that made it hard. Like how on earth did you muster the courage to do this?
Yeah, it takes a lot of courage and also a leap into the unknown, right? When you,
you don't really know how it's going to work out. Often, like in my case, my family kept,
you know, there was a lot of fear. They basically kept saying, no, no, no. So,
I also moved to Cambridge University right around September 11th, and at the time I used to wear a headscarf, so I was very clearly Muslim. And there was just a lot of fear. And to me, I don't know, when you're excited about something and you're passionate about it, it becomes the best motivator, I think. Yeah. But I mean, still, I mean, it's really
inspiring to hear you talk about this because I know
those sometimes are the
hardest things. It's less about can I get an A in this
class or like, am I technically good enough? It's like, can I
you know, everything else that's going on in my life,
can I just make it work?
And like, there's just so much stuff
that's going on in everyone's lives.
And like, you're young and inexperienced.
And so anyway, I just always love
to celebrate these stories about people
who have figured it out.
And, you know, like talking about it, I think helps serve as a role model for other folks who can see,
yeah, Rana was able to overcome a bunch of stuff to go do this amazing set of things.
So it's just great to hear.
I do think one of my core values is this concept of courage, right? And being bold, and how it relates to faith, and just having that, yeah, taking that leap of faith into the unknown. You won't always have the answers. And I like to say, embark on this journey.
I tell my kids this all the time,
like embark on the journey without attaching to outcomes and it'll,
you know, it'll open doors. It'll create new opportunities.
You often don't know what they are, but it's better than what you originally
thought. So that's, I kind of always encourage young people. Yeah. And so speaking of that very,
very good advice that you give your kids, that's a good mindset to have, not only when you're
pursuing something like a PhD, but like when you're thinking about having an entrepreneurial
career.
So like, talk a little bit about like,
yeah, once you get to Cambridge,
you're working on this PhD
and then, you know, you sort of jump into this career.
Like what did those experiences look like?
So I, so my PhD was all about
bringing emotional intelligence to machines.
And I'm sure we're going to dig into that a little bit more.
But my plan was to finish my PhD and go back to Egypt and teach.
And literally a month before my defense, I met this MIT professor, Professor Rosalind
Picard, who I'd been following for years.
I had read her book and basically her book kind of pivoted my interests
into kind of pursuing this idea of emotion AI.
But I'd never had the opportunity to meet her in person.
And very serendipitously, she was visiting Cambridge to give a talk.
She emailed the lab, and she said, I have, you know, an hour to kill.
I'd love to meet some students.
So I raised my hand, of course.
And I booked a 15-minute slot with her, which turned into 45 minutes. And I spent weeks preparing for
that meeting. I was going to demo my technology and kind of, you know, I had all these ideas
and we totally hit it off. And she said, you know, I'd love to bring you on as a postdoc in my group.
And I remember saying, oh my God, this is like a dream come true. But I have this husband who's been waiting for me for the past four years back in Egypt. I kind of need to go back. And she said, that's fine, just commute. So I started a four year commute between Cairo and Boston. I would literally like spend a few weeks in Cairo and a few weeks in Boston, which is fun, but also exhausting and
not good for any marriage. That's how I ended up at MIT. And then within a couple of years at the
Media Lab, it was clear that we had a huge commercial potential for the technology and
we spun out. Well, so let's talk a little bit about the technology itself.
I'm really intrigued by this idea of emotional AI.
Because one of the things that we think about all the time, across all of the infrastructure that we build and all the work that we've done with OpenAI, and given our very strong conviction around how fast these systems are becoming capable, is that you do want them to be really useful to human beings. I mean, one of the things that I wrote about in my
book was this experience going back home to rural central Virginia, and I was talking with one of my friends who is the office administrator for a nursing home. And I talked a little bit about AI,
and she's like, yeah, I don't know how AI would actually connect with the people in this home.
It may be able to do a bunch of diagnostic things as well as a doctor, but like, that's not the
important thing. The important thing is being able to be in the lives of these people in a way that
they can connect with. So like, I think what you're doing is very, very intriguing. So maybe
talk a little bit about that and why you think it's interesting. Yeah, I mean, it's exactly what
your friend is saying, like empathy and
emotions are at the center of how we connect and communicate as humans. And our emotional
experiences drive our decision making, whether it's a big decision or a small decision, it drives
our learning, our memory, it drives this human connection, it's how we build trust, it's everything.
But if you look at how we think about technology, it's often what I call the IQ of the device. Take ChatGPT, right? Very smart,
very intelligent. But it's all very focused on the IQ, the cognitive skills. And I felt like it was
out of balance with our emotional and social intelligence skills. And I wanted to marry the
IQ and the EQ in our machines,
basically. And so I had to go back. I'm a computer scientist by background, but I had to like study
the science of emotions and how do we express emotions as human beings. And it turns out
93% of how we communicate is nonverbal. It's a combination of our facial expressions,
you know, our hand gestures, our vocal intonations, like how much energy is in our voice, all of these things.
And only 7% is in the actual choice of words we use.
So I feel like there's been a lot of emphasis on the 7%, but the 93% has been lost in cyberspace.
So I'm kind of reclaiming that 93% using computer vision and machine learning technologies that weren't available,
you know, 20 years ago, but now they're ubiquitous.
Yeah, I think you're absolutely right.
In 2016, I dialed back almost all of my social media consumption because you effectively
have these machine learning systems, like particularly for
businesses whose business model is engagement. So the more you engage with the platform, the more ads run and the more money they make. It is very easy to get systems that drive engagement by triggering your amygdala and keep you in this loop.
And it's very easy to be triggered by email.
Like I all the time have email conversations
with colleagues where like I get super agitated
by the email conversation.
And if I just jump into a video call with them,
even like not even face-to-face, but what we're doing right now,
in seconds, all of the stuff that I was agitated about goes away. So I'm just super intrigued by
what you're doing. How do we actually get this rolled out more broadly? Because I think you're
absolutely right. We've focused so much on the text
and text is so rife with opportunity
to get us emotionally activated in the wrong way.
Wrong way, right.
Because there's a lot of confusion and ambiguity, which you can clarify when you see the face or hear the voice.
I mean, I think what's really cool about this
and what ended up being both the opportunity,
but also the biggest challenge for Affectiva, you know, when we spun out of MIT, was that there are so many applications of this technology. And so we tried to focus, but it was always this challenge of like, oh, my God, like, there are so many cool applications.
And so some that I think are really powerful, one is the automotive industry where, you know, we ended up selling to Smart Eye, and they're very focused on bringing driver monitoring solutions, you know, to the world.
And so this idea of understanding driver distraction and, you know, if you're texting while driving, well, we can tell that using computer vision, right?
Look at your eyes, drowsiness, even intoxication; we've started doing a lot of research to detect that using cameras, you know, optical sensors.
So automotive is one, one area.
We can do advanced safety solutions.
We can look at like, is there a child seat in the car and an infant in it?
And often, not often, but about 50 kids or so get forgotten in the car every year, and they die of heatstroke.
So that's very fixable.
We can fix that.
Mental health is another space that I'm really fascinated by.
We know that there are facial and vocal biomarkers
of mental health disease, depression, anxiety, stress,
even Parkinson's. So imagine if we,
every time we hop in front of our computer with people's opt-in, of course, we can kind of
establish your baseline using machine learning. We know your baseline. And then if you start
deviating from it, we can flag that to you or a loved one or, you know, psychiatrist.
So I think there's a lot of opportunity.
But we're missing the scale. I think this is where Microsoft comes in, right?
Yeah. Yeah. And we think about that all the time. One of the big things that we are attempting to do with AI is we want to make sure that it's a platform that other people
are using to build what they think is interesting and valuable versus us sort of dictating to the
entire universe what is interesting and valuable. And I mean, that is part of being a platform
is like the platform has to help you solve
all of the problems that you have
for building really great experiences for users.
So yeah, so it's super interesting
to think about what you are doing.
I mean, ultimately, I believe it's going to be embedded in... it's going to be like the natural human-machine interface. We're already seeing conversational interfaces, right?
And then we're going to see perceptual machines, and you bring all of that together.
And imagine if, you know, as you engage with your device, whether it's a phone or in your car,
your office or a home robot, it can kind of glean all of that information
and then nudge you to improve, you know, to improve your life, whether it's to be happier
or healthier or more productive or, you know, whatever the KPI is. It can nudge you to do that better. And do you think at all about influence?
So, I mean, with technology, in the limit, where you want the flows of influence to be is, like, you want the human to influence the technology.
So, like, I have a thing I want to accomplish.
Like, I want to, like, be able to convey my intention to the machine and
have it help me do this thing. What you want is less of the influence coming in the other way,
that there's some sort of intention embedded in the thing that the person is unaware of,
that is getting you to do something that you might otherwise not do.
And so do you think about that much with what you're doing?
Definitely, because it's a very blurry line between manipulation, right,
and nudging towards positive behavior change.
Correct.
Where is that line and who decides, right?
And so in the case of social media, for example, it was optimized towards engagement.
And I don't know that that was good for each of us individually, right? It wasn't optimized
for us. So I think that goes back to like the ethics of all of this and how do we develop
and deploy this in an ethical way. It's not about the technology almost, it's about the
organizations and the teams that are deciding the use cases and the applications.
Yeah.
And I think a lot of it actually is about transparency.
It's about, you know, everybody being informed enough to understand, like, how the technology works.
Like, not at the deepest levels, but, you know, this is the basic structure of these systems.
I mean, my experience when I dial back on social media consumption, I'm sitting in there in 2016 reading a bunch of stuff that's the same thing over and over and over again.
It's all opinion, and I'm getting agitated by it all the
time. And I like, I, you know, take a step back and I'm like, what am I doing? Like, I'm not
learning anything like this is this is not healthy. It's almost like being on a diet of junk
food, right? Like you, you know, you sort of love eating the donut. But, you know, if all you
eat is donuts, like you're going to get diabetes and die. How do you... you have kids, right? How do you navigate that? How old are your kids? They're 12 and 14. Okay, are they on TikTok?
Oh, yeah, totally on TikTok. And we talk about it a lot. So first and foremost, the thing that we try to do
is to give them productive things that they can go do that are interesting to them that isn't
just passively consuming content. Yeah. So my 14-year-old is a competitive rower, and she thinks she wants to be a cardiothoracic surgeon, and she's very into her schoolwork, so she's super busy. She's got all of these other things that compete for her attention, and they're all productive things.
When she gets on TikTok, I think my assessment as her parent is that a little bit is okay.
But even then, like we talk about, it's like, look, kiddo,
this system is designed to capture your attention for as long as it can possibly hold on to it.
And it really doesn't care about anything else other than you
swiping up to the next video
and doing that over and over and over and over again.
And there's a very powerful machine learning system sitting there watching what you're doing and trying to select the next thing that's going into your feed. It doesn't care whether you're happy.
It doesn't care about whether it's healthy.
All it cares is that you swipe up again.
And it's like, now that you know this, do you really want to be doing this all the... And
so I think that's the best I can do because she's 14.
So four years from now, she'll fly the coop, and I will have no visibility into what she's doing.
And so like I'm just trying to give her some patterns that she can use to make good decisions for herself in the future.
Not just, you know, put walls around things that she thinks she wants.
Yeah, but I actually, this is awesome to hear because I feel like I have a similar parenting
philosophy too. So rule number one or approach number one is to keep them busy with other
productive stuff, right? So both my kids are just like very active and they just have a lot of
things that they're interested in, which is awesome.
And then, when they use technology, I want them to think of themselves as not just consumers, but creators, right? Like my son. He's 13. He has a podcast, which is awesome, because now he has to learn these new tools.
He has to kind of think about who am I going to invite? He has to think about the questions.
So it's not just consuming, it's producing. Yeah. So I try to kind of nudge them in that direction, but of course he still uses TikTok. Well, and I think you're totally right. The act of producing makes you think about a whole bunch of things. I mean, part of wanting to produce, I think, as a kid is like, oh, I want to get TikTok famous, or I want people to pay attention to me.
But it also does this other thing.
It's like, okay, I have to try to understand what it is other people want.
That's a good thing.
I have to go learn the tools to do the producing.
That's a good thing.
They're all useful, interesting tools.
I have to figure out how to compose content in a way where it's compelling.
Like, that's so, yeah, I think you're totally right.
Like, it's a very interesting activity that teaches you a lot of things.
That's smart.
Yeah, I try.
I try.
There's still a fair amount of video games and TikTok happening. But I think it's a lot of fun.
Yeah, I mean, I did not intend this to be a podcast about parenting.
But one of the other things, too, that I try to do as a parent is I realize that there's a whole bunch of stuff that I did as a kid that was sort of really pointless.
Like, I read a bunch of comic books.
I, you know, like I watched cartoons whenever they aired on the networks that were on.
I collected Star Wars and G.I.
Joe action figures and, you know, put on like these huge pretend campaigns.
And I look at what my kids are doing,
and a lot of it is exactly the same thing.
It's just the medium is different now.
And so I try not to...
I mean, my youngest loves Roblox, for instance,
and so she spends all of her disposable income,
like her, you know, $10 birthday presents and whatnot, on Robux. And at first I'm like, oh my God, like, what is this
kid doing? Like she's spending all of her money on like, you know, these virtual things. And I'm
like, but wait, is that really any different from what I spent my money on? That's like
all been disposed of now. Like I wasn't buying anything when I was a kid that had enduring value as an artifact.
And so like part of it is just, you know, old bias or parent bias or whatnot.
It's like you just, I think as a parent have to appreciate the context that you're in.
And just because, you know, the medium or something about the context in 2022 is different from your context, probably a lot of it is the same thing. And like, we all turned out okay. Right. That's true. Adam, my son, and I,
we have a lot of conversations around because he was at some point also into Roblox, but just in
general, buying these digital assets, right. Where I'm perfectly fine spending the same amount of money on, like, a physical thing. And he's like, but it's kind of the same.
So we have these debates around what does it mean to be in the metaverse and
yeah, invest and buy and like kind of be immersed in that world,
which is fascinating.
Yeah. And it is. And like,
I think for my kids it's given us a bunch of teaching moments.
Like, we talk about these digital assets.
Like, a lot of the assets, even in the physical world or, like, in the adult world that we have that we think are real, like, they're just sort of imaginary.
Like, they only have value because a whole bunch of people have decided that they have value. Like, you have paper in your pocket, or maybe you don't, and, you know, this piece of paper, a dollar bill or a $10 bill, is only worth something because we've all agreed that it's worth something.
Yeah. And we just sort of forget, when a thing becomes so ubiquitous and, you know, just a commonplace thing to use to exchange value. But in a sense, from a 12-year-old's perspective, without going into fiat currencies and economics and all sorts of things, conceptually it's just not that much different. Right. Totally. Well, so let's talk a little bit about your company. So
it is yet another courageous thing, I think, to go from being an academic to being an entrepreneur. So tell us what it was like to be a founder,
like straight out of a postdoc. Yeah. I didn't have a lot of examples around me,
actually. And it was a steep learning curve. I mean, it's one of the most fulfilling things
I've done in my life, but also probably the most challenging. And it's back to this leap of faith, right? Like I took a leap of faith into the unknown,
probably being a little bit naive. I remember the very first roadmap we built, product roadmap we
built and we showed to investors was Q1, we're going to build the technology and bring it to
the education market. Q2, we're going to go to healthcare. Q3, we're going to do automotive. It was so naive.
Obviously, it took us like years to bring the product into one market.
I remember also very early on, we were being courted by an investor and he sent me an email
and he said, send me your BS.
And I was like, I have no idea what this guy's talking about.
The only BS I know you can't send in an email.
And he of course meant balance sheet, but I didn't know.
So, you know, I learned a ton. We ended up being venture backed, and, you know, we raised $50 million of strategic and venture funding. So I figured it out on the go.
But yeah, it was this crazy emotional roller coaster. So talk about what were some of the more challenging things that you had to deal with?
Because I know everybody has a slightly different experience. Like I certainly have my own list of things that were hard about being at a startup.
What was the hardest thing for you?
Hmm?
What was the... what's one hard thing for you?
Well, so I've been around entrepreneurs my entire life. I just didn't realize that what they were doing was the thing associated with this fancy word.
And so, like, I knew a lot about some of what was required to do a startup.
It's like nobody's going to do anything for you.
If you want a thing done, you just have to roll up your sleeves and get it done. You know, my dad was a good example of this. It's like, your job as a leader, or someone who's running
a thing is to empower other people to do what they're great at. And if that means that you're
left with sweeping the floors or washing the dishes or whatever it is that has to be done, like that's what you go do so that they can go do
their best thing. So that, I was already comfortable with. But one of the hard things for me, and I've been a pre-IPO employee at Google, but by the time I got there,
you know, there were hundreds of engineers at the company and like there was no existential risk of failure.
Like Google was going to be fine by the time I got there.
But when I went to my startup, like it was not obvious that everything was going to be
fine.
Like every day I felt like I could come in and make some mistake that was going to end
everything.
And the weight of that got greater every day as we had more and more people. So, like, I had talked these really good people into coming to work for me.
And, like, they had made a bet on me and the company and what we were doing together. So the hard thing for me was
just figuring out how to bear the responsibility without breaking. And that was tough.
Yeah, I can relate to that. I think the kind of the story that captures that was two years in,
we had raised like a small seed round, and then we were getting ready to raise a much larger round. And we were basically going to be out of money come August. And we were
talking to a lot of investors, but nothing had materialized. And we got a call from the venture
arm of an intelligence agency. They said, you know, we've been watching the company, love the
technology. We want to put in $40 million, which was a lot of money for us at the time.
But the condition was we would pivot everything to focus on surveillance and security,
and we had very strong core values around opt-in and respecting people's privacy. It's
very personal data. We didn't feel the technology was there yet to use it in this way. And so
it was a very hard decision to turn that funding away, not knowing
if we would survive past August, right? Because we didn't have an
option B. But we turned it away because it wasn't in line with our core values, you know, as a company and
why we started the company. And so that was hard. I remember, you know,
we hustled and we found the right investors who supported us through our
journey and they shared the same core values,
but that was a big decision because, like you,
I looked at my team and I was like, oh my God,
I may not be able to pay these people in a few months. And yeah.
Yeah. I mean, those, those sorts of decisions are excruciating.
And, yeah, I think one of the hardest parts is, like, you just don't know.
And this is the thing in general.
Like, the further along you get in your career, like, you end up making these consequential
decisions, and you have no idea until much later whether or not they were the right decision
to make.
Right.
Yeah, absolutely. Yeah. I think the thing I'm most proud of kind of in the Affectiva journey
and even now is just the people, right? Because you bring these people on board, they trust you,
right? And they kind of believe in your vision and your strategy and roadmap. And I'm just proud because I look at these people
and they've developed professionally
and personally in their careers.
They've gotten married and had kids and settled down.
And I'm like, oh my God, this like makes me happy.
Yeah, I totally agree with you.
I think that is one of the most rewarding things
about doing these things.
So I look at people that were on my teams at Google, people who came to work for
us at AdMob, people who worked at various points with us at LinkedIn, and just see what those
folks are doing now. It just makes me enormously proud, maybe more than anything else that I do. Yeah, it's awesome you're getting
that as well. I did want to ask you: one of the things in technology that is not great is that we've had a hard time since the 1980s with real representation in the field.
It is clearly biased in a particular set of directions, and we don't have as many amazing
women in the field as we should. And if you narrow down to AI, the representation gets even narrower.
And so you're an example of a woman in AI who's had real success.
So like, how do you think about that?
So, you know, both in terms of your own personal experience and feelings,
but as well as like, you know, whether you like it or not,
you're sort of a role model now. And like, what do you do with that?
Right. Yeah. As you can imagine, I'm super passionate about this.
I will say like the journey of raising venture and strategic funding for Affectiva hasn't been
easy because you're often pitching to people who don't at all look like me and often haven't seen
examples of founders who look like me. So it increases the risk factor, right? Whether it's conscious or subconscious. So
it's been tough. And it took me many, many years to own my voice, right? And not feel less. I mean,
I still kind of have this Debbie Downer voice in my head that
basically just says, you know, you don't fit, right? And I have to kind of work through that.
But as a result, I now feel very passionate about paying it forward. At some point, I realized
that, yes, I have an opportunity here to be a role model, because role models are powerful.
When you can see somebody who looks like you, who's, you know, a few steps ahead, you can say, wow, I can be like that person.
And so I take it very seriously. It's also why I've started a fund.
I don't invest only in women or underrepresented minorities, but, you know, half of my investments so far are women-led AI companies, which is very unusual.
And I just feel I've been really lucky and I have this opportunity to pay it forward.
That's so awesome.
So last couple of questions before we run out of time.
So what are you excited about in AI now?
Like where do you think we're headed the next few years?
I'm just really fascinated by how with machine learning and AI and data, we can now really understand health and wellness.
And like this intersection of AI and biology, I just think it's going to truly transform how we think about our health and
wellness.
And, you know, we measure our health and our physical activity.
We can now start tracking our internal body systems in various ways using various sensors.
But eventually, all this data needs to come together and we'll learn a lot.
And then with AI and predictive analytics, it can nudge you
to change behavior for the better. So I'm just really fascinated by that space.
Yeah. And as am I. And I think it's one of those places even where
the IQ of the large models is useful. Like, not on its own. Like, you have to, like,
build real products on top of it that have a bunch of the things that we talked about earlier.
But, yeah, I mean, the combination of powerful models, this ubiquitous data that we've got about what's happening with you biometrically at any point in time, being able to ingest all of the world's research... yeah, I mean, it's really
fascinating to think about. And we need it too, right? Like, I don't know what the
demographics look like in Egypt, but if you look across the world:
Italy's population peaked years ago and is contracting. Japan's population peaked and
is contracting. Germany's population peaked and is contracting.
I think 2022 is the year where China will reach peak population and will be contracting
from here on out.
So we have this really interesting, and it's not distant future, it's near future coming
at us where well within our lifetimes, we will have this shift from, you know, what we have now to more retired and aging folks than working age folks.
And all the aging folks will have higher health care needs and demands than younger people. So there's this question of what takes up the slack in the productive workforce
and what helps people aging lead a healthy, dignified life.
So it's, yeah, something needs to change here.
Yeah, and this concept of lifespan and healthspan, right?
How can you help people live healthier longer?
And I really do think
technology is at the core of the solution, both in terms of diagnosis and personalized and
precision medicine, but also in terms of care, right? Like using technology and maybe home robots
or home companions that can be more helpful. Obviously, a lot of ethics questions come up when
we go there. But, you know, I think this stuff that you're talking about, like, having
AI systems that both understand human emotion and that are able to emote in a way that lets
them connect with other human beings. Yeah, my grandmother lives by herself. She's 92 years old.
Like she's well enough not to need to be in like assisted care.
Actually, she's like robust, good health.
But she's alone.
You know, my grandfather died 10 or 15 years ago.
I mean, she's been by herself for a long while. And my mother talks to her five
or six times a day on the phone. Yeah, and she has this community of people who look after her,
that she's been connected with her entire life. But I do wonder whether there's a role for
technology to play in giving her companionship,
not just to make sure she doesn't fall down and have an injury go unnoticed,
but so that she has some richness in her life with the companionship that she might not otherwise have.
Right. Yeah.
Yeah, I think there's opportunity there.
But again, how do you design these in a way that's thoughtful? You know, it isn't spooky. It doesn't take away from her human-to-human connections, right? It's not replacing her relationships, you know, with this robot.
Correct.
But it's complementing it. Yeah. And I think in a sense, that is maybe harder than training a really big model and having it be able to pass standardized tests or do complicated mathematics or whatnot.
Like training something that has authentic, non-manipulative rapport with a human being is tough.
It's tough.
Yeah, it's definitely not solved.
We haven't cracked that code yet.
Cool.
All right.
So one last question before we go, which I ask everyone.
So you have such interesting work that you do.
You know, you're a mom to two kids,
so you're super busy. But what do you do in your spare time for fun?
Fun? Oh, okay. So I'm an avid Zumba dancer. I love dancing. And so I've been doing it for six years. I'm very close to becoming a certified instructor. So I don't know if I'll move my career to being a Zumba instructor ever.
But it brings me joy.
It's good exercise.
But most importantly, it just makes me – I always tell my team, I'll be a happier leader.
I'll be a better leader if you let me do my Zumba class.
So don't schedule on top of it.
That is super, super cool.
Well, thank you so much for taking time to chat with us today.
And, you know, thank you for what you do.
Like, it's really inspiring to hear about your journey and to think that we've got someone like you off tackling these hard problems.
Thank you so much for having me.
It's a true honor.
I appreciate it.
Awesome.
What a fascinating conversation with Rana el Kaliouby. So one thing that I wanted to talk to you about before we kind of dive into some of the Emotion AI stuff is, you know, you always start
out these interviews by asking people how they got into tech. And what really struck me with your conversation with Rana was how important it is, not just, I think, to have access to certain things, but how important it is to have people who support you in your interests. She grew up in a time and in a part of the world where not every set of parents would be as open
to educating their children the way Rana's were, or certainly as open to exposing her
to technology the way that they were.
And I was really struck by how important that is as an aspect, I think, especially when
it comes to representation that we sometimes omit a little bit.
Yeah, I could not agree more.
And I think almost to a person, whenever you find someone who is having a successful career
in technology or in any other domain, there has to be support somewhere because it's so hard to be successful
at anything. There's so much pulling you in the other direction, like people telling you you can't
do it, you know, like people trying to get you to doubt yourself, just institutional things that either aggressively or passive-aggressively
push back against what it is you're trying to accomplish. Normal human emotion, like,
you know, some of this stuff is just sort of hard and people, I think, are naturally afraid of
failing. And so you have to figure out how to get the courage to get on top of that. That is a lot to deal with all on your own, and it could have potentially pushed back against her achieving these really ambitious and awesome goals that she had for herself.
Absolutely. Absolutely. But I think likewise, you know, her background, but also her interests and the support she had and just, you know, the drive she has within herself, I think, makes her really well suited to doing what she's doing, which in her own words is, you know, wanting to marry the worlds
of EQ and IQ in AI, I think is what she said.
One of the interesting stats that she pointed out was that, you know, 93% of how we communicate is nonverbal. But when
it comes to AI, we've been focusing on that 7%. And I was really thinking a lot about that,
especially as, you know, the last few months, we've all been playing with so many various
GPT-3 models. And those models, I think, have gotten really good at that 7%, right? Like,
they're obviously not all the way there, but I think they're pretty good at some of the written stuff.
From your perspective as a technologist, as somebody who follows a lot of these things, from having conversations with people like Rana, what do you think we need to do next to attack that other 93%, those nonverbal cues which become so important to human interaction?
Yeah, I think it's maybe the most important question we should be asking right now.
So it's easy for me, at least, and I think this is true for other people in technology,
to get really carried away with the capability of a thing.
So like what it can do or like how it can do it.
And like that's certainly true with these powerful models.
Like, I honestly have been, even though I've been very, very deep in this stuff for
a really long time. And, you know, we have, in partnership with OpenAI,
been, I think, on the forefront of this stuff for the past handful of years, and so
we can even see what we think is coming. Even so, I've been surprised that we accomplished as much as we did this year. But even though all of that's sort of a fascinating what,
like the thing that's really challenging with these things
is like what you choose to go do with them.
And I think this is some of what Rana is getting at.
So yeah, part of what I hope that we have more conversations about in the coming year is how we can build all of this capability into applications where you're serving the user and where you don't actually forget the emotional element. Like, we have, you know, pretty sophisticated AI systems already that, you know, materialize your Twitter feed or,
you know, figure out what it is that you're going to see on TikTok when you open that app.
And, you know, what those systems optimize for isn't necessarily the full gamut of emotions that,
you know, sort of enrich human beings. Like a lot of it is like, oh, I'm going to show you something sensational to keep you
like right here in this experience.
And so the thing that I hope we can learn from folks like Rana is like how we can build
these systems in ways where they're provoking the right emotion and where they can take emotional feedback from people to
help you, yeah, help you basically feel the way that you want to feel from interacting with your
technology, not feel the way someone else wants you to feel. Like, that's the difference between
agency and manipulation. I love that. I love that. And you're right, that is the difference
between the two. And I think that's certainly what Rana was talking about: some of her core values and the things that, you know, she wasn't willing to take money for, and wanting to talk more about transparency in these systems. And I'm heartened that we have people like Rana who have the background to be able to marry these two worlds together.
Someone, you know, who has, you know, such a strong background in emotional intelligence, wanting to take that into AI so that when these systems do come out, maybe we'll do a little bit better than we did with kind of our Gen 1, you know, algorithms, which, you know, maybe like engage, buy, and rage, right?
Like maybe we don't do that in the future,
which would be fantastic.
Let us hope.
And maybe even we use our AI systems
to stand in between us in some of these things
so that we can have more agency
over what we're consuming and how we feel about it.
Wouldn't that be cool?
Wouldn't that be great if the AI could actually kind of be that little bit of a barrier to
say, hey, I know this looks good, but this is actually going to make you feel really
bad later on.
Are you sure you want to do this?
That would be really cool.
It's like the little angel sitting on my shoulder telling me not to eat the jelly donut.
No, see, exactly, exactly.
Kevin, we could call it Clippy.
We already have the IP.
And it's like, are you sure? Are you sure you want to eat the donut? And I think we've already got something there. I like it. I like it. Yeah, me too. All right. That is all the time that we
have for today. Thank you so much to Rana el Kaliouby for joining us today.
And if you have anything
that you'd like to share with us,
your thoughts on any of this stuff
happening with emotional AI,
please email us anytime
at behindthetech at microsoft.com.
And you can follow Behind the Tech
on your favorite podcast platform,
or you can check out
our full video episodes on YouTube.
Thanks for tuning in.
See you next time.