a16z Podcast - a16z Podcast: For Google, Android is a Tactic and Cloud is a Strategy
Episode Date: May 30, 2015
Google is a vast machine learning company. If you think about it in those terms, says Benedict Evans, every product and feature Google builds is an expression of its machine learning expertise -- or a way to distribute it and provide easier access to it. Evans joins the pod to pick apart all the latest machine learning-driven tech from Google as it hosts its annual developer party, I/O. What's become very clear this year, Evans says, is that for Google all the really cool stuff isn't happening in Android, it's happening in the Cloud. Finally, what's next in VR from Google, and how it plans to tackle the developing world.
Transcript
Welcome to the A16Z podcast, day one of Google I/O.
And I have Benedict Evans here to pick it apart for us.
Benedict, welcome.
Hello.
I was excited by just the name of Android M.
It makes it seem so non-Lollipop-y, et cetera.
And I was waiting for an incredible breakthrough.
In candy naming.
Candy naming.
But what did we get?
Well, it's interesting.
And there's maybe half,
maybe half a dozen sort of separate things to talk about here.
The first thing is Android M, which has a bunch of perfectly worthwhile sensible improvements,
some of which were in iOS, some of which will be in iOS, as is always the way with this stuff.
No really big tent pole improvements, to be honest.
A few little bits and pieces and lots of good stuff, but no big tent pole improvements.
What's interesting, and what's increasingly apparent from Google, is that all the innovation
really is happening in the Google services that get laid on top of Android.
And in many cases, of course, are also available on iOS and indeed on the web.
So, wait, describe that for us.
If it doesn't sit inside Android necessarily or it's not part of Android, it sits where?
So, well, so there's two answers to that.
One is it may simply be a standalone app like Google Photos, which we can talk about in a bit.
The other is that it may be something that's integrated into Android like Google Now,
but that it's fundamentally a cloud service, and there's a set of tools on the device, but that's really just kind of an endpoint, a client for something that's happening somewhere else.
And so there are, I suppose, two or three sort of things worth pulling out that are new here. The first is a relaunch of Google's attempts in payments, this time using the mobile operators' Softcard system, which is sort of similar in some ways and different in other ways to Apple Pay.
So this was a, yeah, a response.
This is a payment on the device using your card,
but with a sort of a pass-through tokenization system,
which is kind of quite fiddly.
And they announced several hundred.
So I've got to actually sit and watch the afternoon session
to work out what they're actually saying they're doing.
So no partner banks, you know, no country, it's nothing like that.
But so we have, now we have Google Pay to match Apple Pay.
And there's a fingerprint API if you have a phone that has a fingerprint scanner in it.
But of course, you know, it's very hard to make fingerprint scanners, especially if you're not Apple, and to make a good one.
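For reference, here is a minimal sketch of what calling that new fingerprint API looks like on Android M. It assumes API level 23, a device with a sensor and enrolled prints, and the USE_FINGERPRINT permission in the manifest; it is illustrative only, not Google's own sample code.

```java
// Minimal sketch of the Android M fingerprint API mentioned above -- not a
// complete app, just the shape of the call. Assumes API level 23 and the
// USE_FINGERPRINT permission declared in the manifest.
import android.app.Activity;
import android.content.Context;
import android.hardware.fingerprint.FingerprintManager;
import android.os.Bundle;
import android.os.CancellationSignal;
import android.widget.Toast;

public class FingerprintDemoActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        FingerprintManager fm =
                (FingerprintManager) getSystemService(Context.FINGERPRINT_SERVICE);

        // Only proceed if the device actually has a sensor and enrolled prints.
        if (fm == null || !fm.isHardwareDetected() || !fm.hasEnrolledFingerprints()) {
            return;
        }

        fm.authenticate(
                null,                        // CryptoObject: omitted in this sketch
                new CancellationSignal(),    // lets you cancel the scan
                0,                           // flags
                new FingerprintManager.AuthenticationCallback() {
                    @Override
                    public void onAuthenticationSucceeded(
                            FingerprintManager.AuthenticationResult result) {
                        Toast.makeText(FingerprintDemoActivity.this,
                                "Fingerprint recognized", Toast.LENGTH_SHORT).show();
                    }

                    @Override
                    public void onAuthenticationFailed() {
                        Toast.makeText(FingerprintDemoActivity.this,
                                "Not recognized, try again", Toast.LENGTH_SHORT).show();
                    }
                },
                null);                       // Handler: default looper
    }
}
```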
And then there's an interesting contrast, I think, between that and the second thing, which is Google Photos.
And so Google Photos looks a little bit like Apple Photos, or the thing that was rolled out in iOS 7 or
iOS 8, so it groups all your photos by location, by date. There's a cloud sync across every device.
Unlimited storage, whereas for Apple you have to pay; well, it depends how much you've got, you can pay a few dollars a month, you can pay, you know, $15 a month or $20 a month, and it surprised people that Apple didn't make that free. But then what you also have, which Apple doesn't have, is cloud-based image recognition, so you can search for people, and it has face recognition; you can just search for dog or boat and it will find that photo that you took two years ago of a boat with a dog on it, or whatever it is. And there's an iOS app as well.
And so it's interesting to contrast this with Google Pay, Apple Pay,
because here you have a perfect example.
It's really hard to do, like, an integrated fingerprint scanner with secure enclave support and a sapphire cover and the whole thing that Apple's built.
Really hard for the Android ecosystem to deliver that
because nobody controls the whole stack.
On the other hand, Google finds it really easy to do image recognition
in the cloud of every photo you've ever taken
because that's what Google does.
And Apple, conversely, would find it very difficult to do that.
Theoretically they could, but it's just not in their DNA.
And so you've got this, you know, this thing that's been clear for a couple of years, which is this sort of divergence of Apple and Google; they find it much easier to compete because each of them is good at completely different things.
And so one of them will provide a product that's perfectly okay up to a point in one area
and a fantastic product in the other area.
And the other company will be completely the other way around, because they've got completely different skills. So Apple is really good at stuff that's about integrated hardware and software, but, you know, not great at, you know, integrated internet services, and not really interested in machine learning at all. So they would
say, we don't scan your photos. We don't want to scan your photos. They're your photos. Whereas
Google would say, you know, we want to understand everything that you're doing so that we can
give you better stuff so that we can tell you what that photo was. Okay, so that raises the question
then, and this is always the question you have to ask with free services. What's in it for Google?
And I'm talking about Google Photos right now. I think there are several answers to that. One of them
is they get all the photos and they can analyze them,
and that's more data for them to understand.
The same as Google Books, say. It just gets data in, and then you think about it and you can work stuff out from that.
The second is, the more that you're logged into Google,
the more that you're using Google services,
the more that they have a sense of who you are
and your identity and where you've been
and what you might be interested in.
And that translates through to better Google Now recommendations,
for example, and better map directions and all sorts of things
and better search results customized for you,
it also, of course, results in more relevant advertising.
I mean, I think the fundamental way to understand Google
is as a vast machine learning engine,
and everything that they do is about reach.
And they don't really care what the reach is
as long as they've got more and more of it.
And that's reach, both getting data in and reach getting data out.
And which device you use or what kind of data it is
is less important than the fact of the reach,
which is why this is on iOS.
Right, right.
Because, I mean, another way I'm thinking about this
is that for Google, Android is a tactic, and cloud is a strategy.
Whereas for Apple, the device is a strategy, and the cloud is a tactic.
The cloud is a feature of the core strategy, whereas for Google, the device is a feature
of the core strategy.
What does this say then in kind of our relationship with mobile in terms of, you know,
you've got free storage, you've got unlimited bandwidth, it sounds like?
I mean, if photos are that easy, that simple, and storage is that cheap that Google just is giving it away,
where are we kind of in this kind of continuum of what is valuable and what's valued?
Oh, it's interesting.
So, well, there's several ways to answer that.
One is that it used to be that music was a core point of strategic leverage, that all your music was in DRM, or it was encoded in a certain way.
It was a pain to switch from device to device,
to switch from an iPod to something else or back again.
Once we went to streaming services, it became really, really easy to switch from
device to device.
And so content really isn't a strategic lever anymore.
It may be; we'll see what Apple does with, is it next week? But, you know, for the sake of argument, right now content doesn't matter to Apple or Google; it's just a checkbox feature. Photos, on the other hand: if you've got all your photos in Apple's iCloud system, it's a real pain to move to an Android device, and vice versa, you know, if you've got everything in Google Photos. We'll have to see how well this works on iOS, but clearly on Android it will automatically sync seamlessly without you having to think about it; on iOS it probably won't, or we'll see how the integration works. So photos kind of does become a point of strategic leverage and, you know, a way of differentiating your
product from other people. I think there's another point in here, which is, you know, Google gave
this number of a trillion photos being taken a year. I've got no idea where they got that from
because even if you just add up the numbers from Facebook and Snapchat and Instagram and WhatsApp and so on, you get to about 800 or 900 billion photographs taken last year.
And that's, sorry, shared. So if there were 800 or 900 billion shared last year, there's no way the total number of photos taken was a trillion. It's got to be 10 trillion.
100 trillion. Yeah, because the photos you share are, like you say, like a tenth of what you take.
Yeah, exactly. So that number's way short of the total.
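As a rough back-of-the-envelope check of that argument, taking the shared-photo figure and the roughly one-in-ten sharing rate mentioned here at face value:

```latex
% Back-of-the-envelope check using the figures quoted in the conversation:
% roughly 850 billion photos shared, and sharing is about a tenth of what is taken.
\[
  \text{photos taken} \;\approx\; \frac{\text{photos shared}}{1/10}
  \;\approx\; 10 \times 0.85 \times 10^{12}
  \;\approx\; 8.5 \times 10^{12}
  \quad \text{(on the order of 10 trillion, not 1 trillion)}
\]
```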
Just for contrast, in
1999, about 80 billion photos were
taken by consumers on film.
So then the kind of the next point: having said there were no tent pole features, I've actually talked about three things, but none of them are really Android. They're Google cloud services that then may or may not appear on iOS as well as on Android. And so the first one is Pay, the second one is Photos; Pay is Android only, Photos is not. And then the third is, is it Now on Tap or Tap on Now or something, which is basically an extension of Google Now, which is available anywhere you
or something, which is basically an extension of Google Now, which is available anywhere you
are on Android. And if the app is written properly, then you basically, you can be looking
at a listing in Yelp and you can activate Google Now, and it will just show you what the
address is and what the Zagat rating is and, you know, how long it will take you to drive
there, or whatever relevant information Google thinks it could provide about that.
And so, whereas, you know, the way Google Now works at the moment is it's always looking over your shoulder, and every now and then it makes a helpful suggestion, but only for your web surfing and your email and your calendar and whatever. It doesn't look at what you're doing when you're in Yelp or whatever apps you're looking at. Now, you can ask Google Now, well, what do you think about this?
Who's playing this song?
Where's that restaurant?
And it's very context-driven.
So you could be looking at a restaurant in Yelp.
And you could just activate now and say,
how long would it take me to get there?
And you actually just say that.
You actually literally say,
how long would it take me to get there?
And it looks at what you're looking at.
It works out that there's an address there,
and then it works out how long it would take you to get to that address.
It knows that you're looking at a restaurant.
It knows how interesting.
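To make "if the app is written properly" concrete: on Android M, an app can hand structured context to the assistant layer through the new assist APIs. A minimal sketch follows; the restaurant name, address and URL are invented for illustration, and only the Activity callback and the AssistContent setters are the real API surface.

```java
// Minimal sketch of an app exposing context to Now on Tap via the Android M
// assist APIs (API level 23). The Yelp-style URL and the JSON-LD payload are
// invented for illustration; only onProvideAssistContent and the AssistContent
// setters are actual framework calls.
import android.app.Activity;
import android.app.assist.AssistContent;
import android.net.Uri;

public class RestaurantDetailActivity extends Activity {

    @Override
    public void onProvideAssistContent(AssistContent outContent) {
        super.onProvideAssistContent(outContent);

        // A web URI for the thing currently on screen, so the assistant can
        // link what the user is looking at to entities it already knows about.
        outContent.setWebUri(Uri.parse("https://www.example.com/biz/some-restaurant"));

        // Optional structured data (schema.org-style JSON-LD) describing the
        // screen: name, address, etc. -- exactly what a question like
        // "how long would it take me to get there?" needs.
        outContent.setStructuredData(
                "{ \"@type\": \"Restaurant\","
              + "  \"name\": \"Some Restaurant\","
              + "  \"address\": \"123 Example St, San Francisco, CA\" }");
    }
}
```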
And so this is, you know, to my point,
this is the machine learning and the cloud
and, you know, the stuff that Apple would struggle to do.
And this is kind of the magic of 15 years of machine learning,
which is everything that Google's been building.
And so what we saw all the way through I/O was, you know, that's what we do.
That's what we are.
And everything that we build is an expression of that
or a way to distribute that or get reach for that
or make it easier for people to access that.
And so you almost felt like Apple,
so at the Apple event, their whole narrative is,
here is this really cool stuff we built in iOS.
And here is this really cool stuff we built to make it easy
for you to build great apps on top of iOS.
Whereas for Google, it's much more,
here is this really great stuff
that Android users can get
and here are these really great APIs
that you can use to build apps
but it's like Google is much more about the apps they've built than the operating system. That's my point.
Right, I see, and it's not so much about this ecosystem that, again, on an annual basis gets improved.
Yeah, so this is, yeah, I mean, this is the thing we were talking about earlier. Because everything is in the cloud, you get this sort of, I had the same feeling last year, that it sort of felt slightly anticlimactic, I will say that.
We've done this a couple of times, and nonplussed is the phrase that comes to mind.
So my response to that is this, that Apple, though obviously the hardware comes out once a year,
but that's sort of a separate point.
Apple is rolling out an operating system once a year.
They're rolling out an operating system that will go on to most of the devices pretty quickly.
So right now, about 80, 85% of all live Apple devices are running iOS 8, according to both their stats and sort of MixPanel stats or Akamai stats. So basically, Apple rolls out this operating system.
Everybody gets it, more or less, pretty soon.
And when you do an operating system, you know, you can't do it every month. You know, every year is about right; arguably every year is too often.
So there's a whole bunch of cool stuff, and that drops once a year,
and then the next year there's more cool stuff.
From Apple.
From Apple.
From Google, there's two issues here.
The first is the cool stuff is all really in the cloud, and it's getting better every day.
So the idea of an annual release cycle is more to do with, like, PR and a developer event than it is, well, this is the stuff we've
been working on for the last year because like they'll carry on working and they'll release
more stuff next week and the week after and the week after that. It'll continue. It's
continual release. That's how the cloud works. The other thing is that right now,
about 10% of all the live Google Android devices are running Lollipop, which Google announced at I/O last year, but actually only deployed in the autumn. So it's been out in the market like eight, nine months or something, and 10% of the base has got it. Now, if you look at data from MixPanel, they actually see 20% of their active user base has got it. Which is, obviously, what you'd expect: people who are installing cool apps that use MixPanel are more likely to be on the latest version.
But that's still 20% versus 80%.
And so, you know, if Google was to spend, you know, a whole bunch of time and effort making an amazing feature for Android, it's going to take like three years before the majority of the base has got it. So why would you do that?
Well, there's...
Doesn't it make much more sense for it to be the cloud thing that gets deployed?
And of course, you know, this is the other thing that Sundar talked about last year, but didn't
talk about this year, is that something like 95% of all live Android devices are running
the latest version of Google Play Services, which is the layer that gives you all of
their cloud intelligence.
And so, yes, you're not running the latest version of Android, so you may not get the new camera APIs or the new version of Pay or the fingerprint API or all that kind of stuff, or, you know, the new graphics stuff, whatever, but you'll have Google Now. They'll push out new Google Maps stuff, and all of that will appear immediately.
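To make that architectural point concrete, here is a rough sketch of how an app of that era talks to the Play Services layer through Google's client library rather than a framework API baked into a particular OS release. The dependency version shown is only indicative, the example assumes a location permission is granted, and these particular classes have since been superseded.

```java
// Rough sketch of an app using the Google Play services client library
// (circa 2015) rather than a framework API tied to the OS version.
// The Gradle dependency would be something like
//   compile 'com.google.android.gms:play-services-location:7.5.0'
// (version shown is only indicative).
import android.app.Activity;
import android.location.Location;
import android.os.Bundle;

import com.google.android.gms.common.ConnectionResult;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.location.LocationServices;

public class NearbyActivity extends Activity
        implements GoogleApiClient.ConnectionCallbacks,
                   GoogleApiClient.OnConnectionFailedListener {

    private GoogleApiClient googleApiClient;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // The client talks to the Play Services process on the device, which
        // Google updates independently of the Android OS version.
        googleApiClient = new GoogleApiClient.Builder(this)
                .addApi(LocationServices.API)
                .addConnectionCallbacks(this)
                .addOnConnectionFailedListener(this)
                .build();
    }

    @Override
    protected void onStart() {
        super.onStart();
        googleApiClient.connect();
    }

    @Override
    protected void onStop() {
        googleApiClient.disconnect();
        super.onStop();
    }

    @Override
    public void onConnected(Bundle connectionHint) {
        // Last known location, served via Play Services rather than the bare OS.
        // (Assumes the location permission has been granted.)
        Location last =
                LocationServices.FusedLocationApi.getLastLocation(googleApiClient);
        // ... use the location ...
    }

    @Override
    public void onConnectionSuspended(int cause) { }

    @Override
    public void onConnectionFailed(ConnectionResult result) { }
}
```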
And so your point is that it doesn't matter so much. And is Google consciously kind of separating the OS from the core stuff?
Yeah, so Google is separating the OS from the cloud. They're moving all the intelligent, interesting stuff into the cloud as a layer that gets updated all the time. And so I/O is not, like, this is where all this year's
stuff is in the way that it is for Apple, the way that it would have been for Microsoft when
you're doing an actual operating system that gets shipped onto devices. Yes, Android is an
operating system that gets shipped onto devices, but because of like the dynamics and the
slow replacement cycle and fragmentation and everything else, more and more of what they're doing is kind of being shifted into the cloud, or into this cloud layer that they've built on top called Play Services. And you see that most obviously in Google Now, where it's a client, but it's a client
for a cloud service. You see it in, I don't know, Maps. I mean, you know, we didn't even talk about Maps, but, you know, supposing they were to do Street View now; Street View they might announce at Google I/O, because why not, but it's not something that, you know, gets updated once a year. Google Maps is updated every second and new features appear all the time. So that's the kind of distinction, I think, in what we're seeing.
Do you hear Google talk about it in those terms?
I mean, for example, they didn't talk about numbers of Android.
We didn't hear about Chrome at all.
Are they purposely, I mean, do you think that they're of the same mind as you that all the good stuff is happening in the cloud?
So at some point, we won't even be talking about this kind of underlying or the front end, the OS, et cetera.
So I wrote a blog post called What Does Google Need in Mobile?
And the analogy that I used is this book written by a French academic called Pierre Bayard, which has a slightly facetious title: How to Talk About Books You Haven't Read.
And the point that he makes is...
I haven't read that one, I'll be honest.
I'll let you look at it.
The point that he makes is this, is that imagine a book that you read when you were 17
and didn't really understand, frankly, anyway.
And then imagine a book that's just come out and you've read four reviews of it.
Which of those would you actually be better able to talk about?
Right, when I'm 35 or whatever.
Yeah, exactly. There's a book that you've read and completely forgotten that you've actually read it. There's a book that you read half of, the book that you skimmed, the book that you know because you've read three other books by that guy and you've read the back of the book, so you kind of know what that book is. And so the point is, have you read it isn't binary, actually, in a kind of fundamental, underlying, you know, sense of the meaning of what you mean when you say, do you know about that book.
And so, in the same sense, you know, for Google to have reach isn't binary. You know, supposing you have a Google Android phone that's not, like, forked or anything, and it's got all the Google services and you're completely logged in and everything's perfect, and you live in a suburb and you drive to work and you know the way to work and you know the way to the mall and you know the way to your friend's house and to the bar, so you never use Google Maps, and you don't have any meetings so you don't use Calendar, and you use Exchange at work. Like, all Google's really getting from you is web search.
Right.
Now imagine you are, you know, a 20-year-old developer living in San Francisco with your iPhone. You've got an iPhone, you're signed into Google Maps, and you use Gmail, and you use Google search. You don't use Chrome, because no one uses Chrome on iPhone. Like, which of those users is giving them more stuff?
So there's all sorts of different variations in here for Google, and this is the point: Apple sells boxes. Google doesn't sell phones.
Right.
And so you've got this completely different dynamic of what they're trying to build and how they're kind of trying to go about building stuff. And you see that in, you know, what they build. That's, you know, why Photos is on iOS.
For Apple, it is binary, though. You either are or you aren't.
Well, this is what I was saying, that, you know, they're doing the same things, but for one of them it's a tactic and for the other one of them it's a strategy.
And so there's not this sort of fundamental conflict between them. They're just arriving in
different places. The other part of I/O that got everybody excited was VR. There seemed to have been quite a few announcements: hardware, platforms, etc. Last year, I believe, Google launched Cardboard. Yes. So what did we see and where does it take us? So Facebook bought Oculus.
Google has an investment in a company called Magic Leap, which we've also invested in, which is augmented reality as opposed to virtual reality, but also 3D. And there is a sort of sense from anybody who's used this product of, oh my God, this is part of the future; not quite sure what part of the future or what it might look like. And then you've got another layer, which is, well, fundamentally, VR, 3D, Oculus: it's just a smartphone, really, with software.
Oh, it's waiting to be part of a smartphone, right?
Yes, exactly. I mean, VR is basically smartphone components. It's the end point of the smartphone supply chain. And so, like drones and wearables and connected home and all this other stuff, VR is the peace dividend of the smartphone wars; not that there's a peace, but the dividend of the smartphone wars is all these components.
And therefore it shouldn't necessarily be this closed,
this proprietary system being made by this one company that got bought by Facebook.
It should just be another screen.
And it should be an aspect of every screen, maybe, or certainly every smartphone.
And so you already see this with the Samsung Galaxy Gear VR,
which is a partnership with Oculus or uses some Oculus technology.
Clearly Google would take the view, well, you know, all we're really doing is having a smartphone show two images and work out where the smartphone is pointed, really. So that kind of should be part of Android or indeed, you know, part of YouTube. And so what they've done is this partnership with GoPro to basically daisy-chain a whole bunch, a dozen or so, GoPro cameras in a circle, so you can record 360-degree 3D video in high-def. So...
I've seen, actually, I think GoPro has done this, but people have hacked these things.
Yeah, you can hack them together.
But, you know, you then have to put a lot of software in, because you've got to balance the images and you've got to, you know, image-correct, correct for light and shade and all this kind of stuff.
So you get this thing.
And so what that then means is you then have a stream, and you can put the Oculus on and you're standing next to Paul McCartney on stage and the video is playing, and you can turn your head and you see the video behind you.
So it's fully 360, and it's 3D, which is, you know, really, really full-on.
And so, basically, what you're doing is you're taking commodity cameras, more or less, and software.
Right.
And so they've got a dozen cameras or two dozen cameras basically pointing outward set into a ring.
And then you have software that takes a video from each of those and merges them and corrects them so that you get a high-definition 3D stream that's in 360 degrees.
So you can turn your head around and look behind you or look wherever you want.
And then you have an integration with YouTube, so that you can publish your stuff directly into YouTube and distribute it through YouTube and connect YouTube to your 3D device,
whether it's an Oculus or something else.
And then there's Cardboard, which is really just kind of an observation that, you know, you just need two lenses and to hold the screen at the right distance from your eyes.
I mean, I saw a whole bunch of this at CES. It's just kind of three pieces of plastic.
Right. To hold the phone at the right distance from your eyes, with two lenses in front and, you know, a few gears and so on, theoretically.
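A rough thin-lens sketch of why "the right distance" matters: with the screen placed at, or just inside, the lens's focal length, the virtual image appears far away, so the eye can comfortably focus on a phone held a couple of inches from the face.

```latex
% Thin lens relation: 1/d_object + 1/d_image = 1/f.
% Put the screen (the object) at the focal length f and the image recedes to infinity:
\[
  \frac{1}{d_{\text{image}}} \;=\; \frac{1}{f} \;-\; \frac{1}{d_{\text{screen}}}
  \qquad \Longrightarrow \qquad
  d_{\text{screen}} \to f \;\;\Rightarrow\;\; |d_{\text{image}}| \to \infty .
\]
```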
So do you expect then in that kind of YouTube paradigm, you've got HD, you've got 3D,
you've got VR buttons on the bottom and...
Potentially, yeah. And, you know, obviously you have to have the right harness to put your phone in, to hold the phone in the right place, or maybe you have a dedicated one or something.
But I think what Google is trying to do is to just pull this all into software and pull it all
into the cloud and say, well, you know, that's part of YouTube now, and we don't need a
dedicated system to do that.
And so it's going to be interesting to see, obviously, for games or anything interactive,
it's quite different.
And video is obviously only one aspect of what you would do with 3D. But for 3D video, I think Google are making their play to participate there.
And again, as far as the ecosystem: Chris Dixon here at the firm talks about it as a new kind of medium. Does this push access to it much quicker?
Yes, it makes it much easier. You know, it's the Box Brownie story all over again, I think. You know, that doesn't mean there aren't a whole bunch of challenges of what you actually do. I mean, there's a sort of, you know, I think 3D video is really at the kind of pre-Eisenstein stage.
You know, somebody's actually got to work out, hey, you can cut, and you can move the camera
and then start filming again.
Oh, wow, wow, what does that mean?
You can do montage, you know, you can have a crane shot.
And this is all the stuff that had to be invented because people started out just filming
a theater.
And it takes a while to work out you can actually move the camera and you can cut the film.
And it's exactly the same thing for VR.
You know, what does it mean for a director to be shooting a scene, where you are focusing on where you want people to look, and your choice of lens, and your close-ups, and how you frame the shot, when the person who's watching this can kind of turn their head and look out of the window.
Right.
Never mind like walk out of the room and walk down the hall.
But, you know, we've got that question,
well, what is this supposed to, you know,
what would linear entertainment look like in this environment?
Google also announced or has upgraded, shall we say, Google Home, what's that all about and what are we seeing there?
So we have a new platform called Brillo and a networking platform, I think, called Weave.
Again, I've got to go and look at all the detailed sessions around this.
Essentially, if you've got a connected light bulb or a connected thermostat or something, Android probably doesn't fit on that device, just because it's too small and too cheap and hasn't got enough memory or CPU or storage. So you want something else, but you probably want something else that isn't some horrible embedded system. And so what is it? And Google's answer is, we'll use this, and here's another global open standard that anyone can use.
So Google owns Nest. So describe then how this could work in that kind of Nest view of the world.
Well, I think there's two parts. One is it's just another way of trying to accelerate uptake, so it becomes easier to make the connected light bulb or the connected scale or the connected stove or whatever it is, or the connected window lock, really, really small, light, cheap things. It becomes easier to make those, and it also becomes easier to auto-discover them and to configure them all, and they will automatically show up. Now, they will automatically show up in Android; there may be an iOS client, whatever. But it's just kind of a fundamental enabling technology, a sort of an open standard to make it easier to make all of these things, and, you know, that's fine. I think the, you know, the challenge with Internet of Things standards is the old joke: there's so many to choose from. And, you know, there's a fundamental sort of underlying question here, which is how many of these things need to be smart, need to talk to each other, need to be plugged into a common standard. You know, does your door lock need to talk? Your door lock might talk to your car alarm; it probably doesn't need to talk to your TV set. And so we're still sort of...
And then, of course, you have all the semiconductor platform companies trying to do lower-level interconnection systems.
And so there's all sorts of sort of stuff swirling around here, and it hasn't quite
kind of crystallized.
Developing world.
Yeah.
So this is interesting.
And obviously this is something that Google and Facebook have both been circling around
quite a lot.
There are, as it might be two to two and a half, two and a bit billion people on earth today
who have a smartphone.
There are, depending on your estimate,
somewhere between 4 and 5 billion people with a mobile phone, maybe even more.
There's over 7 billion connections, but, you know, an awful lot of people have multiple SIMs.
So, like, there's something like 900 million live connections in India,
but at least half of that is people with more than one SIM, maybe more.
So one of the things on my list of to-dos is to get my own estimate, my own guesses, for how many people there are. But anyway, there are at least another billion, and possibly 2 or even 3 billion, more people on earth still to get a smartphone.
And all of those people have less money than the people who have smartphones today,
almost all of them.
And a lot of them are also living in countries where they may not have access to mains power, so it's challenging to charge their phone. Living in countries where 3G networks may not be built out, or they may live in rural areas where there is weak coverage or no 3G coverage and they're on GPRS or EDGE, probably GPRS, so they're at dial-up modem speeds,
basically. And the fact that somebody is willing to buy a $30 or $40 Android phone, instead of a $600 Android phone or iPhone, does not mean that the mobile operator can give them a gig of data for 50 cents a month, because the network still costs what it costs. So you have a cost-of-data issue
and a coverage and a speed issue. And so what Google is doing is, or they're doing various things,
including balloons.
Project Loon, right?
Yes.
But more tangibly, what they're doing is, I think, very interesting. If you're in one of these markets, the search page will, like, use 10% as much data. So all the clever HTML stuff and the drop shadows and everything is stripped out.
and the dropshadles and everything is stripped out.
And if you then tap on a link,
you won't get the page.
You'll get Google stripped down,
cut down, transcoded version of that page.
So you won't necessarily get the time story
in the times of India.
You'll get Google's transcoded version.
of the story in the Times of India
where the images taken out
or some of the images taken out
or downscaled and all the JavaScript
taken out and so on
apparently with all the ads still there
which is important
but
which means that the page will load
much quicker on 3G
And obviously, if the publisher is in that market, they may well have had in mind people who have only got those phones. But, you know, if they're looking at, you know, a page from the developed world, that developer may well never have thought that, hey, maybe people are accessing this on a connection like that.
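Google hasn't said exactly how its transcoder works, but the basic idea, stripping the heavy parts of a page before sending it over a slow link, is easy to sketch. Here is a hypothetical illustration using the open-source jsoup HTML parser; this is not Google's actual pipeline, and the URL is made up.

```java
// Hypothetical illustration of page "transcoding" for slow connections:
// fetch a page, strip scripts and images, and keep the text and links.
// This is NOT Google's actual system, just the general idea, sketched with
// the open-source jsoup HTML parser.
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

public class LiteTranscoder {

    public static String transcode(String url) throws Exception {
        Document doc = Jsoup.connect(url).get();

        // Drop the heavyweight parts of the page.
        doc.select("script, style, iframe, video").remove();
        doc.select("img").remove();   // or rewrite to low-res thumbnails instead

        // What's left is mostly text and links, a fraction of the original bytes.
        return doc.outerHtml();
    }

    public static void main(String[] args) throws Exception {
        // Example with a made-up URL:
        System.out.println(transcode("https://www.example.com/news/story.html"));
    }
}
```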
I wonder, I mean, there's some part of me that would like that here, for that matter.
I know what you mean.
It's one of those things, a bit like internet.org, which clearly infringes on internet neutrality
in some ways, but is also actually a response to fundamental user needs.
And, you know, if you've paid for 10 or 20 meg of data this month, and that's a meaningful part of your income, and then a web page uses a meg of data, it's a problem.
Right.
And whether it's neutral or not is really neither here nor there. So this is, again, as I said, like the balloons, like internet.org: it's a sense that the next billion people have got challenges that are different from the challenges
that we face in getting online. Well, Benedict, thanks as always. Yep, thank you. We will be sure
to broadcast the next episode using a VR rig in 3D and 360 degrees, so you can look at the back
of Benedict's head. That's it. Thank you.