a16z Podcast - a16z Podcast: Location, Location, Location -- and Mobile
Episode Date: June 11, 2015
Pick your metaphor: Smartphones are "remote controls" for the physical world, or perhaps, as Steve Cheney argues, they're "cursors for the physical world". Either way, it's clear that the age of mobile is here, GPS is not enough, and with sensors all around us -- both outdoors and in indoor locations -- it's finally time for truly context-aware computing. But what will that take, both content- and design-wise -- is it all just about eliminating friction? And how are players like Apple and Google positioning themselves for this micro-mapped world? a16z's Benedict Evans and Estimote's Steve Cheney talk about these questions and more in this episode of the a16z Podcast...
Transcript
Hello, this is Benedict Evans, and welcome to the a16z podcast.
I'm here this afternoon with Steve Cheney from Estimote,
and we're going to talk about location and maps
and knowing where you are and where your phone is
and what your phone can do with that and all these kind of interesting things.
We were just chatting about this earlier.
I thought one of the interesting common strands across Google I.O. and Apple's WWDC
was kind of the death of the 10 Blue Links,
and Google doing a bunch of stuff to try and get ahead of you
having to type a query into a search box and press go
with Now on Tap and all sorts of other stuff going on inside the OSes
and inside their services
and Apple doing this proactive thing again
where the computer is trying to get ahead of you
and work out what you might need
and it kind of reminds me of this old computer science
saying that a computer should never ask a question
it should be able to work out the answer to
and we've got these pocket supercomputers
with two dozen sensors that we're carrying around with us everywhere and Apple and Google
amongst other people are trying to work out, well, what could you do when you've got that
and what would that mean? Yeah, no, it's fascinating. I think to some extent if you think about the
phone, the watch and these devices, right, it's not just an extension of us as human beings.
So to some extent, we think about this as like it could become the first brain. It could be
something that it decides what's going on before you sort of tell it to. And I think the predictive
nature, you know, I think we heard on stage at I.O. that the phone could potentially tell you when you're hungry and, you know, when you're on your treadmill which music you want to listen to. And that's, you know, fairly provocative. And of course, the, you know, the search
intent and, like, killing the links is one thing. But how does the phone understand what you're
going to do? How does it understand what you're going to do when you're indoors? Does it really know you're in your kitchen? Is it because your toaster told it, you know, hey, you're in your kitchen? And there's a bunch of different ways that Google and Apple are approaching this
fundamental problem. And I think the same rules apply around, you know, device-specific strengths
for Apple and cloud-specific strengths for Google. But there's a missing link there where
there's still no layer of context in physical, you know, indoor environments, which is
where we spend, what, 80% of our time? Yeah. Yeah, exactly. So there's a sort of, I mean,
you can see people groping towards it. So Apple, for example, will use the motion sensor to
know that you don't have a signal, but the phone hasn't moved since the last time you
didn't have a signal, so don't go to full power and hunt for one. And, you know, the phone
is face down on the table, so don't turn the screen on when you get a message. And there's all
these sort of little recipes that you kind of build up bit by bit by bit, rather as kind of
Windows and Mac OS built up all that kind of stuff over the last 30 years. And it's now being
added to the operating systems. But there's a kind of, to your point, there's a layer of context
around, well, precisely where is the phone? Precisely what are you doing, which of course is
what, you know, beacons are about, um, on the one hand, and what Google Now is about
on the other hand. It's about, you know, like Apple and Google are both kind of climbing the
mountain. I say this is the thing I'm probably saying too often, but they're kind of climbing
the mountain from the opposite side. So on the one hand, you have all of these sensors and you
have beacons, which of course is what Estimote does, um, for working out exactly where an iPhone
is. And at the same time, Google is using Now to work out, well, exactly what would it be
useful for us to say to you at this moment. Yeah, I think one way that I think is a good frame is to look
at fine, you know, fine grain versus rough grain context. And something very provocative that actually
isn't explained and talked about much, which is very ironic. Non-ironic is my favorite word,
by the way. So I'm going to try to weave that in one more time. Very non-ironic, but we're sitting
in the valley and all this old heritage from the semiconductor stack and networking, you know,
is just right down the street from all these software companies. But people don't talk about
what's this other limitation at play, right?
How is the network evolving?
So you have this strength with Apple, you have these strengths at Google.
And of course, Apple has a chip team.
But there's this thing called the physical layer in the internet stack,
which is effectively, well, the air interface,
how is your phone or how are these devices communicating with each other?
And something really provocative to think about around the physical layer
is that, you know, GPS satellites went into the sky, you know, X years ago, we'll call it 20, not because of, you know, the applications that didn't exist yet but are utilizing them today, but largely because of the government.
And although Moore's law has made progress every year, and it's just astounded us, these amazing companies producing these new chips that do twice as much with half the power, twice the understanding, and will shrink a device down to a computer on your wrist, the physical layer isn't getting any better at penetrating signals through buildings.
So GPS has enabled a car to come to your house in three minutes.
Amazing. No one understood five years ago that a company like Uber or Lyft would exist
today. But in five years, I can guarantee you that the phone will still not understand
where it is in the building, because of a new GPS signal that can now penetrate walls. It will not
physically happen. There's an interesting split here because, of course, pre-2007, and in effect, one of the things that buried RIM and, to a lesser extent, Nokia, was that the people producing this stuff were super, super focused on the radio network. And one of the things that allowed Apple to disrupt everything was by saying,
no, no, no, no, it's just a piece of wire and we ignore it completely.
We make zero effort to optimize bandwidth.
You know, it's kind of an overstatement.
But we just completely ignore that.
We'll let Qualcomm deal with that.
And what we're going to do is we're just going to treat cellular like Wi-Fi.
And that completely turned, of course, that completely screwed up AT&T's network for about two years.
And it completely transformed what kind of experience you would build.
And you now see people trying to build applications in emerging markets
where you discover actually you can't act like that at all.
But I think part of that mentality is, you know,
just don't think about what the cellular network can do
because that's three years of conversations with some BD guy at a mobile operator.
Don't think about any of the physical layer at all.
Just kind of take the GPS, take the A-GPS, take the cellular and the Wi-Fi and everything else.
And Apple will do that, Google will do that.
But then completely, you know, don't try and do anything on top of that. And, you know, the interesting thing about beacons is they give you this way of digging in and actually saying, actually, you know, you can know exactly where the phone is inside the store or inside the building. And what would you do with that? It's like, the kind of the challenge is, how do you get past the, you know, the old "you walk past Starbucks and it gives you a free coupon" story, which we've all been hearing for 15 years, and the answer was always, no, just put a poster in the window, that's going to work much better.
Yeah.
How do you get, you know, we've now got this kind of latent technology.
It's almost as if we've only just now got GPS in the phones.
Well, what do you do with that ability to know where the phone is to within the nearest six inches?
Of course.
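To make the beacon point concrete, here is a minimal Swift sketch of the pattern being described: monitoring a store's beacon region with Core Location and surfacing a local notification on entry, rather than relying on a poster in the window or on the user opening an app. The UUID, major/minor values, and the notification copy are placeholders for illustration, not a real deployment.

```swift
import CoreLocation
import UserNotifications

// Sketch: monitor a store's beacon region and surface a local notification on entry.
// UUID, major/minor and the notification text are placeholders, not a real deployment.
final class StoreBeaconMonitor: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let storeRegion = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "B9407F30-F5F8-466E-AFF9-25556B57FE6D")!, // placeholder UUID
        major: 1, minor: 42,                                                       // placeholder identifiers
        identifier: "example-store-entrance")

    func start() {
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()   // region monitoring needs "always" permission
        UNUserNotificationCenter.current().requestAuthorization(options: [.alert]) { _, _ in }
        locationManager.startMonitoring(for: storeRegion)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // The app is woken briefly on region entry, so this works without the app being open.
        let content = UNMutableNotificationContent()
        content.body = "You're near the entrance. Here's what's relevant right now."
        let request = UNNotificationRequest(identifier: region.identifier, content: content, trigger: nil)
        UNUserNotificationCenter.current().add(request)
    }
}
```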
And I think with the iPhone making, you know, as much progress as it has in these eight years, we can't fathom what it was like before, you know, when telecom carriers had all the power and you really had to go through, like, probably seven filters to get an application on someone's phone.
I'd like to address that by rolling back, actually, to something in the 80s, not the smartphone or today's platform, but when PCs came about. There's actually a really interesting analog about a developer-centric approach and how, as platforms evolve, you can have power in the hands of a developer. So this is an analogy that I'm in love with, and it's sort of, it's a canvas, right? And it's like developing on top of a canvas. And if you think about when Apple invented the mouse, or I think it was PARC, but when the Macintosh had a mouse, really what that meant is it meant
that you moved your hand and wrist on a table and you didn't worry about the physics behind which
way the mouse is moving, but the cursor moved on the screen.
And as the cursor moved, of course, consumer apps could exist, but actually flip that and think about it from the developer perspective: developers now could move the mouse and draw a region, or click and select three cells, hit a button, and they would add up.
So things like VisiCalc came about and other applications came about.
And that was powerful.
That was an amazing phase shift because in a very small period of time, people were having personal computers on their desk that solved problems.
And I think in today's world, there's an analog for that.
And the metaphor is that it's the physical world you're walking around in.
This is the new canvas.
You're walking around with the smartphone, which I think someone famously called a remote control for the physical world.
But if you flip that again, think about it from the developer perspective.
It's a mouse cursor for the physical world.
and they can build apps on top of actions,
context, venues, things you're doing.
And so, of course, walking by the Starbucks and the latte being, you know,
made is a little bit of a, it's a generic example that's been used even around things like beacons.
But actually the reality is you've made the decision to go get a coffee 10 minutes ahead of time.
So why doesn't your phone know that you've made that decision?
And a better question, is there a way for your phone to understand that you've made that decision?
Is it like mental telepathy, where the barista knows? Unlikely. It's probably your phone predicting, or you taking one action and it talking to the coffee maker. And in the limit of that, just as, you know, cars now are being automated and, you know, probably will be driverless, I would suggest that Starbucks will be, like, barista-less in
five years. Yeah, it's a kind of, it's a question, as you say, about context: what does it mean when you know where the device is, not to the nearest 100 yards, but to the nearest five yards?
I suppose the example that occurs to me is
using the Foursquare glance on my Apple Watch. And suddenly it goes from something that, you know, getting a tip on a restaurant in Foursquare. Theoretically nothing has changed, but actually it's the difference between taking your phone out of your pocket, loading up the app, waiting for it to do X, Y, Z,
and just looking at your wrist.
and it moves
in the same way you could say that
you know for Uber or Lyft
it doesn't actually need GPS
you can just type your address in
but actually just having the GPS
transforms the kind of capability
of doing those kinds of things
and I think there's a lot of
there's a lot of things that become possible
when you remove friction
and when you remove the difference between
well yeah why would you need, why do you need Foursquare to have GPS, why can't you just type in the name of the restaurant you're standing in front of? That's only like 10 seconds. What are you, lazy or something? Well, yeah, you could do,
but it's all about when you remove friction, suddenly experiences go from being tedious to being
kind of magical. I mean, you know, you talk about the smartphone as a remote control.
I actually prefer the example of the smartphone or these devices generically being superpowers.
That is to say, so I was, you know, walking between two meetings in New York City a couple of
weeks ago. And I used my smartphone, tapped on where I was going, then tap the walk
button, put the phone in my pocket, and looked at my wrist every time I got to a junction. It said
turn right, go ahead. I listened to this on your last podcast. Yeah, tap me on the wrist twice. Oh,
it's time to turn now. And yes, I can walk down the street holding the smartphone looking at the
map. We've all done that. We've all been slightly afraid that someone was going to snatch it or that
we're running the battery down. You're going to drop it. And it's just kind of an awkward way of walking,
just like walking down the street holding a map. You know, raise your wrist, look at your watch.
Suddenly everything changes.
And I suppose the point that I'm getting at is that what sensors do
and what wearables do and what all of this technology that's kind of emerged,
just kind of in the last year or two, does,
is they take all the friction away.
And so it's like there's the stuff that you could do using the old stuff,
but with the new stuff, the friction goes.
And when the friction goes, suddenly the UX changes
and things that were possible,
but a pain, become completely ordinary.
And you just do them all the time.
100%.
And I think it goes back to this fine grain. So what could happen if, you know, your devices really just understood much more than they do today? And it's going down this path. But every time we level set and we look at, okay, now we're in 2015, wow, there's a watch and it sort of understands. And then a few months later, there's this thing called, what is it called, complications? Now watchOS 2 has something that's a bit more with complications. And so these new things come out and we're like, we're level set
again, that this is the new normal, but we should expect much more. And as soon as these technologies are connected better and are really just, like, in sync with each other, and of course we're talking about sensors on devices and sensors distributed in physical environments, there's going to be a 10x increase in how specifically those actions
can be predicted. And I think a lot of it, the best way to distill it down, probably from my perspective, from kind of thinking about approaching developers, is: can you just give the developer an x and a y coordinate, or a z as well? And of course, like, you can do that outdoors, you can do that roughly, but that's all the developer wants. They don't want to interpret a signal strength from a sensor, or, you know, have some really, like, complicated mechanic that is exposed by, uh, the M8 sensor. They just want to understand and distill down: can you just, at this moment, tell me, and moreover do it computationally free, not at a penalty to battery life?
Right, which is the problem that a lot of companies, whether it was, you know, Moves or Foursquare, battled against with data science, to try to keep your battery from dying as they predicted where you were.
But we're getting much better now.
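To illustrate the "just give the developer an x and a y" point, here is a rough Swift sketch of what ranging looks like on top of Core Location today: the OS already distills raw signal strength into a coarse per-beacon distance estimate, and the app then has to map the nearest beacon onto venue coordinates itself. The UUID and the beaconPositions table are hypothetical, for illustration only.

```swift
import CoreLocation

// Hypothetical mapping from beacon (major/minor) to venue x/y in metres,
// the kind of simple answer developers actually want handed to them.
let beaconPositions: [String: (x: Double, y: Double)] = [
    "1/10": (x: 2.0, y: 5.5),   // illustrative coordinates, not a real venue survey
    "1/11": (x: 8.0, y: 1.0),
]

final class VenueLocator: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let venueRegion = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "B9407F30-F5F8-466E-AFF9-25556B57FE6D")!, // placeholder UUID
        identifier: "example-venue")

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startRangingBeacons(in: venueRegion)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        // Core Location has already turned raw RSSI into a rough distance estimate (`accuracy`).
        // Pick the nearest beacon with a valid estimate and hand the app a simple x/y.
        guard let nearest = beacons
                .filter({ $0.accuracy > 0 })                  // accuracy < 0 means "unknown"
                .min(by: { $0.accuracy < $1.accuracy }),
              let position = beaconPositions["\(nearest.major)/\(nearest.minor)"] else { return }
        print("Roughly at x=\(position.x)m, y=\(position.y)m, within ~\(nearest.accuracy)m")
    }
}
```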
And I think the reality is that we're not there yet and there's no one really leading the charge.
Both Apple and Google just barely mentioned anything at their respective developer conferences about indoor location, after so much, so much hype.
I think there was a blue dot demo from Google, and Apple had, like, one session on Core Location but didn't really talk too much about, like, the extensions they made inside their indoor technologies. And a lot of this comes down to: it's just super challenging to calibrate an indoor venue well. I think it is. I think there's another strand here, though,
which is when you look at things like where Google is taking Now and where Apple is trying to take context. Um, and, you know, you now swipe left from the home screen to get to this context screen, and it seems like kind of a small UI change, but that's where, you know, everything sensible and interesting will get suggested to you, in much exactly the same way as Google has Now, except that you also swipe left on the home screen on Android. And actually that's where Apple had the search screen back in iOS 6. But I suppose
what I'm getting at is, the challenge for this location stuff is, there's always been, like, this chicken-and-egg problem, which is that the beacon was there,
but you had to have the app that knew what to do with those beacons.
And so that was great if, like, you're going to spend the day in the museum and you could install the museum app.
But if you went into a mall, then you think, okay, I'm going to install an app for every store in the mall?
Of course not.
I'm going to install the mall app, no.
So what do all these beacons actually do?
I mean, it's a classic problem of, you know, that you are the most important thing in a corporation's life,
but the corporation is not the most important thing in your life.
And so they want you to have the app, but that doesn't mean you want the app.
And so the thing, what I'm getting at is, as Google and Apple build up this meta layer of understanding and analysis, around Now on the one hand, and what Apple is calling Siri, or they're calling it Proactive, on the other, they're going to rebrand all of this and repackage it again.
But the layer that watches you and makes suggestions, that's where that location stuff really starts surfacing back out, because that's when it can turn around and say, hey, you know, here is this useful piece of information.
And that's how you solve the binary problem of the app is installed or it's not installed
because you've got this intermediate layer where content can show up that doesn't have to be the app.
I mean, it's a little bit like, as I was saying on the WWDC podcast yesterday,
it's a little bit like what Windows, what Windows Phone, was trying to do with the home screen.
We had this stuff appearing on the home screen that you didn't have to open the apps, but you still saw stuff.
But you had to have the apps installed, obviously.
and I think what Google is poking around at
and Apple are poking around at is,
well, what useful stuff comes to you
without the apps installed
or without opening the apps?
And that kind of is a kind of a key lever
into that kind of micro-granular context
that you're talking about
that you don't have to have the app,
it can still show you useful stuff.
100% agree.
You know, if you look at the iOS revisions that have been released in the last three years, 6, 7, 8, and now 9, which we've seen a preview of, they are converging on that, right? And it's subtle things. First, it's a few pixels in the bottom left of the screen, this hero icon where, if it thinks you're in Starbucks, you open it up and it opens that app. I don't think there's much uncertainty that the skin, the UI, will become much more sophisticated and predictive. And I don't think there's much uncertainty that sensors are now a huge, huge advancement for the phone
understanding, like, have you dropped it, have you done this, have you moved it, should it
turn the battery off.
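As a rough illustration of that kind of sensor check, here is a minimal Swift sketch of a "has the phone actually moved" gate built on the accelerometer; the threshold is illustrative, not anything Apple documents.

```swift
import CoreMotion

// Sketch: a crude "has the phone actually moved?" gate using the accelerometer,
// the sort of check that lets the system skip battery-hungry work on a stationary phone.
final class MovementGate {
    private let motion = CMMotionManager()
    private(set) var hasMoved = false

    func start() {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 0.5
        motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let a = data?.acceleration else { return }
            // A phone sitting still reads roughly 1g in total; a big deviation means it was moved.
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            if abs(magnitude - 1.0) > 0.3 {   // 0.3g threshold is illustrative
                self?.hasMoved = true
            }
        }
    }

    func stop() {
        motion.stopAccelerometerUpdates()
    }
}
```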
And so those two combined will probably actually be additive.
Well, there'll be a combination effect that gives a lot of power to Apple actually as
an integrated kind of ecosystem where they can just do whatever they want with the skin.
And they can also predict that, you know, 85% of people have the latest OS and, you know, this many people have the latest model, which has the exact same antenna and reflects signals in the exact same way, and, you know, they can approach it from a developer angle in a consistent way.
So the, um, it just hasn't happened yet. And why are they teasing out such small, small advancements to sort of surfacing, like, non-app-based UI, um, from home screens and from other places? Is it just, is it the same problem of you have to know what website you want to go to?
Well, I think there's two, those two things that occur to me listening to you. One of them is that, and this is a point I've made in the context of messaging apps, the smartphone is a platform in a way that the PC was not
and on the PC
yes you could install
all sorts of different
internet connected apps
but arguably Netscape was that layer
yeah exactly, in reality
the web browser
was the platform
not the operating system
and say you had your Windows
or you had your Mac
and you had one icon
which was the internet
and everything appeared
within Netscape Navigator
and Netscape made the experience
consistent for everyone. And everything happened inside that. And so the internet experience was a web browser and a mouse and a keyboard. And yes, there was Spotify and Skype and what have you around the edges. But basically it was a browser. And it was a browser, and the smartphone broke that apart. And it moved the layer, the services layer went one level down in the stack. It went to the operating system. And so obviously the first impact of this is for messaging apps, because, you know, you can see people's address book and you can send people push notifications, so, you know, people don't have to keep visiting a website. And all the apps have easy access to your phone, your phone directory, your photos. And so
it became really easy to have five messaging apps where nobody would have used five different social
networks on the web because you didn't have that platform there.
And I think what we're seeing now is, well, what other things do you do when the operating
system itself is the internet platform?
And it's not just providing an address book and the photo library and push notifications
to messaging apps.
It's also about context and it's about watching everything that you do and seeing your meetings
and seeing your diary and where you've been and where you are in the store, and watching you, in effect, which is why Apple, of course, is talking about privacy so much.
Yeah.
Go back to the sort of concept of indoors.
What does it mean when you're indoors?
I think if we just assume that the phone now is just not anywhere near our first brain,
but it's going to converge on becoming much more intelligent, the actions and the things that
that means for the consumer is you approach a sensor and, you know, deterministically
100% the phone or the watch understands that you have actually approached that area of your
house that you haven't been in in a certain amount of time.
That's a different signal for the phone to process, well, what should be that action?
And if in that corner happens to be your dog wearing a sensor on his collar,
and the dog is thirsty and you haven't walked him in this long,
because the sensor knows it hasn't left the perimeter of the home,
that might be a value add for the consumer.
It might be something that is amazing,
that they just, like, never understood that they really wanted or needed.
And it also obviates the need for humans to just create a list
and to do all these things and store all this information,
in our head, much as Google obviated the need to just remember everything because you would
go to a search bar and you would just find that information. So I think we're really trying to come
full circle on what these applications can do, and they're going to make humans much more
intelligent and liberate us to probably go up and up the stack, is that allowable to say,
to go do other things.
I mean, I think there's an analog when you look at what's happening with all the companies
that are bidding for Nokia's HERE maps at the moment, that location, in a sense, is PageRank for the real world. And pushing on both understanding the maps, and, you know, there's a story this morning that Apple has got its own fleet of cars driving every street in the world, just like Google did, to build its own ground truth for maps at last, spending some of that $200 billion it's sitting on. The point is
that knowing where everything is, right down to where is that piece of furniture, where is that
stand in a store, knowing down to the inch where the phone is, where you've been, where you're going,
who you meet, what might happen next, completely transforms how you can tell people things,
how you can say useful things to people.
100%.
They're storing a digital copy of the physical world in the cloud, and I think the only question is what does each of those companies do with it? There'll be many more companies, Facebook and other people, relevant in that discussion.
But I think it's clear that Google has one intention on what to do with that,
and Apple has potentially quite a different one.
Okay, well, that's all we've got time for,
and that's kind of fascinating to think about.
Thanks a lot for coming in.