PurePerformance - 074 Advanced Real User Monitoring Tips & Tricks with Ben Rushlo
Episode Date: November 5, 2018

Happy Guy Fawkes Night! In the first episode with Ben Rushlo, Vice President of Dynatrace Services, we learned about things like not getting fooled by bot traffic, which metrics to monitor, and how RUM can replace your traditional site analytics. In this episode we dive deeper into RUM use cases around user behavior analytics, bridging the silos between Dev, Ops & Business, and elaborate on why blindly optimizing individual page load times is most likely wasted time, as you won't impact what really matters: end-to-end user experience! In our discussion we also talked about UX vs. UI as well as the importance of accessibility. Here are two links we want you to look at: Holger Weissboeck on Let's Put the U in UX and Stephanie Mcilroy's presentation at DevOne.

Listen to Episode 70: https://www.spreaker.com/user/pureperformance/070-exploring-real-user-monitoring-with-
Let's Put the U in UX: https://www.youtube.com/watch?v=qi19hls9LfY
Stephanie Mcilroy's presentation at DevOne: https://devone.us/speakers/#stephaniemcilroy
Data vs. Info article Brian mentioned: https://medium.com/@copyconstruct/monitoring-in-the-time-of-cloud-native-c87c7a5bfa3e
Transcript
It's time for Pure Performance.
Get your stopwatches ready.
It's time for Pure Performance with Andy Grabner and Brian Wilson.
Hello, everybody, and welcome to another episode of Pure Performance. My name is Brian Wilson and since today is actually the day before Halloween and this episode is airing just after Halloween,
I'm going to introduce my co-host as Andy Candy Grabber.
Hello Andy, how are you doing?
That's interesting. Especially the grabber thing, right? I mean, that's, that's, I know it's related
to my last name Grabner and not that you were intending anything else there.
Well, you're Candy Grabber, you know, like that's what you do on Halloween.
Happy Halloween to you too.
Yes. You sound a little under the weather. Are you feeling okay?
Yeah, I just have a strange thing with my voice. I don't know what happened over the weekend. My wife is also trying to figure out what I did wrong, because she's fine. We made it through the weekend and I ended up with a very strange, rough voice. And I promise I didn't excessively drink nor smoke cigars.
So that's not what it was.
Okay, but maybe other things.
All right.
So with that, we have a repeat guest.
Ben Rushlo is back with us today.
Ben, hello, Ben.
Let's just get to you, Ben.
Hey, how are you doing?
How are you?
Good.
Hey, guys.
Hey, Andy. Hey, Brian. Thanks for having me back. I guess I must not have been too horrible the first time.
And happy Halloween, I guess, day before Halloween, Eve.
Yep. And thank you. Before we go into, before you reintroduce yourself, I do want to refer listeners back to episode number 70. This is episode 74. So just go back four episodes,
episode 70. You'll hear kind of part one of this conversation where we talked a lot about a lot of
the real user and synthetic type of topics. This is a continuation of that conversation. So please
go back if you haven't listened to that, listen to episode 70, because the context will carry
over into what we're discussing
today. But with that said, Ben, do you want to reintroduce yourself in case anybody is ignoring
my advice to listen to episode 70, and let us know who you are, who you work for, and why we should take seriously anything that you have to say today?
Sure. Yeah. So I'm Ben Rushlo. I work for Dynatrace. I'm the vice
president of services, which means I have kind of half of the global services teams. And we're
focused on what we call DEM — digital experience monitoring — which most people don't necessarily know, but it's
really kind of synthetic, real user and session replay parts of our product and really helping
our customers make the most out of that, you know,
those product areas and kind of understand how to improve performance. In terms of why you should
listen to what I say, I have no idea. I don't have a great
answer for that except for, you know, I've been doing it for a long time. And really what is
interesting to me is all the customers that we get to work with. And so when you're listening to me,
you're really listening to thousands of customers over the last 20 years that I've been doing this and kind of their stories, but I just
get to tell them. So that's cool. Hey, Ben, actually, this reminds me, I mean, we go,
we obviously work for the same company now, but we go back before we both worked at Dynatrace,
back when you were still at Keynote. Yeah, way back, yeah.
This was in the days when we had the Dynatrace AJAX Edition and you brought us in for a couple of engagements
to analyze web performance issues
that you discovered with your larger enterprise clients.
And that was actually a really cool thing
that a lot of blog posts on my site
came out of these engagements, also product improvements.
And as you said, we're all here, Brian, you and I, we're all, I think we're all in the lucky situation to speak with a lot of customers and learn from them.
And then using a channel like the podcast to give it back to the community.
And that's pretty cool.
It is cool.
I remember the AJAX Edition. It was such an awesome product. And I remember, you know, it kind of just took the industry by storm. And then, yeah, working with you, it was great until we couldn't work together anymore because we were competing. And now we're all one big happy family. So that's an interesting story, I guess.
Yeah. So, building on the previous one, right — Andy, obviously, you were not able to make it. I think you were held up by their commissar last time, if my memory serves well; it just came back to me now. So glad to have you back in this conversation, because I know this is a topic you're very knowledgeable and passionate about as well. We're speaking about RUM topics, synthetic topics — just basically, you know, the end user, that end experience, the whole point of your application, and what performance considerations and things to look out for there are on that side. So Ben has graciously given us a list of topics he enjoys talking about. Was there anything specifically that you wanted to tackle first, Andy — or Ben, excuse me?
I have a few more questions, if you don't mind.
I think I will let Ben, you know, start because he obviously, he's our guest today.
And I know you have a list.
So, Ben, what do you want to talk about first?
Well, I think I'll tackle, you know, kind of the idea of, you know, how does real user
monitoring, how do I see real user monitoring kind of as being, you know, the voice
of a voice of truth for performance. And I think that's, I don't know, I'd be curious what you guys
think, but I know that, you know, we've all done this for a long time. And you, what I find is that
there's so many kind of voices screaming to our customers about, you know, this is what performance
is. And it can be silly things like, oh, the CEO, you know, on his
or her computer, you know, went to our site and they think the site is slow. Right. And that sounds
silly, but, you know, you'd be surprised how many times I hear that actually to, you know, hey,
Google, you know, because, of course, Google can go into any of our clients with their name
recognition. You know, they've come in and they've run some tests, you know, on, you know, 2G or 3G, which is kind of their common,
you know, approach. And they tell us our, you know, our pages take, you know, 30 seconds.
And so that becomes sort of what executives are then spouting to all the teams. And so I guess,
you know, that's, to me, an interesting topic, which is you can't even talk about optimizing performance until you sort of know what the voice of truth is.
Right. And if you have too many voices, then what you spend time, I think, at least we've seen with our customers, you spend time doing is really running from voice to voice and trying to vet and understand, well, this doesn't match with this and this doesn't match with this.
And, you know, and you don't actually get to optimize performance. Right. And so for me,
I guess what's so exciting about a real user product and, you know, kind of hopefully getting
our customers, you know, to to look at that as the single voice of truth for performance
is then you can spend more time actually optimizing
versus kind of arguing and debating about,
oh, well, the QA team uses this headless browser
and yeah, the CEO is using kind of a stopwatch
on their machine and we also have this other data source
that we think, and Google's coming in and telling us,
and the reason I keep saying that is
because I have a few customers
where that has created so much noise, the Google kind of involvement with our customers.
Or I guess even other ways, like, hey, I've used Chrome DevTools and I did a few measurements or I used web page tests and I ran a few measurements.
And that's the voice of truth.
And that's really challenging.
So, yeah, I don't know if you guys have experienced that as well, but I think the explosion of performance tools has not really helped us necessarily get to, you know, what is really true, what is the user experiencing, and how do you actually optimize that.
Yeah, I mean, from my perspective, I agree with you. But I think we also need to differentiate between measuring performance as in the metrics we've been talking about, whether it's page load time, rendering, or visual complete.
And so these are obviously important to look at from a real user perspective to really figure out what's happening out in the real world.
Another aspect that we do not get at all from anything that we do synthetically,
and it also obviously includes our product,
is what are people really doing on our page?
How are they navigating from A, entering the page until they're checking out?
And we can optimize performance as much as we want, but it doesn't help if the path, the journey, is terrible — and not based on what we think the journey is. Because I can write a synthetic test in the way I think people navigate through the system.
But if I actually see from real user data how people are really navigating through the system, I may identify performance bottlenecks that are not, let's say, the classical performance bottlenecks in terms of response time is high or it takes long to load, but it's actually
a performance bottleneck in terms of it takes me too many steps to get to the point because
the page is simply not intuitive enough or it renders differently on a certain screen.
Therefore, certain buttons are not visible and people have to scroll around and find
things.
So I think this is the conversation I also have a lot is learning from the real users.
How are they getting from point A, which is entering your website or your mobile app,
until they achieve what you want them to achieve?
The whole journey.
I think we always need to talk about the journey.
And that's the key thing.
And nobody can simulate this accurately, because we don't know our thousands or millions of users and what they do.
Yep. I think that's a great point. I mean, it kind of gets to another topic we have on our list,
which is this intersection of behavior and performance. And I think that's where,
you know, the future generation of really kind of performance superstars at our customers
are people that understand what you just said, which is performance is only part of the equation, right? And so if we're so focused on,
well, you know, if we hit sub three seconds for visual complete, then that means, you know,
the business outcomes will be achieved. Well, that's super simplistic. And as you said,
there's lots of other things that affect our ability as customers or consumers to interact
and, you know, and get done what we want to get done on the site. And it can be a poor design,
poor UI design, you know, tricky buttons, you know, all sorts of things. Right. And so I think
you're you're exactly right that, you know, we need tools that help us both understand performance.
And I think path based performance is super important, but also path-based sort of behavior, right? So are people dropping off at certain points
in conversion funnels? You know, are people kind of struggling? And that's where I think
session replay combined with, you know, real user has an interesting, you know, value proposition,
because it is hard to know when you say, well, people are dropping off here in a
funnel. Why is that, right? Well, we can look at performance and say, well, no, performance is good.
So, okay, checkbox, hopefully that's not the issue. And then you can kind of, you know, guess about UI
design, but as a performance person, that's not my specialty, right? But if I can actually see the
sessions or see a number of sessions that are struggling at this drop off point and actually watch what's happening.
Oh, wow, they're getting confused, you know, about this button or they're or they're getting some sort of front end JavaScript error that we weren't really detecting or not really understanding how that's affecting the UI.
That's pretty cool stuff, actually. So I'm very excited about kind of bringing that piece into the, you know, into the equation because it's going to help us who aren't UI experts to sort of see what the user
is really doing and hopefully, you know, give us some hints about what might be going wrong, I guess.
Yeah, that just actually sparked a thought in my head, not major contribution, but the idea of
this whole replay being able to see the user interact with the site. I can't remember what
site it was recently where I was going to click a
button. So it has a button with text in it. Right. And when I go to click the button, nothing happens.
Why? Because you have to click the text in the button. That's the active part, right? Which is
the dumbest design, which probably, I mean, I have no idea what kind of impact that had on the thing,
but I know I myself was really frustrated, you know, fortunately for whoever it was, I didn't give up on using the site, but that's the kind of thing
that you're not going to see from statistics, right? From seeing whatever performance metric
that you want to capture. It's not going to tell you that users are getting frustrated
and clicking several times because you have just like bad button design. I don't even know why
that design exists or I can't, again,
I wish I could remember where it was because they do deserve to be shamed for
it. But, um, you know,
it's one of those problems where it's like, we're too close as designers. I say we — I don't design — but I think the designers can be too close to the problem, so they don't realize that what they think is intuitive, because hey, they're the ones building it and kind of putting it together, is not intuitive.
And I have actually a Dynatrace example, which is kind of silly, but
we're developing this wiki page, and we're talking about some notifications to customers around changes in our product space. And I'm not involved in that, but I asked one of my team members, hey, you know,
where's our stuff? Cause I'm clicking on this wiki page and I can't find it. And they're like,
Oh, well you don't click on the link. You actually click on this very small arrow that's next to the
link. And that opens a sub menu. And it's like, and again, maybe I'm the only one that's an idiot
that doesn't understand that, but it was super not intuitive. Yet the people that are developing it on our team were like, oh, of course, this makes sense. You click this little sub-caret that's one pixel, right? And then it expands. And I think that's exactly the
challenge: you don't know what you don't know, because you don't know where your customers
are coming from in terms of their frame of mind, in terms of this, is this intuitive, right? So I
think the session replay will be super, super interesting there to short circuit all those guesses.
Well, I think it should work this way,
or I think customers will understand.
It's like, well, let's actually watch
and see what customers are doing,
which is really, really cool.
And I think that's a great point too,
because if you go back to when the iPhone first came out, or even just as it's working, right? There was barely any instruction manual. The whole thing was designed so that you can easily figure out how to use it. In fact, you know, I have two children now. One is seven and the other is nine — or is she eight? She's seven. I need to know these things. She's seven. And actually my second child has developmental delays, right? But she was able to pick up the iPad and figure out how to use it, right?
Both of them at very young ages.
And I bring that up because you talk about that UI design
where you had to know to click the caret.
You shouldn't have to know — with good UI design, you can figure it out just by looking at it.
And then I guess the flip side of that,
or not the flip side,
but the takeaway from that is
being able to see how your customers interact with it helps you determine whether or not that's good UI design. But anyway, we're going into UI design. I just wanted to make that point, because I thought it was a real amazing feat that Apple was able to create an interface that two-year-olds can figure out — like, literal two-year-olds. It's amazing.
But I want to just add one more thought on this, because I was just at DevOne, our conference, in Detroit two weeks ago.
And Holger Weissboeck, he's working at Dynatrace
and is leading the UX team.
And he did a phenomenal presentation about,
it's called Let's Put the U in UX.
And he really talked about UX versus UI.
And he brought a lot of these examples
on what can go wrong and what we can learn
from bad examples, but also from our real users.
So, Brian, I would love to put the link in. It's from Holger Weissboeck, called Let's Put the U in UX.
And there was a recording from his session
that he did earlier this year at DevOne in Linz.
It's a really great presentation that he did, and it's eye-opening.
And the other presentation that I saw in Detroit was one around accessibility. What was interesting there: the presenter ran us through an experiment where she used a screen reader to navigate — and let us hear — the page that she had once designed.
She was part of a project and had to design a page.
And she was very happy with it.
But then people came and basically complained that it's not really accessible and friendly.
And then she used a screen reader, and it was phenomenal to see how complicated the structure of the page was, especially for people that have certain challenges with either reading
or navigating through these pages.
And it was another great session on accessibility.
And again, I know it's kind of diverting a little bit,
but in the end, it's part of the topic we were just discussing.
It's about user experience is more
than just measuring the page load time
of individual pages.
It's the full end-to-end journey.
Yeah, let's link all those.
I mean, I think especially the accessibility one,
I have physical stuff like that with my daughter.
But what you see out there in the world of the internet is, if you think people don't care much about security, the number of people who actually care about building accessible apps or web pages is probably even further down the priority list. So I think that's a topic that people really need to start taking seriously. So yeah, let's put those up. That's great.
Cool. All right, Ben, what's the next item on the list?
Well, I think, you know, I don't know, I guess I'm curious because it kind of brings up the next
item, which is performance, you know, tools and kind of the intersection of, you know, behavior
and performance, but more, you know, how it relates to, you know, traditional analytics. And I think this is an interesting one. I don't
know if you guys run into this, but a lot of times I'll be with customers and they'll be asking me
things like, well, we already have Google Analytics, right? Or we already have Adobe.
Everybody has Adobe, really. I mean, Adobe is part of a bigger framework, but they obviously
bought Omniture and, you know, the analytics tools there.
And so don't we already have kind of what, you know, what RUM provides?
And then on the other side, you know, performance people saying, well, I already have kind of a RUM tool, right?
Maybe I'm getting it free from my CDN, you know, with what Akamai is doing with SOASTA.
Or, you know, I'm kind of, you know, getting some RUM data from even web server
logs or something, right? I'm getting some data about real user performance. And I think, you
know, what's interesting to me is kind of bridging those gaps. And we kind of already talked about
that. It makes me think that the performance person of the future, right, really has to have
kind of — I was going to say a leg in, but then I'm thinking there's probably three things, right? In performance and behavior and UX/UI, right? And so, yeah, you can't have three
legs. So maybe legs, not the greatest analogy, but you have to kind of be able to bridge those
topics, right? But I also think tools that do that are really important because they do break
down those silos. And we get, you know, a lot of folks will kind of question, well, why in your
tool, Dynatrace, you know, your RUM tool, do you have this behavioral stuff?
Because we're already getting that from Adobe or Google Analytics.
Again, most enterprises, it's typically Adobe.
And what I always say is, you know, I know as a performance person and dealing with a lot of performance people that one of their frustrations is I'm doing all this stuff to optimize performance and I have no
idea how it affects, you know, customer behavior. And the reason I don't is because there's these
silos in our organization where, you know, I can't actually get that data, right? It's like
locked down because it's Adobe is the voice of truth for the business. And, you know, there's
maybe sensitive data in there about conversion or engagement, and therefore I can't get it as a performance person.
And so what I always say is like that's what we're trying to do is bridge this gap.
Right. So that when you make an optimization change, you can actually see: does that result in, you know, more time spent on the site, more clicks, more engagement, more conversion? And I think that's really an important piece. Andy, you brought it up. It's like, you can't be so disconnected from it — performance is only one part of the
picture. And, you know, the work that we do as performance people has a big effect on behavior.
And like, we need to actually understand that. So I think that's an interesting topic is like,
you know, this bridging of those two. And is it, you know, do you use multiple
tools? Do you use the same tool? Is it dangerous if you have kind of similar data, but not exact
data? We run into that a lot. Like, well, the Adobe data doesn't exactly match what you're
showing us, Dynatrace. And therefore, you know, does that create more challenges? I'm curious
what, you know, what you guys think about that. But it's very, very top of mind for us as we go
and talk to people about performance and behavior.
And they sort of say, well, I have these other five tools, right?
Or, you know, I have this tool that does that or this.
And it's an interesting topic for me, I guess.
Yeah.
So let me bring, I mean, I like what you just said and have an analogy.
Well, not an analogy, but it reminds me of something that happened in the past in our industry. Remember the days when Steve Souders came along
and said, you know, don't start optimizing the backend
because 80% of the time is actually lost in the front end.
Remember that?
Yeah, for sure.
Yeah.
And so if you look at our performance engineering,
where Brian and I come from, from the backend performance,
and we would have been tasked to optimize performance
by 50%, then shaving off 50% of that small share doesn't really make a big dent.
Of 20%. Yeah, 50% of 20%. It's not too much.
It's not really a whole lot.
We could tap ourselves on the shoulder, and we could hug,
and everybody's happy at the back end because we just optimized performance by 50%.
But we would be happy, but probably nobody would really have noticed it.
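To spell out that arithmetic with Souders' 80/20 split: a 50% backend improvement applies only to the roughly 20% of load time the backend accounts for, so the end user sees 0.5 × 0.2 = 0.1 — a 10% improvement at best.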
And obviously, over the years we learned, we have to look at the full page load time and we need to
factor in all these other things that are actually happening when loading a page.
And I believe what you just said is the next step is we should not focus on individual pages,
but we need to figure out how this impacts behavior. And I agree with you, I hear this all the time. I believe I work more with back-end folks than with business folks, and when I ask them, you know, do you actually know how your feature is used out there? Do you know what impact you make with your deployments? Do you actually know what type of people you actually have on your website, which browsers they use?
And then they say, well, we have this information somewhere
on the business, but it's either hard to get it,
or they don't want to give it away,
and then it doesn't really make sense to us
because it doesn't correlate to what
we see with our tooling that we have on the back end.
So I completely agree with you.
The challenge is there is individual teams
bringing their individual tools.
They kind of guard their data.
I've also seen, and maybe you've seen this as well,
I've also seen people that are massaging the data
in their favor, because obviously, in the end,
everybody needs to report to somebody
and then you massage the data so that you can show that you have done a good job, even though it's not the truth.
And I agree with you that this bringing the full-stack monitoring and full-stack includes the end user all the way to the back end and all these metrics.
And providing it in one tool and then taking these individual metrics and giving it to the right
stakeholders — and especially providing that correlation — is, I believe, the next big step. And, correct me if I'm wrong, we've been talking a lot about DevOps, kind of bringing dev and ops closer together, and I'm pretty sure that the next step is bringing business closer to DevOps. And then whether it's BizDevOps
or whatever you want to call it,
I'm not sure what's the right term right now.
What's the term people use?
Is it BizDevOps?
Yeah, I think that's the term.
I mean, I think, and I think you're right
because a lot of the work that we get to do,
we do work with a business and it's amazing
when you can turn a business person on
to understanding why performance actually matters
in the context of a bunch of other things, right? And that's where I think you have to be a bit
nuanced as a performance person. You can't go in and say, hey, your only goal, Mr. or Mrs. Business Person, is to optimize performance, or get performance under three seconds or something. They'll laugh at you, because of course they
have a bunch of other things, right? They have features, they have, you know, obviously all the
business metrics, et cetera, that they have to
kind of manage within that performance program. But if you get a business person to really get
excited about seeing performance as one of the factors that will make them successful or not
successful, right, will make their application successful or not successful, it's amazing what
change that will do in the organization, right? So, you know, we've seen that where we'll be talking to performance people and, you know, they're great people, but they just can't get that traction.
Like, you know, they can't get the budget. They can't get the time slices that they need to actually improve performance.
But if you can get to the business person and I think we talked a little bit about this in the last podcast, but you can get to the business person and kind of show them you guys are way outside of normal. Your competitors are kind of kicking your butt in these areas, you know, in terms of performance and you
get their mind share that performance matters. And I think that's why showing behavior and
performance intersecting is important. You know, then you can actually see change in the organization
because all of a sudden, oh, the money frees up, the time frees up, the focus frees up. They want
to see, you know, kind of metrics back to them at a business level that really changes things.
And I think that's, you know, that's where we get the most excited when we start working
with customers is like, oh, okay, this is going to be a great customer to actually see
change through the organization.
So, yeah, I think that's a super important thing is, is that next evolution.
And I think your analogy actually hit that pretty well.
And, you know, the one thing I wanted to add to this was just the idea of, you know,
the multiple tools and all the data and I'll see if I can find the link again, but there's a great
article discussing the difference between data and information, right? So I use data very
specifically. And I think I discussed this with you, Andy, right? So on our Dynatrace site itself, we have some Google Analytics running, and as an experiment I wanted to export the RUM data — the performance data from RUM — from our site and try to merge it with the Google Analytics data into the Google Analytics database, so that you could do, you know, multi-analytical cross-referencing — something that a customer might do. And I remember, I think it was you, Andy, who gave me the perfect excuse to not do it. Because whoever it was — if it wasn't you — said that you cannot link the RUM data to the Google Analytics data. You can correlate, but there's no connection between that data. It's just timestamps: a home page hit registered at this time in Google Analytics, and a home page hit registered at this time in Dynatrace. But you can't necessarily say those two are the same, right? You can't marry that data. You can do some correlation of that data, but it's not directly tied. Whereas when you start getting these tools that are developed to capture all the data in one place, you have that direct — not even just correlation, but causation. You can say, here was user X,
here's their actual homepage hit, here's the slowness.
So the multiple tool approach, I think,
just has a lot of limitations in that
where you're kind of saying, well,
it appears that this is related to this
because they're in approximately the same timeframe.
And that's something I think we don't have the need
to stick with doing anymore,
because more and more tools like ourselves or anything
are capturing all the data that's required
to do this the proper way.
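A minimal sketch of why the timestamp-window matching Brian describes can only ever be a loose correlation — the record shapes and field names here are hypothetical, not actual Dynatrace or Google Analytics export formats:

```typescript
// Hypothetical, simplified record shape -- not a real Dynatrace or GA export format.
interface PageHit {
  page: string;      // e.g. "/home"
  timestamp: number; // epoch milliseconds
}

// Pair each RUM hit with GA hits for the same page inside a tolerance window.
// This is correlation, not identity: several GA hits can fall in the window,
// and no shared ID proves any of them is the same user interaction.
function correlate(
  rum: PageHit[],
  ga: PageHit[],
  toleranceMs = 5000,
): Array<[PageHit, PageHit[]]> {
  return rum.map((r): [PageHit, PageHit[]] => [
    r,
    ga.filter(g => g.page === r.page && Math.abs(g.timestamp - r.timestamp) <= toleranceMs),
  ]);
}

const matches = correlate(
  [{ page: "/home", timestamp: 1_000_000 }],
  [
    { page: "/home", timestamp: 1_002_000 },
    { page: "/home", timestamp: 1_004_500 }, // also matches -- which one was "our" user?
  ],
);
console.log(matches[0][1].length); // 2 candidate matches, no way to pick the true one
```

A tool that records the performance and the behavior of the same user action in one place sidesteps this entirely, because the join key is the action itself rather than a timestamp.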
It is an organizational challenge though, right?
Because as Andy said,
everybody brings their own tools to the party,
and then it's like, whoever has the loudest voice kind of tends to win.
I mean, it kind of goes back to the first one, right, where you need a voice of truth.
And I think that's really, really important because I've just seen, you know, organizations spin out because they're spending 90 percent of their time arguing about, you know, which tool is the voice of truth.
Right. And I think even though we acknowledge
that every tool is imperfect and, you know, every tool probably has a sweet spot of what they do
really well and what they don't do really well. At some point, I think, you know, organizations
have to say: well, it's not a perfect measuring stick — no measuring stick necessarily is —
but it's the one we're going to use, right? For this next period, maybe that period is,
you know, the next year or the next fiscal year or the next big releases. Because if you don't, I think you just spend so much time in kind of interdepartmental fighting around, you know, what is the right metric versus
actually optimizing. And I mean, I think that's, I don't know if you guys see this too, but like for
me, having done this for 18, 19 years, I tell people, you know, the biggest issue with
performance is organizational culture. It's not performance, right? Because everybody's always
asking us like, oh, you work with hundreds of customers every year. And so, you know, what's
the, you know, what's the solution? Is it this tool? Is it this cloud? Is it this technology
stack? You know, blah, blah, blah, right? Is this CDN, this tag manager, you know, what's the thing
that's going to help us be the best and the
fastest? And it's like, most technology to me is agnostic, right? It's like, well, you can
make technology really bad and you can make it really good using the same stack or the same
kind of implementation. It's all how you implement it. But really, it's the culture of performance. And
it's starting with your dev teams and your QA teams and your ops teams and kind of the misalignment across all those silos that in my, at least in my experience,
you know, that's what really differentiates top sites. And I say top in terms of performance
versus, you know, kind of everyone else that's not doing so well. So I don't know if you guys
went into that too, but I really do think organizational culture and silos is one of the
biggest challenges to
high-performing, effective sites, actually. Yeah. I see that, too. But I have maybe a different
angle on it. But let me explain it. So we all know that the biggest goal of every company is
to make money. And so if they would have a top-level goal
of saying, we need to increase revenue by 10%, then this should be broken down as an OKR,
as a measurable key objective. And if the organization doesn't have silos,
the teams talk with each other, then every team underneath the CEO, which would be every team, should figure out what can we contribute to that 10%?
Now, can we measure it accurately?
What contribution do we have?
And I believe this is the moment when all the teams should get together — when, let's say, the goal is set out for a 10% revenue increase.
And the marketing people can come in and say, well, we have some new campaigns.
Well, how do we measure it?
Well, we need some type of tool that can measure online campaign efficiency.
And then the performance team can say, well, you know, we believe that we can make your
sites faster and we want to contribute X percent to that.
And how do we measure it so that we can actually see if we really move the needle as much as
we wanted and set out to?
And then it goes on and on and on.
So I believe it really is a, as you said, it has to be a clear directive from the top down on what we want to achieve as an organization.
And then it has to be broken down into the individual teams that contribute to the business success.
That should be pretty much every team in the organization. And once they sit down, they need to figure out, okay, how can we really measure the goal
and our progress towards that common goal that we have.
And in a functional organization, I know it's easier said than done, but in a functional
organization, I would assume that people then come together. Well, one thing we need is we need to have something that allows us to measure all these things in a way so that every individual department can accurately say, this is how much we moved the needle.
And there's no question about your data is strange.
My data is better and so on and so forth. And I think if you do it from the top down and break down the goals and then also figure
out how to measure it, then you probably come to the conclusion that we want to do as a
team.
And now let's figure out which technology we're using to get this achieved.
But let's not pick individual island solutions.
I think that's hard.
Yeah, I agree totally.
I think that's just hard in practice when you have these organizations that we work with that have outsourced agencies doing part of the technology, and kind of legacy technology stacks blending with new technology stacks. What you describe is almost like a nirvana kind of idea, right?
Like the OKRs from the top and then everything else flows.
But I just think in our customer base, at least the ones we get to work with, which tend to be larger, right?
You know, Fortune kind of 100 customers.
That's a big, big challenge.
And it's, yeah, it's one that you can't solve with technology.
I think you're exactly right.
It's about kind of organizational structure and about goals and about common language
and kind of common metrics that you're all pulling for.
But that's a big one that I think interrupts the ability
for folks to really get done what they want to get done
and really develop and maintain awesome sites, I guess.
But even with the example you brought up — you have some external vendors or service folks delivering parts of your infrastructure, parts of your applications, and they may have their own tools — can't you dictate to them, or ask them, to just contribute the same metrics, or the same,
you know, use something that is relatable with all the other metrics
and then you maybe come to a conclusion that you have to kind of standardize on the same tools.
I think, you know, if I'm company X and I'm bringing in a third-party provider,
then I obviously spend a lot of money with them.
And part of that contract should also be accurate measurement that can be correlated,
or not only correlated, but actually makes sense for reporting back up to our global goals.
Because I hear this constantly that, you know, we have an external provider that sits somewhere else,
and, you know, we don't know what they do.
And they use some strange tools and they give us some reports.
And all the reports look good, but we don't really know what they mean.
But I think that's the challenge: there are misaligned incentives there, right? So the third-party provider, let's say maybe their goal is, hey, we just want to make more money, right? And, you know, do more stuff for this
company. And we're going to protect ourselves by, you know, these opaque SLA reports or KPI reports
that aren't aligned, right? And if you're not as strong, and that's from the company side,
I think you're right. If you have the right leadership and culture and kind of approach, you probably can in your contracts align those two pieces.
But I think what happens a lot of times is, oh, that contract was negotiated with procurement or with this business person who is no longer there.
And, you know, that wasn't set up very well. And now it's like this almost adversarial relationship where the vendor is saying, no, like, you know, we're providing
you these metrics to prove we're doing a good job, even if those metrics are nonsensical. Right.
And the business people are saying, oh, man, this contract wasn't negotiated very well. And
we don't even understand what these metrics mean. And we're not happy, but we can't really get out
because they're meeting them. You know, it's very much a cultural, you know, kind of
procedural issue than it is a performance issue. But yeah, I think you're right. If you have the right, you know, kind of upfront understanding and expectations, you probably can
build that in. But I think a lot of our customers just don't have that or, you know, it's, hey,
this legacy partner that's been here forever, or, you know, there's some SLA report that
they're showing us that they're doing a good job, but they're really, it doesn't really translate
into real world, you know, kind of customer satisfaction
or whatever.
So it's a big challenge for sure.
Yeah, well, hopefully, you know, with our,
especially you and your team, right,
we're not only, you know, telling our customers
that there's opportunities to make things better,
but also guide them in the way to get there.
And, you know, whether you want to call it BizDevOps
or whatever you want to call it,
I believe part of what I always feel so proud of
as a Dynatrace employee
is that we're not just a tool vendor,
even though obviously we love our tool,
but we're not just a tool vendor,
but we are, at least I see myself,
and I know that most people within Dynatrace
see themselves as a partner to our customers
in getting them to the next stage.
And that includes, you know, sometimes obviously tools,
but most often is showing them a better way of doing things.
And because we can, you know, go back and have a lot of years of experience,
like you said, you know, I have 20 years of experience in that space and a lot of our colleagues in
the service organization have a lot of experience and saw a lot of other
companies and how they transformed.
We saw our own transformation and therefore we just want to make sure that
everybody out there knows we are, we're, we're not just a tool vendor.
We are your partner that helps you
to make the next right move towards a better functioning organization and surviving in the digital age, which is what it's really all about these days.
That was a great little commercial for services, Andy.
Thank you.
I owe you a beer or something.
And you brought a tear to my eye, Andy.
It was so sincere.
It was so sincere. It's true. I know, I'm busting your chops — it's totally true, though.
We're starting to run low on time, and I think this is the last topic on here. It looks like it might be a fun one, so I do want to make sure we get to it — because I even love the way Ben put it: the scourge of the internet.
Let's talk about third-party content.
I think — unless anybody else has any objections on topics —
but this one I think we all see almost immediately
every time we engage with anybody.
So much third-party content, and I think it's just out of hand.
It is just out of hand. And I think it's a great topic. I mean,
it's one of the biggest areas that we work with our customers on and really hasn't changed over
the last, you know, 10 years. It's, I mean, it's gotten worse, but in terms of like, there hasn't
been a great solution. So, you know, for your listeners: third-party content, the way I think about it is
there's kind of three types of content, right? There's first-party, which is your content coming
from your domains, you own it, right, your JavaScript, your images, your CSS, your HTML,
whatever. There's second-party, which most people probably haven't heard that term, but we're trying
to make it popular. And that's really not your content. So it's actually a different vendor, a third party vendor. But it's critical for the render or functionality of the page. Right. So an example of that is you might have a personalization service that's provided by a different company that when I come in, it reads my cookie and it gives me some, you know, different content that's slightly personalized, right? That would be kind of an example of a second party.
I don't own it.
I don't own the technology, the hosting, the kind of operational quality of that vendor,
but it's critical to my customer or to the render or usefulness of the page.
It has a business function, right?
Yeah, exactly.
And then there's true third party, which — the simplest way I think about it — is your users could care less that this is on the site, right? So the actual customers,
they don't get any value from it, but you as an organization are getting value, right? And whether
that is, uh, you know, analytics, right? Then there's a whole host of analytics stuff, including
kind of session replay and analytics and tracking, or it's, you know,
tracking campaign effectiveness, tracking media buy effectiveness. You know, there's endless kind
of tracking, you know, in that third party group. And I think that's what becomes challenging is,
you know, business people get very excited about adding this stuff, right, because it provides them
some business value or you hope it does.
And then, you know, the technology teams kind of get steamrolled and say,
okay, we need to put this on the site.
And then what happens is, you know,
you kind of get this very bloated,
you know, site experience
where there's all this stuff
that the user could care less about
that is slowing down performance, right?
And it's been implemented in, you know,
in not the ideal way.
It's implemented too high on the page. It's potentially blocking. It's, you know,
it's not done in a very elegant way. And again, no offense to the technology teams. It's a lot
of times they're just given the mandate, like, hey, you got to put this on. And, you know,
then they talk to the vendor and it's like the vendors, of course, these third party vendors,
the thing that kills me is they always, in their instructions for implementation, are like: load us at the very top of the page, right? Load us, you know, before everything else. Well, why? Well, because they want to get their money;
they want to show that they're providing value, but that doesn't really balance the kind of
tracking that your customer doesn't care about, you know, but you care about with the actual user
experience and performance. And so I think that's really the challenge is, you know,
business gets sold these things, they get implemented poorly, they kill user experience.
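To make the implementation trade-off Ben describes concrete: a synchronous tag high in the page blocks HTML parsing until the script downloads and executes, while an async or deferred snippet does not. A generic sketch with a placeholder vendor URL — whether deferring is acceptable depends on what early-page data the tag would then miss:

```typescript
// Blocking pattern many vendors ask for -- a plain script tag high in <head>
// halts HTML parsing until the third-party script downloads and executes:
//   <script src="https://tags.example-vendor.com/tag.js"></script>

// Less intrusive alternative: inject the tag asynchronously so parsing and
// rendering continue while it loads. The URL is a placeholder, not a real vendor.
function loadTagAsync(src: string): void {
  const script = document.createElement("script");
  script.src = src;
  script.async = true;               // download in parallel, execute when ready
  document.head.appendChild(script); // does not block the parser
}

// Or defer even further: only add the tag once the page has finished loading.
window.addEventListener("load", () => {
  loadTagAsync("https://tags.example-vendor.com/tag.js");
});
```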
And it's a bummer. It kind of goes back to, Andy, what you were saying. It's like, you know,
our customers can only control what they can control. So they've done a great job in hosting
and CDNs and optimization and backend and image
optimization. And then the site still sucks. And it's like, well, why? Oh, because you have
30 third parties that you don't control that are actually slowing down performance.
And it's a major, major challenge. And it's not just media sites. And it's not just the sites
you think about that might have a lot of analytics or tracking or whatever. It's really every site. I mean, it's very rare now for me to come into a customer and not see this being
one of the top three issues on their pages — third parties. So I don't know,
it was a very long rambling introduction to that topic, but you can tell I'm passionate about it
and about kind of fixing it.
Let me stop you right there, though, because I want to address
the elephant in the room, right?
The biggest complaint you had, well, not the biggest, but one of the biggest things I heard
you talking about was this analytics thing has to be loaded first, right?
And I do want to say that's our mantra as well, right?
Now, ours serves a different purpose, but I just want a full-disclosure kind
of thing.
Let's talk about that for a second though, right?
Because when we talk about loading our RUM tag
in to get those metrics,
we're always saying it should be loaded first.
How do you see that?
Or what is the difference if you see any between that
and other types of analytics or tracking that may be,
I don't know, I don't want to put any words
in your mouth at all.
Is there any way you address that? I'm putting you on the spot here.
It's an interesting topic that is a little awkward, especially if I've gone on a rant
about third parties and then I'm talking to the customer about adding Dynatrace, right? And there's
different ways you can add our tag. So a lot of our customers will do it in a tag manager, which
then you can kind of defer it and you kind of make that tradeoff of missing some data potentially.
But, you know, deferring the load — that's called an agentless deployment. And some of our customers that don't have the full stack of Dynatrace do that. But you're right. Yeah, if you're going to auto-inject it via the OneAgent, then it will load pretty high on the page. But I think that
gets to the trade-off with all third parties, which is what is the functionality that's being
provided and what is the quality of that, of that actual, you know, tag, right. Or that,
or that integration. And that's where it becomes really interesting is you'll, you know, I think
our Dynatrace stuff, actually, we can get that beacon or get that JavaScript down pretty quickly.
It's auto-injected pretty quickly, and you can kind of show what the actual performance overhead is
versus some of these third parties. I think they're not technology companies like we are.
And they're just putting their stuff out there. And then they have no idea how bad it is. I mean,
you wouldn't believe the number of times I'll go with a customer to one of their vendors that's a third-party vendor with data and say,
this is, you know, this tag is taking a second, two seconds, or it's crazy variable, or it's
horrible in this part of the country. And the third-party vendor is like, you can just tell
when you talk to their, you know, their technology folks, they're just clueless, right? So I think,
I mean, that's a roundabout way of answering the question, but it's saying, yeah, I think you have to make that trade-off of functionality versus impact to
the customer and then watch it very closely, right? And we always encourage our customers
to monitor these third parties, right? Which you can do, right? You can see what's happening in a
waterfall and, you know, make sure that the trade-off that is being made there about functionality
versus performance is worth it, right? And so that's kind of how I would answer the question for Dynatrace.
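One vendor-neutral way to watch third parties the way Ben suggests is the browser's Resource Timing API. A rough sketch — the first-party hostname is a placeholder, and cross-origin entries expose their overall duration but hide the detailed phase timings unless the third party sends a Timing-Allow-Origin header:

```typescript
// Group resource load time by origin so third-party cost becomes visible.
// "www.example.com" stands in for your own (first-party) domain.
const byOrigin = new Map<string, { count: number; totalMs: number }>();

const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
for (const entry of entries) {
  const origin = new URL(entry.name).origin;
  const stats = byOrigin.get(origin) ?? { count: 0, totalMs: 0 };
  stats.count += 1;
  stats.totalMs += entry.duration;
  byOrigin.set(origin, stats);
}

for (const [origin, { count, totalMs }] of byOrigin) {
  const party = origin.endsWith("www.example.com") ? "1st party" : "3rd party";
  console.log(`${party} ${origin}: ${count} requests, ${totalMs.toFixed(0)} ms total`);
}
```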
But yeah, it can be an awkward discussion if I've just gone on a big rant about how much I hate third parties.
It's a bit more first party, too, if you're running full stack.
Exactly. And
yeah, criticality. I mean, the funny thing is you'll go to a customer and I'm not kidding you
almost, I would say almost 90% of customers, when we go and say, here's all the third parties that
are running on your site, no one can tell me what they are. In other words, like most people
are like, I have no idea what this is. I have no idea why this is on the site. And it's because
third parties call third parties. And really, I guess, is that a fourth party? I don't know.
But, you know, there's this kind of spider web effect. And most of the folks we deal with have
no idea what this stuff is doing. And so one of the things we suggest right out of the gate is don't start with kind of, you know, just pulling stuff off, but do an audit of is this stuff needed?
Like who put it on there?
What was the business value?
Is it still providing that business value, right?
Obviously, if you can throw performance in there, is it a bad performer, which would then make it more of a candidate for removal?
But that's a starting point is like most of our customers just
have no idea where this stuff is coming from. And that sounds crazy. And it's not like their
site's been hacked and it's like, you know, virus third party. It's just that they don't know. It's
legacy stuff. It's stuff that is loading other stuff that they weren't aware of. And that's,
I mean, that alone is a good first initiative. And that could take customers months because it's not
about, I mean, we can tell them what the third party domains are. It's about them actually finding an owner in the organization,
right? That can be really challenging, for sure.
I want to add one aspect to this, not continuing
exactly the discussion you have, but coming back to third party in general.
I'm sure both of you remember — I'm not sure how many years it was, five, six, seven years ago — the Super Bowl was always the big thing for us, to analyze the performance of websites, and obviously third party was always at the top. But one thing that I remember very vividly was the success that GoDaddy had back in the day. And I believe their story was: during the Super Bowl,
I think a half an hour, an hour before
they were airing their ads,
they were taking off all the third-party components
because their mantra was,
well, it's better to survive
and get all these people on our website
than showing all these fancy third-party components,
high-res images and all that stuff.
So let's figure out what is the bare minimum
we need for our website
and show that during the peak hour
and then switch back.
Now, obviously that was very successful,
but what I thought of this was,
well, if you can survive with the bare minimum,
why not revisit the bare minimum
and compare it with what you have right
now, every sprint or every month, every quarter, and figure out what you can shave off. Because
obviously, you are doing business just as fine, in most cases, I would assume, if you remove some
of these third parties. Yeah.
Now, I would also contribute to that because I know Ben was talking a lot about third parties
with things like different trackers and things.
But, you know, I want to throw in there like fonts, right?
We see a lot of third-party fonts being added.
Sometimes third-party CSS things being downloaded
from external places.
jQuery coming from external places.
Yeah, but I mean, fonts especially.
I mean, I've been seeing a lot of clamor about fonts lately.
And it brings back kind of to your point, Andy.
Like, what does that font, is that font really doing anything?
Is that font really helping the business meet a need?
Because you now have to load this font and re-render it.
You know, why? Yes, it's nice for things to look nice. Right.
But if, you know, I always go back to the fact that one time, way back,
one time I was able to load Windows 2000 on my PC with five floppy disks.
Right. You know what I mean? And that was pretty big in itself back then.
Right. But it's just the whole bloat — just because we have the bandwidth, just because we have the speed, doesn't mean you have to put every single bell and whistle in your site. Because that just means there's so much more potential for disaster to strike along any one of
those components.
So to your point,
Andy,
going back and saying,
what can we strip out?
What's critical?
What's not critical? And you can even collect the performance metrics afterward, right? You're like,
hey, let's pull out this font. Did that have any impact on our conversions? No. Great,
because we didn't need it. Of course, you're not going to be using console font, although I love
that. But anyway, that's just my little soapbox on that. I think it's a good point.
And I think, Andy, the idea of kind of measuring with and without is really interesting.
And we see customers where if you pull their third parties off, or they do for certain reasons, it can improve performance by 50% or more, right?
And so that, again, goes to the point of like, you know, your technology people might be doing everything they can to optimize, but they're optimizing a very small portion of what's actually driving page performance.
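One way to run that with-and-without comparison outside of a Super Bowl emergency is to block the third-party domains in a scripted browser and measure both variants. A sketch using Puppeteer — the URL and blocked domains are examples only, and a single load is only indicative, so you would average many runs:

```typescript
import puppeteer from "puppeteer";

// Example third-party hosts to strip for the test run -- substitute your own list.
const BLOCKED = ["doubleclick.net", "facebook.net", "fonts.googleapis.com"];

async function measureLoad(url: string, blockThirdParties: boolean): Promise<number> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  if (blockThirdParties) {
    await page.setRequestInterception(true);
    page.on("request", req => {
      const host = new URL(req.url()).hostname;
      // Abort requests to listed third-party hosts; let everything else through.
      if (BLOCKED.some(domain => host.endsWith(domain))) {
        req.abort();
      } else {
        req.continue();
      }
    });
  }

  const start = Date.now();
  await page.goto(url, { waitUntil: "load" });
  const elapsedMs = Date.now() - start;
  await browser.close();
  return elapsedMs;
}

// The delta between the two runs approximates what third parties cost this page.
(async () => {
  const withTags = await measureLoad("https://www.example.com", false);
  const withoutTags = await measureLoad("https://www.example.com", true);
  console.log(`with third parties: ${withTags} ms, without: ${withoutTags} ms`);
})();
```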
I will say one note of caution is I have had customers that say, well, we want to measure
ongoing our pages without any third parties, right? And they're not doing it for the way you
said, which is to sort of strip down and kind of, you know, hey, what should we add back in?
They're just like, well, that's all we can control. And so our team is gold or bonus on what we can control. And my challenge with that is like, that doesn't
exist, right? You're measuring something that doesn't exist anywhere. So your customers are
struggling, but you're reporting to your management, oh, hey, our performance is one second.
Well, yeah, but in a totally artificial sense, right? So you can take that concept too far and start to sort of
say, well, because I don't control the third parties, I just want to measure what I can
control. But the reality is your customers are struggling, right? And that's why I think it is
so important to go back to kind of the real end-customer measurement, which is RUM, or,
you know, understanding, with real user, what are your customers experiencing?
It doesn't matter that, you know, you think you only control this first party stuff.
The customers are struggling.
And that's, you know, at the end of the day,
what matters most to all of us, I think, so.
Yeah, I agree.
Brian, do you think, I mean, I know that we,
I mean, Ben, I'm sure, as you said,
you know, this is a topic that is dear to your heart.
And we went through a large list now,
but considering the time and
considering that we want to
maybe keep some additional things for the
future, because there's always stuff coming up
that we want to share. We may
want to invite you as a third-time
guest. I believe,
Brian, if you're okay with it, let's
wrap it up. Do you want to summon the Summaryator?
Do it now. I want to summon the Summaryator, yeah. Because you didn't have the Summaryator last time when you guys were chatting.
I think I tried to do one — did I try to do one or no? I don't recall.
So, for everybody — hopefully you just listened to episode 70 before this one, right? And if you haven't, shame on you. Anyway, Andy, go on, Andy. Let's bring it on.
So a wealth of information.
Thanks, Ben, for sharing.
I think what I learned is that RUM, monitoring real users, is the ultimate truth, as we said.
But I think we also learned that it's not only about individual page load times that we measure,
but it's very important to measure the user journey, the full user experience, because otherwise you're just optimizing individual
steps along the way.
But if you don't understand where users actually get lost on their track, then you may optimize
for the wrong thing.
I believe I also learned that it's very important to consolidate and to sit down with all of the
different teams that have a vested interest in monitoring different aspects of performance
and figure out how to consolidate on tooling and on metrics so that everyone that contributes
to the business success of the company has accurate metrics and understands how their metrics impact others in their organization.
So how does a marketing metric impact performance, or the other way around?
So in the end, they know what impact they have when they make improvements. And we talked about the top to bottom, you know, goal setting and then pushing
it down to the individual teams and the teams then figuring out what they need to measure and
what they need to do and what they can contribute to improving things. I know we also talked a lot
about different tooling. I mean, it's a recurring topic. Obviously, there are a lot of tools out there, and a lot of different teams bring in different tools. We believe we have a great solution to that with Dynatrace. But Dynatrace aside, I believe there are other things we learned today. At the very end, we talked about third parties.
Third parties are still a big challenge out there. Ben, you said it: every time you walk into an organization and you open up their waterfall,
you see how much content is on there
from third parties
and how that impacts performance.
We all believe it's bad
if the answer is,
well, we can't control that.
That's obviously not good
because then you're living
in your own little,
let's say, side universe
and you only see your world,
and you're not really making a big impact on the larger scale.
Yeah, I learned a lot, and that's always great.
Ben, we worked together years and years ago
when we were still working for different companies,
and we were focusing on web performance.
Now we are focusing on end-user performance.
We're focusing on user experience performance and how to optimize that.
And it's great to have you on the team, with your team,
with your experience. And great to know that our customers,
as I said earlier,
not only have a vendor that they can trust,
but also a valid partner and a trusted partner.
And we're happy to help them.
That was such a great summary, Andy.
And thanks again, guys, for having me.
It's always fun to talk to like-minded performance people.
So it was another great hour spent with you guys.
And thank you for joining us.
You're now officially a member of the Two Timers Club.
The next goal will be to be in the Three Timers Club — that's the next level up. All right, I think you have a lot of exposure to a lot of cool things that you're seeing, and hopefully you'll be able to be back with us soon and share some recent experiences or some other topics, I'm sure, that we haven't quite covered yet. I want to thank you for being on again. And I just want to give a warning to everybody.
If you do have Halloween candy and Andy "Candy Grabber" is around, look out, because he will take it.
Andy, though, in all seriousness, I hope your voice — you just sounded so down today. It kind of made me sad. And I hope —
I feel great, but it's just my voice.
I'm sorry.
I've been talking too long today.
My voice is...
You know, over the weekend,
something happened to my voice.
I don't know what it was.
I hope you recover.
I'm sure I will.
And I want to thank everyone for listening once again.
If you have any questions or comments,
you can tweet them at us at pure underscore DT
or send an email at pureperformance at dynatrace.com.
If you have any show topic ideas,
you'd like to hear us talk about,
or even if you have stories and you might want to be a guest on the show,
you can reach out to us through those means as well.
So,
so long for me,
Andy and Ben.
Thank you.
Make sure, whoever is listening — if it's before the end of January — to sign up for Perform. Perform 2019 is coming up at the end of January. And Ben, actually, you have a HOT day session, right? A hands-on training day on Monday?
I do, yep. I will be there.
Yeah, we'll do a Monday session. I'll be doing a HOT day as well. And I think — or at least, well, Andy, you'll be running around, but we'll be doing podcasting from there as well again, I believe.
So, yes, Perform 2019.
And, oh, for anybody who comes, I just want to remind you,
it will be my birthday while we're out there, so please bring me a gift.
There's no better way to show your appreciation for this podcast
than to come bring me a gift. Best gifts would be expensive bottles of scotch.
Absolutely. Yeah, got it. I'll have to remember that.
Or, yeah, anything. I'll even take a pat on the back. All right, thank you, everybody.