PurePerformance - Putting the Business into SLO Automation with John Kelly
Episode Date: November 22, 2021
What are good business level objectives (BLOs) besides conversion rates? Who is responsible for defining them? Who needs to report and who is held accountable? We invited John Kelly, Sales Engineer at Dynatrace, to answer those and even more questions. John – aka TechShady – has been helping our customers over the past years to implement business level reporting for their critical applications. It was exciting to hear that there is much more than your classical availability or conversion rate business metrics. The one we think is really exciting is Engagement Rate. So – tune in and learn for yourself!
Show links:
John Kelly on LinkedIn: https://www.linkedin.com/in/john-kelly-b22b992/
John Kelly on Twitter: https://twitter.com/JohnKelly17
Transcript
It's time for Pure Performance!
Get your stopwatches ready, it's another episode of Pure Performance.
My name is Brian Wilson and as always I have my co-host with me, Andy Grabner.
Hello Andy.
Hello Brian.
How are you?
I tried to get the audio.
Are you good?
I just made funny faces when you were talking, but nobody sees my funny faces because they only listen to the audio.
You did a good job in not getting distracted.
Or didn't you look at the video?
I did a little bit, but you're saying you made funny faces, but I couldn't tell what the difference was between that and your regular face.
Okay.
So Andy, so background, I come from New Jersey.
Andy just showed me the New Jersey state bird, which is the middle finger, because some people say "flip the bird," and that's the U.S. euphemism, but you wouldn't know that. And John Kelly thinks he's talking, but I'm going to have him all muted. He's like, haha, I'm coming, but he's not going to be able to get a word in. He could say whatever he wants right now, but I'm going to cut it. So if you want to say the dumbest thing in the world, John Kelly, go ahead and say it right now. But anyway, Andy, let's continue.
Should we really delay since we know,
since I already mentioned the name of who is going to be on here
and people have no idea who he is.
Should we just keep talking stupidly and make him wait longer and longer?
Yeah, but that means we will have long, long episodes
that nobody wants to listen to anyway.
And so I don't know.
Well, I'm sure you're going to come up with a good segue.
That's true.
Because people are waiting long, Andy.
That's true.
They're waiting long and maybe they are jumping off.
They are bouncing off our website
and going to a different podcast
that is much better than our podcast.
So this would be, for instance,
a business metric that I would track
when I look at our statistics.
How long can we keep people engaged?
No, in all seriousness, actually. Was that a good segue, Mr. Brian Wilson?
I think that's a very good segue.
Yeah.
And let's segue over to our guest today, Mr. John Kelly.
I think you are known
in the industry by a different name.
What's your other name?
Oh, yes.
TechShady. TechShady. Where does this name come from? Come on.
Because I'm standing up. I'm from Detroit. Everyone in Detroit has a rapper alter ego. So, you know, we have Big Sean, Slim Shady, Royce da 5'9". But nonetheless, I had to fit myself in there, so we came up with TechShady. And it's like, the tech part, not that I know anything about technology, but I am the shady part. So there you go.
Well, so you're just like us, not knowing anything about technology, but we like to talk a lot about it.
So, John, I think it's the first time you're actually on the podcast, if I'm not mistaken.
This is.
Not for lack of trying.
And maybe the last.
We'll see.
For those listeners that haven't had the pleasure yet to come across you, to see you in media or in person, or talk to you on the phone,
can you give us a little background, like what you do and how you ended up in the position you're in right now, which means talking to us?
Absolutely.
About, I don't know, 20-odd years ago, I started as a pre-sales engineer for a performance management company called Precise Software. A lot of the current Dynatrace people came from Precise over into Dynatrace. And then about eight and a half years ago or so, I started as a sales engineer, supporting areas like the D.C. area, spent a lot of time down in Dallas, got myself a little southern accent while I was down there for a while. And then the last couple of years, I've really been focusing on power dashboarding and advanced analytics, right? So more of a specialist role around all things Dynatrace. So how can I work with businesses to help them maximize their Dynatrace investment and deliver business analytics on top of everything else they're doing from an observability perspective, all that. So just really working to tie it all together and deliver some great content.
Pretty cool.
Talking about great content, I think your Performance Clinics and also your HOT days, your hands-on training days at Perform, have always ranked, I think, in the top positions, like top, top, right? Your power dashboarding sessions have always been extremely well attended and well received. I know you don't need more praise, but I think it's well deserved. But no, it's really fantastic what you're doing. And before we started the recording, we chatted a little bit about how you're constantly in very close interaction with our product team, telling them what you think, what people need out there when it comes to business analytics. And this is actually the topic that I would like to talk more about.
I remember Klaus Enzenhofer and I did webinars together about BizDevOps, where we talked about tying in business metrics and aligning everything to business when it comes to making technical decisions as well. And we came up with the term BLOs, business level objectives, because we thought it makes a lot of sense. The world talks about SLOs, service level objectives, but if we try to align them with business objectives, let's call them business level objectives.
How do you see this?
I mean, when you talk with people out there, what are business KPIs or business level objectives?
Are they, you know, who defines them?
What are good examples?
Who is responsible for them?
And also, who is held accountable in case you don't meet them?
Excellent question, Andreas.
And it's actually really a simple but yet complex answer. Right, couldn't be more confusing. It's simple in the fact, when you start out, because it's a business level objective. So: business. And if you really strip that apart, the main thing, and sometimes the only thing, a business is concerned about is a conversion, right, or conversion rate, right? So every application has a conversion goal or multiple conversion goals, and it could be something as simple as, hey, I'm going to go to Andy Grabner's Performance Clinic page and download the latest clinic, and I have steps to take to get there, and once I complete the download, I've converted. We think of it also traditionally in e-commerce, right, where I go to a site, I search for a product, I add it to the cart, check it out, purchase it. And it could be, you know, anything. A lot of different things where I'm just trying to submit a claim or transfer funds.
So every business, every application, is designed with a function in mind, or multiple functions. And so each one of those is a conversion goal. So when the business has a conversion goal, you know, that's the easy part, because it's like, hey, what are those goals? Like, for example, my easyTravel application: it's purchase travel, right? That's my conversion goal. Now, the hard part is everything that supports the application, right? So you look at all the infrastructure, just everything that's out there from infrastructure to code that's supporting my easyTravel application. There are tons of metrics, right? Millions of metrics that come into play. The hard part is, and I'll make this statement, I'll probably make it a few times throughout the podcast:
Every metric has some form of relationship with a conversion goal or multiple conversion goals.
Bearing that in mind, the challenge is to find which metrics are the ones that have the most deviation, or most exceed their baselines, and then understand that relationship to the conversion goal. And I'll give an example. Maybe there's this application that's running, and I start to see a drop in my conversion goal. At the same time, I see a spike up in, say, CPU IO wait, right? So logically, if you think about it, if CPU IO wait increases, that means users are waiting on the CPU to become available to process something as they move through a typical user journey. The more they wait, probably the more chances of them exiting and abandoning and not converting.
So there's a relationship there.
And when you go to define the business level objective, you have to start to look and say, hey, what is my conversion rate when we have no CPU IO wait? And then maybe, what is it at half a percent, 1%, 2%? You kind of keep going up as you go.
This is kind of like the role of a data scientist, right, who starts to consume this data, because the goal is to find out, where is that breaking point? Where do I start to see my conversions trail down? And once I start to understand that, I now understand the relationship between a metric and my conversion goal, right? And then I can establish my BLO based upon that metric, right? Say, for example, at 3% CPU IO wait, I start to see a deterioration in conversions. So maybe I set it at 2%, 2.5%. And then once that BLO is exceeded, I know that's something I have to go address, because the worse it gets, the more conversions are going to be impacted on the back end.
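To make that concrete, here is a minimal Python sketch of the breaking-point analysis John describes: bucket sessions by the CPU IO wait they experienced and watch where the conversion rate trails down. The session fields and bucket size are assumptions for illustration, not a Dynatrace API.

```python
from collections import defaultdict

def conversion_by_io_wait(sessions, bucket_size=0.5):
    """sessions: iterable of dicts like {"cpu_io_wait_pct": 1.2, "converted": True}."""
    totals = defaultdict(int)
    conversions = defaultdict(int)
    for s in sessions:
        # Bucket each session by the CPU IO wait it experienced.
        bucket = round(s["cpu_io_wait_pct"] / bucket_size) * bucket_size
        totals[bucket] += 1
        if s["converted"]:
            conversions[bucket] += 1
    # Conversion rate per bucket: the bucket where the rate drops sharply
    # is the breaking point; set the BLO threshold a bit below it
    # (e.g. 2-2.5% if the drop shows up at 3%).
    return {b: conversions[b] / totals[b] for b in sorted(totals)}
```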
So I've got a couple of questions here, but I'll start with the one that you just highlighted, because with business level objectives I always talk about metrics that the business understands, but you're talking about IO metrics. So that means you're saying a business level objective can actually be on any type of metric, even if it's technical, as long as you know this metric is going to have an impact on the business, which is conversion rate, for instance, right?
Right.
So now the tough question is, and you mentioned data scientists are maybe needed, how do you select those technical metrics?
Yeah, no, good question.
Sorry to cut you off, but this is my topic.
Fire it up here. Let's go.
There are things where you can set just pure business level objectives, like on conversion rate. And the kind of things you would typically see from a business perspective: number of sessions, right? Different things, traffic to the website, all that. And if that's exceeded, the problem is that you maybe don't have the right context. When you start to marry the data, the metrics from the technical side, into the BLO, then it becomes a little bit more business-aware.
Now, the challenge is that there are a lot of metrics. So that's why you need things like AI, you know, to be able to manage and process that, auto-baseline and everything. AI does a great job, for example, of understanding dependencies. This metric is dependent on this host. This host has a dependency with this app server. This app server has dependencies with all these services. And it does a great job of farming that out, so that when this metric starts to exceed its baseline, it has an entire dependency map to know what could be impacted.
And then you take that one step further, where you make your AI more business-aware. And I always challenge a lot of customers and say, hey, again, every metric has a relationship with that business goal, that conversion. And once I can start to tie that together, it makes sense. So the AI is going to alert me to those metrics that typically are more volatile, right? And I start with those. And then there are always the usual suspects. I look at, say for example, a user journey; back on my easyTravel application, I have four steps, right? Hit the homepage, do a login, search, and purchase. And so I keep a sharp eye on those steps, because those are typically the four steps people take to get through to purchase. And I look at, hey, is there anything in terms of any one of those steps where duration might have a negative impact on that user experience?
Of course.
Like if I try to log in and it takes 10 seconds, I might not wait. I might leave.
Okay.
So now there's a relationship between long logins and conversions, right?
Or maybe there are error rates; JavaScript error rates increase. And so I need to understand that relationship. And that again feeds into the BLO, right? Because now I have it on my critical key user actions that are part of my user journey, and I understand everything from duration, largest contentful paint, error rate, and so forth. You kind of go down the usual suspect metrics. Those are a good starting point. And then AI is going to come back and say, hey, for this application, maybe memory is very volatile. So then I'm going to start to say, well, what is the relationship between memory and conversions? Always tie it back to conversion, which is what the business is really most interested in.
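As a rough illustration of those "usual suspects," here is a hypothetical sketch that checks each journey step against per-metric BLO thresholds. The journey steps echo John's easyTravel example, but the metric names and threshold values are invented, not taken from any real configuration.

```python
# Illustrative only: evaluate "usual suspect" metrics per user-journey
# step against assumed BLO thresholds.
BLO_THRESHOLDS = {
    "duration_s": 3.0,                    # step should complete in under 3s
    "largest_contentful_paint_s": 2.5,
    "error_rate_pct": 1.0,
}

def evaluate_step(step_name, metrics):
    """metrics: dict like {"duration_s": 4.2, "error_rate_pct": 0.3, ...}."""
    breaches = {m: v for m, v in metrics.items()
                if m in BLO_THRESHOLDS and v > BLO_THRESHOLDS[m]}
    return step_name, breaches

journey = {
    "home":     {"duration_s": 1.1,  "largest_contentful_paint_s": 1.8, "error_rate_pct": 0.2},
    "login":    {"duration_s": 10.4, "largest_contentful_paint_s": 2.1, "error_rate_pct": 0.5},
    "search":   {"duration_s": 2.3,  "largest_contentful_paint_s": 2.0, "error_rate_pct": 0.1},
    "purchase": {"duration_s": 1.9,  "largest_contentful_paint_s": 1.5, "error_rate_pct": 0.4},
}
for step, metrics in journey.items():
    name, breaches = evaluate_step(step, metrics)
    if breaches:
        print(f"BLO breach at '{name}': {breaches}")  # e.g. the long logins
```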
Which brings me to the next question. There was actually a note that I took at the very beginning, and I'm just reading off my notes: you said conversion rate. You brought the example. One could be, does somebody go on our website and actually download our podcast, right? That's a conversion. Or obviously, purchase, transfer funds. What about, let's say, non-classical applications, where no conversion immediately comes to mind?
So I'll give you an example, Dynatrace, right?
We have a software that people are using.
What is the conversion rate in Dynatrace if we want to figure out if we are actually doing
a good job?
So, you're talking about people who go visit dynatrace.com?
No, I'm talking about the product itself, right?
The product itself.
Yeah, our users. Like, let's say, let's assume I'm a product manager at Dynatrace. What should be my conversion rate metric? What's my conversion goal?
You know, it depends on what you're trying to achieve, right? So say, for example, in the product there are a lot of times where it prompts you to say, hey, there's a new feature, and you click to enable it, okay? And that alone could be a conversion goal, right? Because we're pushing out, maybe it's just a new page, right? Like the problem page, right? So it shows all the problems, and there's a new visualization for that, because you'll have more filters and so forth. And you promote that right up at the top: it says, hey, click here for this new feature, and also provide feedback. So now I can see two conversion goals. One of which is, we can see how many users are actually launching Dynatrace and how many are actually clicking on that feature. And then, how many are actually providing feedback?
Because if it's a great feature,
and if it's kind of lighting the world on fire
and it's doing phenomenal,
I guarantee you're going to get a lot of feedback.
And it's going to hopefully get a lot of positive feedback.
And so those are kind of different conversion goals that you can sit back and monitor and manage.
How much feedback are we getting?
Yeah.
So it's also, I mean, sorry, Brian, go ahead.
No, no, you finish your thought.
My thought was, this is perfect, because this is, I think, something we need to get into the heads of everyone out there. When we are defining new features, I think one of our conversion goals has to be feature adoption, feature interaction. And then, as you said, feedback is another great one, one of the steps in the user journey, because we want them to try out a new feature, use it, hopefully successfully, but in the end, definitely give us feedback. And it's like a conversion funnel. But the conversion funnel for me, because we always talk about e-commerce so often, is always to put something in a shopping cart and then check out. In this case, though, it can be applied to any feature in any software product, and here it's about feature adoption and then really getting people to give us feedback. And I think this is something that hopefully some of the product managers are listening to. I know we obviously have internal ways of measuring and monitoring feature adoption at Dynatrace, and we use this to drive decisions. But I wanted to highlight that conversion rate does not necessarily have to be buying something on an e-commerce store; it can be applied to any software feature that is implemented. And maybe this is what I'm saying now: if you don't know what your conversion goal is, then you should not even release something, because then you don't even know if you're successful with it.
It's hard to measure what you don't know.
Yeah, but I wonder if we're confusing some ideas here. There's the idea of feature adoption.
Now, maybe it's my understanding of this, so I'll state what I was thinking. It sounds a lot more like you're talking about feature adoption, Andy, as opposed to, to me, a conversion is something where there's a process, there are multiple steps to get from point A to, I guess obviously, point C. So again, using the Dynatrace product as an example: the Data Explorer, right? People being on that page is using that feature, but that's not a conversion. Conversion would be more like, hey, we want to make sure people are using the Data Explorer and putting those on dashboards. So the conversion then wouldn't just be a heat map of how many people are on the Data Explorer page, which is people using that thing, because they might just be wanting to look at data and all that, but more like how many people are doing that and then putting that on a dashboard, which is that multi-step process of click, click, click, click, and then have a final outcome. Because otherwise, you know, we see that there are other tools to show you heat maps and what features people are using, but there's not that process.
You know, I always think about the business level objective as a business process. And to your point, yes, it doesn't have to be commerce related. It could be, as John was saying, creating an account, looking up a value, uploading documentation, right? But there's a process where you have to go through several steps, for which you have to take into consideration UX design, so that users know how to use it. You have to take into consideration performance on the front end, performance on the back end, ease of use, and track whether or not people are getting through that entire process. And if they're falling off, where are they falling off, and is it related to usability or is it related to performance? At least that's the way I've always looked at it, and I'm just wondering if we're blending the two there.
Well, you are, and I think they both have valid points, right? Because there may be certain goals in development. I develop this new feature, and I really want to make sure that people are enabling it, right? And that might be just that; there's not a lot more feedback other than the fact that, hey, I put this new feature out there, and I saw that over the past month we've had 10,000 people enable this feature. Okay, so that's some good feedback, to know that you're getting the volume of people that are at least trying to enable it. Now, to your point, Brian, what makes better sense is, are they actually using the feature as we designed it, and are they easily able to, like for the Data Explorer, find their metric and visualize it and put it onto a dashboard, right? And so then you start to track those steps to say, okay, yes, we have 10,000 people that have now enabled this. How many tiles have been built with this, right? And then, what was the experience every time someone goes to build a tile? Because the idea behind the new Explorer is to improve that user experience, make it better than what it was before, where we might have been a little bit clumsy; things are organized better. And then you can start to analyze that, saying, yes, I can start to see a massive improvement, because we're measuring that user experience, and they seem to be having an optimal experience now whenever they deal with the Explorer and add a particular tile to a dashboard.
Yeah.
So it sounds like a conversion should, at least in my definition, and I wonder if there are others, have at least two steps. Because you're starting at one point and getting to a second point. And whether or not you made it to that second point is where the conversion is, as opposed to just adoption, which is: are people on that page using it? Which is a very, very valid business piece as well.
And that should have goals. You know, I think, Andy, to what you might have been getting at a little bit with the idea of, if you can't define it, you shouldn't be putting it out: if we go way, way back to our interview with Goranka, there was the idea that before you put out your feature and check it in, you define what its performance criteria are going to be and what its adoption rate is going to be for it to be successful. The idea of adding into that, you know, what is that conversion or adoption rate, putting that in there, clearly being able to define: what does adoption mean, what does conversion mean, as you're putting those new features in. I guess that would probably come more from the product manager, right, as opposed to the developer, at least the adoption part. That's something that should be in some way or form checked in with that feature.
Yeah, absolutely. And you're 100% correct, Brian, in that two steps is like the minimum. And for this feature, I would look at two steps: they basically hit the Explorer page, right? And then they clicked on the button to enable the new feature. And so it's two steps, because then I could say, like my example before, 10,000 people clicked on that over, say, 30 days, right? How many people didn't? How many people went to that Explorer page and continued to use the old Explorer, right? And then you see maybe it's like 100,000, and then you think, whoa, wow, we have far more people ignoring that feature than actually using it.
Why?
What is the reason behind it?
And that's where we get into maybe looking at the behavioral aspect of it. Is this something where maybe it's not intuitive to them that they should be clicking that, or they don't know enough about how good that feature is? Maybe the explanation there doesn't tell them that this is a life changer, right? It's going to save you from making these clumsy mistakes. It's a lot easier to create dashboards with this newer feature than without. And so if you can promote that, have that visible, then that should in turn drive up those conversion rates, because you understand the behavior.
You try to understand the reason why people aren't enabling it, and that's where you start to drive those conversion goals and the understanding of it.
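A hedged sketch of the two-step adoption funnel discussed above; the step names are illustrative, and the counts loosely echo John's 10,000-versus-100,000 example rather than any real Dynatrace numbers.

```python
def funnel_rates(step_counts):
    """step_counts: ordered list of (step_name, sessions that reached it)."""
    rates = []
    for (name, count), (_, prev) in zip(step_counts[1:], step_counts[:-1]):
        # Share of sessions that made it from the previous step to this one.
        rates.append((name, count / prev if prev else 0.0))
    return rates

funnel = [
    ("visited_explorer_page", 110_000),
    ("enabled_new_explorer",   10_000),   # the 10k-vs-100k example
    ("provided_feedback",         800),
]
for step, rate in funnel_rates(funnel):
    print(f"{step}: {rate:.1%} of the previous step")
```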
Cool. Hey, maybe a quick discussion: who is responsible for defining these metrics? Who is held accountable in case they're not met?
Who makes sure that these metrics
are captured and evaluated
in an automated way? Because otherwise, this
doesn't make sense at all. What do you see in your world? Because I have
my theory about it.
Yeah, it kind of depends on the company, right? So in a lot of larger-size companies, we see these data scientist roles really starting to take over more and more. And the reason why is because the data scientists typically have their hands in all these different pots, right? So they're working with operations, development, the business side. And so they're trying to bring all that together and mirror it. So they're understanding and building those types of relationships between a metric and what the business cares about, like conversions and so forth. And so they work with those teams to be able to establish that.
The business, and anyone that has any kind of stake in the application, which should be pretty much everybody from development to operations all the way through: they're the consumers of that. Because ultimately, if you go back to the feature example, somebody is responsible for that feature. And so they want to measure it, but they also want to understand the behavioral aspects behind it as well. And so the data scientists will connect those dots, but the stakeholder, or whoever is responsible for the feature, will manage and monitor that.
So the more you can automate, obviously, the better. Now, in very small groups, sometimes they put the onus on operations or so forth, or maybe on your monitoring champion within an organization. A data scientist can be very, very neutral, right? Because they work with all groups. But when you get to, say, someone who maybe is more on the operational side, and they just happen to be managing the performance solutions, and then it comes to creating business KPIs and business SLOs, or BLOs, sorry; sometimes it's maybe an operational person, and the issue typically with that is: are they truly unbiased? Because what if it's that CPU IO wait issue? Are they really going to want to potentially expose something that they've recommended that's undersized? So we see a lot of times, it just depends on the organization, that you maybe run into a biased view of it as well. But the more you can remove that and put it into a neutral role, like a data scientist, or someone who's not really directly responsible for development or the operations side of things and is kind of that conduit between all of it, that's where things tend to excel. And they're much, much more successful at being able to create those BLOs and KPIs, manage those, and ultimately put those into some form of reporting or a dashboard, so everybody that needs to know can consume them and have that information available to make the right decisions.
Coming back to your cool advanced dashboards, and the reason why people love you so much in the HOT days: because you're really building beautiful dashboards that can be understood.
Thank you.
Hey, I know it's been unfortunate that through the whole pandemic we couldn't travel that much, so we didn't see each other as much as we used to. That's why I didn't get a chance to actually talk to you about what happened in the last 18 months in terms of any new metrics that you've seen popping up, any new trends, any new things. What do cool, important business level objectives look like these days, and how has this changed over the last couple of months and years?
Yeah. So, you know, I deal with metrics and dashboarding every day, deal with customers, and I'm always getting a lot of good recommendations from customers. And it really starts to go when you start to combine things together. Because again, like I mentioned, metrics are all related to conversions, but metrics are related to each other as well. And so you start to think outside the box a little bit. One of the jokes I mentioned earlier to someone else, when they asked me to define my role because they didn't really understand the space: I told them I'm a quarter accountant, right? I'm a quarter CSI crime scene investigator, right? I'm a quarter therapist, and I'm a quarter artist, right? Because from an accountant point of view, I have to crunch a lot of things; I'm dealing with numbers all the time, right? And from the CSI perspective, I work with customers where there was a crime scene, or what we call an anomaly. So I go in and basically work with them to understand what caused the potential issue. You need to be able to gather that data, because that's valuable data. The past is always a good indication of what the future holds, and fully understanding that is how you establish those data points and bring them out, to be able to monitor that so that thing doesn't happen again. And then the therapist part, right, is because I need to understand those relationships. I need to dig deep into how this is related back to the business, or how these two different types of metrics are related together, what makes sense in terms of being able to create maybe this new type of metric. And then the artist part comes in, because then I've got to put it up onto a dashboard that makes sense, where everyone can look at it and understand it, right? So some of the metrics that we've come up with, that we evolved: one is where I look at stability. What is stability, and how does the stableness of an application impact conversions?
And so I'll go back to my user journey, right?
And I look at each of the different steps from home, login, search, purchase, and so forth.
If I look at just say back to login, we looked at that earlier, how stable is login?
And how i determine stability
as i look at the maximum and minimum response time and i determine what is the range right and
ideally a very stable environment is going to have the same login response time and so in that
range should be very close to zero that's the perfect world right and when i now i slice that
metric right because I have stability.
And I say, what is the stability of login
for users that convert?
What is the stability of login for users that abandon?
And what is the stability for users
that actually exit the site at the login step?
And I start to compare that.
And then I might see a scenario where users that exit the site
have 120 second range. their stability is 120 seconds
where users that continue maybe it's like three seconds okay and then i know that's hey i've
identified the issue because now what that's telling me is that people are leaving the site
are leaving because they are some long running logins. There's these outliers, right?
And I need to be able to then quickly go and identify and isolate those outliers and figure
out exactly why did it take three, four minutes for the login page to load for these users
because every single one of them exited the site.
They abandoned, right?
But more importantly, they left the site, right?
And so stability is one.
And it's something, you know, you always look at metrics like, hey, what's the max duration,
min duration, average duration, 90th percentile, all these different things.
But when you start to combine some together, you create this new metric I call stability.
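Here is a minimal sketch of that stability idea, assuming login response times can already be grouped by what the user ultimately did; the data layout and numbers are made up, with the exit group reproducing John's 120-second range.

```python
def stability(response_times):
    """Range of response times in seconds; 0 would be perfectly stable."""
    return max(response_times) - min(response_times) if response_times else 0.0

# Login response times grouped by what the user ultimately did.
login_by_outcome = {
    "converted": [1.0, 1.4, 2.1, 1.2],
    "abandoned": [1.1, 3.0, 5.2, 2.4],
    "exited":    [0.9, 45.0, 120.9, 2.2],   # the long-running-login outliers
}
for outcome, times in login_by_outcome.items():
    print(f"login stability for {outcome} users: {stability(times):.1f}s")
```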
Another one, right?
You asked for metrics.
Yeah, I have metrics, right?
It's engagement rate.
All right.
Now, when I look at an application, and let me go back to my easyTravel application, I say: how long does it typically take someone to move through their user journey from the homepage to purchase? And pad a little bit onto that. So say, for example, we'll give them three minutes. They should be able to work their way through that journey, to search for travel, select it, add it to the cart, check out, purchase, and they're done. All right. What is the number of steps required to get through that user journey? In my case, I have four, but there are some other steps I'm just not too interested in. So say it's like 10 steps.
Now, the first query I run is to find out how many sessions have entered my user journey. Okay, so I get my total funnel count, or session count. And then I say: how many sessions spent more than three minutes on my site and had fewer than 10 actions? Because to me, that's an engagement issue. If it takes them more than three minutes to get through that funnel, something's not right. Maybe once they search for travel, they have trouble figuring out that they've got to click in and add it to the cart or do something, right? Maybe it's not intuitive. And so I flag that: more than three minutes, fewer than 10 actions, that's not a really engaged session, right? And so I track that. And now I do the math, right? The number of sessions that I determined were not engaged, divided by total sessions, and I get my engagement rate.
Then, all right, I do the same analysis. What is the engagement rate for users that convert? The engagement rate for users that abandon, and users that exit, right? And I see how different it is, and then I can see if something pops up. And it's the same type of analysis for errors, duration, all that. It's all about understanding the metric and then figuring out the behavior of the user, whether they convert, abandon, or exit, and is there a delta, is there a difference? If there is, then you've found the area. In my case here, it's like, I'm an outdoors guy from the Detroit area. It's like, now I know where I need to go fish, right? That's where the fish are, right? There's a big lake out there, but by being able to play out some of these metrics, I can hone in on exactly what the issue is.
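A small sketch of the engagement-rate math as John lays it out: flag sessions that spent more than three minutes but performed fewer than 10 actions, divide by total sessions, and compare the ratio across convert, abandon, and exit. The session fields and sample numbers are invented.

```python
MAX_SECONDS = 180   # John's three minutes
MIN_ACTIONS = 10    # John's ten actions

def not_engaged_rate(sessions):
    """Share of sessions flagged as not engaged; John calls this ratio his
    engagement rate. sessions: dicts like {"duration_s": 200, "actions": 6}."""
    sessions = list(sessions)
    flagged = sum(1 for s in sessions
                  if s["duration_s"] > MAX_SECONDS and s["actions"] < MIN_ACTIONS)
    return flagged / len(sessions) if sessions else 0.0

by_outcome = {
    "convert": [{"duration_s": 150, "actions": 12}, {"duration_s": 90,  "actions": 10}],
    "abandon": [{"duration_s": 240, "actions": 7},  {"duration_s": 150, "actions": 9}],
    "exit":    [{"duration_s": 300, "actions": 4},  {"duration_s": 400, "actions": 3}],
}
for outcome, sessions in by_outcome.items():
    # A big delta between converting and exiting users shows where to "go fish".
    print(f"not-engaged rate for {outcome} sessions: {not_engaged_rate(sessions):.0%}")
```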
Hey, John, with this idea, this engagement rate, this is very enlightening. And my question about it, though: it sounds like you're being put in a position to figure these things out, to come up with these new metrics. Is there any industry guideline from this point of view? Do we have a nice, good set, or is it Wild West, where people have to create their own? I mean, because what you're doing, and I think it's really amazing, is you're saying, hey, let's take a look at some of these metrics and see if this ties back to anything, because then we know that there's a relation. And when you talk about even user engagement, right, how many of those people might have just been distracted? It's very soft data, if you will, just by the nature of it, because we're dealing with the actual end user. It's not the computer; it's not ticks on the CPU clock. Someone might stop and pick their nose for a little bit and then get back, and that starts throwing some of the metrics off. But we can't weed those out, right? They're in the fold. There's nothing we can do about that. But as you're going through and finding new ways to analyze data, new ways to try to find correlations between, okay, we saw this all happen and the conversions went down, so it could be a possibility, let's look into this: is it all up to, let's say, the data scientists, or the accountant in your role at that point, or actually that would be the CSI at that point, to make these things? Or has there been a tried and true thing of, here's your hit list to start with? Like if we think, Andy, of Steve Souders, right, the hit list for browser performance. Is there a hit list of great places to start with this? Or is it really just going in and, obviously you're trying to standardize some of it, but are you able to work from something, or is it all just Wild West at this point?
You try to come up with a standard, right? So Google, for example, came up with the Core Web Vitals, right? Like largest contentful paint, right? They published these different metrics and how they're calculated. Industry solutions like ours adapt to that; we provide that as a metric that we can look at. But when I start to play with and create these terms, stability, this engagement rate, and so forth, and combine those, there are some guidelines that are out there, right? It's not completely Wild West. You can do enough research to find that, for the typical e-commerce site, if I look at just, say, a page load, if the page load takes more than, say, seven seconds, or whatever the number is, the chance of people leaving is significant, right? So there are a lot of good metrics like that. And so sometimes I take a look at that particular metric, the length of a page load, and then I sort of say, okay, well, how would that relate to this min-max stability metric that I have? Because there are some things that I can pull that are out there, but there's not a place where someone's advertising that an application's stability should be X, right? So in that case, we just kind of have to feel our way out.
And so, you know, to me, it's like, did you guys ever follow the NBA? Rasheed Wallace from the Detroit Pistons would always say, ball don't lie, when he'd complain he got fouled and the person missed the free throw. He would say, ah, ball don't lie. I say, data don't lie.
And so we've been working with a lot of customers, and we start to deploy this out there. We start to look and say: this stability, is it really making sense? Are we actually starting to see that where we do have a wide range, it's because of what we perceive, that users are going to abandon at a higher rate as that stability range goes up? And so we start to measure that. And we have a lot of customers that are using this, and we get a lot of good feedback, a lot of good data. And so that's kind of how we justify it. For some of these, like engagement, there's no one site where I can go and figure out what a typical engagement rate is for, you know, basically Andy Grabner's podcast site or something like that. You just have to feel it out. And it also requires you, like, where do you come up with three minutes and 10 actions? You have to work with the people who know the application, right? They'll kind of have a feel for what we have in mind.
A user, I look at it as kind of the reverse of Vegas. Vegas is, the longer you stay in the casino, they want you there; you tend to lose more money. Applications are the opposite. The longer you stay on the site, the chance of you converting probably goes down. You can measure that, and you can validate it: is that a true assumption? And then feed that into your engagement equation, because it follows from there.
And just to wrap up on that one: if a listener, or I, anybody, wanted to start doing this stuff, who would be the best person to talk to in the organization? Or what would that role be? A project or product manager may or may not know; maybe the business team. Where would they go to find out what we should be looking at in our industry, whether it's commerce, whether it's banking or whatever? What are some of these business-related metrics that make more sense to track in different industries? Who would they go talk to, to collaborate? Any ideas on that? Or am I stumping you there?
Again, it goes back to an application owner, but what we're starting to see is the C-level. They're really starting to understand the importance of it. For example, I can't name the grocer, but the grocery business is starting to change, right? And I look at a grocery store like Blockbuster, right? Because everyone used to go to Blockbuster to get their videos, and then suddenly this thing called streaming came along, and they all went away. Grocery stores are kind of going through this revolution right now. It came on because of COVID. People were hesitant to order groceries online and have them delivered, or go pick them up, because they didn't trust, right, that they would get the right type of produce. Me too, actually, and they're also our customers, so you've got to support them. But anyhow, I just figured I'd get all these bruised bananas and all that kind of stuff. So when COVID hit, nobody wanted to go into grocery stores; that's where everyone was going, because that's the only place that was open, kind of the worst place to go. So you do order online. And I realized that I got some pretty darn good produce. I had no problem. And if I ever had something wrong, they just dropped a fresh one off on my porch. It's like, this is perfect, right? And so now the grocery business is realizing that that is the future, right? So now, when they look at it, what they used to measure before is different, because now this whole ability of different grocery store employees going down aisles and scanning and building and fulfilling your order, that needs to be measured and monitored, right? And that direction for this grocery chain came from a very high level, at the C-level, saying we need to manage and monitor this like we never have before, because that's going to be the future of our business. And so they're establishing those business level objectives, those KPIs, as to what a successful online ordering or grocery delivery business model looks like, and what things we should measure as we continue to go forward. And so that's why we then work with those types of grocery organizations to help them, right? Because it's a journey. Everything's a journey, like Bilbo Baggins' journey.
Yeah, I gotta say, talk about songs: the Bilbo song, yeah.
You don't know the one, Andy.
Leonard Nimoy sang...
Well, hey, I do...
I don't know if that answers your question, but I do want to flip the tables.
Can I flip the tables, if we have time?
I think Andy knows about this a little bit.
I talked about it earlier.
I think so. I think I know what's coming.
I have a rough idea. If we can make it work in five minutes,
that would be awesome.
In five minutes. Okay. So what we're going to do is we're going to play a game. Okay.
So I'm going to say the name of the game, and it has nothing to do with show tunes from The Sound of Music.
Okay. So you're good, Andy. The name of the game is Name That Metric. Okay.
So the category is revenue dashboard, right?
So Brian, you've seen a lot of revenue dashboards,
some that I've created out there.
And so this metric, you gotta guess the metric
and I'll give you three clues, all right?
And go through it fast, we got five minutes.
So the first clue is that this metric is,
I think it's the single most important metric
from the revenue perspective.
Conversion rate.
You said conversion rate.
What did you say, Brian?
I thought there were three clues.
It's a revenue dashboard.
I'm giving you one.
You get a guess, and then I'll give you another clue.
And that's how it kind of works.
It's interactive.
It's the most important to what?
What is the most important metric that I should put on my revenue dashboard?
This is revenue, right? So I have easy travel.
Cart value.
Cart value. Okay.
Kind of getting close. Andy was maybe a little closer. Okay.
Well, I couldn't steal his answer. I had to try to open a path.
Okay. All right. Well, here. Okay. So number two, as I go through here,
let's see.
Outside, you know, I've created a lot of these revenue dashboards.
So outside the dashboards I created,
I have yet to see this metric used in any revenue dashboard.
I haven't seen it appear other than the dashboards I've created.
So you want to take a second guess at it,
or should I just go to the third clue?
Maybe it's something to do with average clicks per revenue,
something like this, like how many clicks.
Because this is also related to the first question, right?
Yeah.
So Andy was closer.
It's closer to conversions. Andy was very close with conversions.
But it's not.
Maybe it has to do with the conversion rate based on engagement.
It has to be an engagement rate metric,
because I really am fascinated about the engagement rate discussion we had.
So it has to do something with...
I know, yeah.
So Andy, you're getting close.
Conversion duration?
Because we have time.
Getting closer.
So, more importantly, this metric will identify the root cause, in my experience, about 60% of the time. Just looking at that metric, I knew exactly what the issue was.
An initial load of the first page that people get to, because otherwise they'll bounce off.
You're close. It's a revenue dashboard, right? So if they bounce off the first page, then I'm not going to have them on the revenue. So Andy, you're close with conversion rates. So, onto the final round here. Let me paint a picture and describe it. I have my revenue dashboard, right? And users read dashboards left to right, top down.
And so you put your most important content up on the top, right? And the dashboard should,
to me, it should really do three things. Is everything good? Are you trending in the right
direction? And is there anything that I need to do, including biz operations, anything we need to do right now?
And so that revenue dashboard should tell that.
So my revenue dashboard, and I'll just focus on the top.
Maybe there's some other pie charts and crap I put on below.
But just on the top, I have my sum of revenue.
So for my time frame, for like, say, two hours, I'll put up my sum of revenue.
And let's just say it's 5,000, right?
And then what I also do is, typically right below it,
I say, what is the revenue from the previous timeframe?
Because I need to understand the trend.
Are we trending in the right direction?
And so I'll put it there and say, for example,
on my dashboard, that shows 10,000, right?
And then I do things like, you know,
a couple more metrics, I have risk revenue.
What is that?
That's like revenue where the user's experience was not satisfied.
So they're tolerated or frustrated, but they still converted.
They generated revenue.
But it's a risk because they may not do this again because they have a bad experience.
And then lost revenue is abandoned carts.
And so we track that and we show the current timeframe, previous timeframe, those metrics run across.
So now with those metrics, do I have everything I need? Am I missing something?
And, you know, the answer is, yeah, I'm missing something, right? Because if I were to tell you,
Andy and Brian, that, hey, currently in this timeframe, so these last two hours, we did 5,000
in revenue, but the previous two hours we did 10,000. You'd say like, oh my
God, something's wrong. My hair's on fire. I need to fix this, right? But you're missing a key metric,
right? And, you know, and the same story I could tell if I just replaced revenue with claims,
right? The number of claims processed, number of risk claims, number of people who abandoned a
claim, right?
But I can also do the same, and Andy was kind of hitting on the point here, with a conversion dashboard, right?
So I can do conversions, risk conversions, and loss conversions or abandons, right?
And what is one of the most important things I said up front? And Andy, you kind of had it right. You said the conversion rate.
Conversion rate.
That's the most important thing that businesses are looking at; the number one metric is the conversion rate. Now let's flip it to revenue. If for conversions the most important measure for the business is the conversion rate, what is the most important metric for revenue?
It's the ratio between revenue at stake, basically your lost revenue or whatever you called it, towards the actual revenue.
I think you pretty much went around it. It's called revenue rate, right? Now, I have it on my dashboard; you rarely see it on another dashboard, right?
And that's the sum of revenue divided by session count.
So if you go back to my example: currently I have 5K, previously it was 10K, right? And everyone thinks their hair's on fire. But now, with a revenue rate metric on that dashboard, I can see that the current revenue rate is 100, right? So that's about $100 per session. And the previous revenue rate, hey, it's 100 as well. What does that tell me? By looking at that one metric, I can see that, ooh, we just had a 50% drop in traffic, right? And there's nothing bad going on with my app, my infrastructure, or user experience, because my revenue rate is consistent: 100 for both. The volume two hours ago was significantly better; this time frame, it's 50% less. So I need to go talk to my marketing team to better promote and advertise the website to increase traffic. Problem solved. Without the revenue rate, you would have been digging in, like, oh, John told us about all these relationship metrics, and I've got to go find engagement rates and all this. And those are important.
But when the revenue rate is showing something that's consistent over time, then it's just traffic.
And it's going to run.
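To recap the game's math in code: a minimal sketch of revenue rate as the sum of revenue divided by session count, using the 5K-versus-10K example. The session counts are assumed so the rates come out at $100 per session.

```python
def revenue_rate(total_revenue, session_count):
    """John's metric: sum of revenue divided by session count."""
    return total_revenue / session_count if session_count else 0.0

current  = {"revenue": 5_000,  "sessions": 50}    # last two hours
previous = {"revenue": 10_000, "sessions": 100}   # the two hours before

cur  = revenue_rate(current["revenue"], current["sessions"])
prev = revenue_rate(previous["revenue"], previous["sessions"])
print(f"current: ${cur:.0f}/session, previous: ${prev:.0f}/session")
# Both come out at $100/session: revenue halved because traffic halved,
# so go talk to marketing; the app and infrastructure are fine.
```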
I'm really sorry.
John, you guys finish this up, but I learned a lot of new stuff. In the end, if you teach me at least one new item today, then it's a win situation.
I think I learned more than one thing today.
Awesome.
I want to say thank you.
I'll talk to you soon.
You guys wrap it up.
Sorry to have to jump.
John, I'll wrap it up.
Bye-bye, Andy.
Thanks, Andy.
John, I think this was fascinating, but I've got to run too.
It's funny because you've been bugging me forever to come on.
I think there's more I want to talk to you about.
I do have another meeting right now, though, so I think maybe I'll talk to Andy and we'll definitely have you back.
Because I think, and just to give a preview for our listeners and for you, I think a lot of this ties into or expands into the SRE world.
Because when you think about site reliability,
it's mostly about, is the site available and running.
But then what that's not doing is looking at any of the ideas of the conversions and the user satisfaction and all that other stuff.
And I think that ties directly into this.
There's a lot of crossover and a lot of ways that both of those can be used
with each other, but also have similar subsets.
Anyway, my mind was racing when you were discussing all this. Fascinating stuff.
Yeah, and I think my last parting comment, I know you've got to go and I'm going to try to keep you as long as I can, is this: the second you can basically make your AI business-aware, where you don't have to manually create and look at these different relationships between the technical data and the business side...
Yeah.
When AI can actually effectively do that, which is not there yet but will be soon, it's really kind of game over. And that's where people are marching towards. And so between now and then, we either feed the AI and try to make it smarter, or provide metrics and dashboards to show you: it's this, it's engagement rate, it's revenue rate, it's this issue. And the key to all this is collecting all this data.
This all comes back to observability, and not just backend observability. We talk to customers about this all the time, like operations teams that just want to know how their servers are performing. Well, how about the code that's running on those servers? And how about the end users that are engaged with the code that's running on those servers, in that environment, wherever it might be?
It's all feeding into the same thing at the end of the day, it's all to feed the business.
So yeah, very, very important stuff.
Yeah.
I think we definitely want to have you back. There's a lot more we can talk about. It's funny, because you've been fighting to get on the show for a while. To be honest, it's not that we've been resisting; it's just that it was always, hey, what do you want to talk about, and then you would never come back. So we opened the floodgates, and hopefully our listeners enjoyed this.
Well, thank you for the questions.
Yeah, if anybody has any questions that they want us to bring up or talk to John about
for the next time we have him on,
please tweet them to @pure_dt, or you can send an email to pureperformance@dynatrace.com.
Thanks to you
very, very much, John. We look forward to
talking to you again and thanks for all the work you're doing
with us. Thanks for having me. Appreciate it.
Thanks, everyone. Thank you very much.