PurePerformance - The hitchhiking guide to load testing projects with Leandro Melendez
Episode Date: August 16, 2021

"Because 9 out of 10 load testing projects fail due to ignorance and outdated thinking about load testing!" That was the answer Leandro Melendez, aka Señor Performo, gave us when we asked him why the world needs yet another book about load testing. In too many projects, Leandro has to remind and educate decision makers and practitioners about load testing best practices, how to ask the right questions, and how to approach a project from start to finish. His book "The Hitchhiking Guide to Load Testing Projects" is a fun and edu-taining read for people that are new to the trade as well as seasoned performance engineers. For more content from Señor Performo, check out his PerfBytes Español podcast, his Spanish YouTube channel, and all the performance engineering presentations he has given over the years.

https://www.srperf.com/podcast/
https://www.srperf.com/el-youtube-channel/
https://www.srperf.com/presenter/

Pre-order the book here: https://www.amazon.com/dp/B09C4ZT1LB
Transcript
It's time for Pure Performance!
Get your stopwatches ready, nuevo episodio de Pure Performance.
Mi nombre es Brian Wilson y como siempre me acompaña mi compañero Andy Grabner.
For English, press 1.
Hello everybody and welcome to a new episode of Pure Performance.
My name is Brian Wilson and as always I am joined by my co-host Andy Grabner.
Hola, Andy.
Hola, Brian.
I'm completely amazed.
So did you...
I practiced.
I practiced.
How many practice sessions
with Duolingo
or whatever app you use
have you taken?
That was just Google Translate
and probably about
two weeks now.
About 15 minutes.
I just kept saying it
over and over
to make sure I can get it.
So but it means that you pressed once so we're in English now. If I press dos,
we're back in Espanol?
Sí, pero el episodio muy short.
So I'm still very happy that we started in Spanish. Hopefully we didn't lose a lot of people that were thinking they have clicked the wrong way.
I'm listening in America.
I want English.
Exactly.
But now the question is, probably everybody's asking, what is wrong with those two?
Are they loco?
Or why are they speaking in Spanish?
Do you think it has anything to do with our guest today?
It can't.
There can be no reason it has to.
I don't understand the connection you would be making there, Andy.
Yeah, no, no, I don't think so.
I just... I was just feeling feisty, you know. So, okay. Okay, good. Well then, thank you for listening in.
That was the show. It was a little shorter than expected, but I told you: short. Exactly. Okay, now
let's get to business. Leandro. Hola, amigos.
Hola.
Hola.
Very happy to be here.
Hola, indeed.
And yes, very loco here.
And very pleased, very happy to be joining.
Hola a todos en español.
And hello to everyone that is hearing in English.
Yeah, let's keep it pressed in one for English for now.
But later we can say adios
or some other goofy things
and have some more fun.
Pero creo que es tu primera vez
en Pure Performance, ¿correcto?
Sí.
Sí.
So it's the first time for you
in Pure Performance,
hopefully not the last.
I didn't practice this.
And we didn't say,
you just said Leandro.
So let's say who Leandro is.
Like Leandro Melendez, right?
Señor Performo.
Because people still like Leandro.
Who the heck is this guy?
Leandro DiCaprio?
That is something that I have still to go like, I am Iron Man.
No, I'm Señor Performo. And because, believe it or not, at the beginning,
some people did not link this face and name with Señor Performo.
So I had some conferences where I was near the person that I was tweeting to
and like, you're SeƱor Performo?
That's also why I have the mustache and everything.
That makes it more recognizable.
It's good that you mentioned and pointed at your face while recording a podcast,
because nobody will see it.
But we took a picture early on.
We'll use it on social media.
But fair point.
Can you quickly introduce yourself for those folks that have kind of followed
what we've been doing over the years and still haven't stumbled across your name?
Well, as we mentioned, I am Leandro Melendez,
also known in the internet and some testing conferences
and videos as Señor Performo.
I am an IT engineer, a performance tester,
have been for the last 20 years, a little bit over that.
I like to blog about testing, performance, IT. I have a blog.
I collaborate with the PerfBytes team on PerfBytes en Español. That's why you were
listening to so much español. Sí, muy bueno. And as well, I have a couple of YouTube channels
where I generate content about performance testing, in English and in Spanish as well.
I do some conferences, some talks here and there and some other things that I am going to be introducing soon.
Exactly. And we want to make sure that we definitely post the links to our summary, because I know we have a lot of listeners that probably prefer maybe Spanish
as it is their first language.
So it's great to know that, Leandro, you are providing content
for performance engineers in Spanish.
So very much appreciated that you contribute to the community,
like we all do when you do it in Spanish.
And I remember now, this is not the first time
that Leandro has been on Pure Performance.
Because at the last Perform we had,
he co-hosted a few episodes en español.
That's true.
That was our first Pure Performance en Español.
Very good.
And Leandro, obviously, we know each other from PerfBytes as well.
I used to do Ask PerfBytes, and you were a guest of mine many times.
So I always appreciated that. So what brings you... why do we have you on the show today even? Like, what the heck? Why are we talking? Why are we talking to you? Aren't you
a competitor or something? There's no competition in performance. Yeah, I know. I don't think so.
No. Well, I have some news and some things that I have been working on for some years now.
And I am super excited to come and try to talk about and get some interest around a book.
I wrote a book about performance testing, specifically load testing. And, depending on my publisher, how fast he gets
to work that out, it's already ready to be on your favorite... on the New York
Times bestseller list. Number one, I really hope so. Well, on the testing-times bestseller list, hopefully. I don't know, it'll be on Amazon.
So we are planning to put it in the Amazon platform pretty soon.
We will do pre-orders and we're going to be selling the book, titled The Hitchhiking Guide to Load Testing Projects, which is a fun guide, a fun
way of explaining all the topics, all the steps, all the levels, if I may say,
on how you should do a load testing project. Awesome. And so at the time of this release, it may or may not
be up there. But if not, people should
just keep checking on Amazon. But also I imagine there'll be an
announcement on the PerfBytes website. So yeah, check
perfbytes.com. They'll make an announcement when that's
going to be up there.
Awesome. All right. Well, thanks for being on the show.
I was kidding.
Thank you very much.
In all seriousness,
Leandro, I was
very pleased when you sent me your initial draft and asked
for feedback.
I was happy to contribute. Hopefully it
helped the final version of the book.
What I would be interested in is,
you said you've been doing
load testing for 20 years,
or at least we have been
in the space for 20 years.
Brian and I,
we've also been in the space
for many, many years.
We've been talking about
performance engineering
for a long, long time.
And I think since,
at least for me,
since I figured out how to write blog posts
and maybe get something out on YouTube,
we also started to educate people on the practices.
Why did you still write a book?
Why do you think there's still a need for a book like this?
And then also, why do you think people should buy
and look at your book?
What's special about it?
And also, who is the target audience? Oh wow, so many questions, but I'll try to answer
in one answer, yes... no. I mean, the reason for the book, to start with one of the questions: as for the few that do not know, I'm a consultant.
And over these last years, I have been showing up with many customers that need performance testing services, specifically load testing services.
And that's another topic, the difference between performance and load testing. But often a situation that I walked into was to have to talk to management,
have to talk even to testing teams about what we were about to do. Why did I need it to do some
things? There were several technicalities. There were several C-level people that I had to explain
what we were about to do. And I kept losing them: falling asleep, staring out
the window in the meeting room, and waiting for the recess to start to be able to run out of there.
And I had to come up with creative ways to engage the audience. Like, how can I explain it to
someone that has never truly thought about
systems performance? And I started to come up with examples and very, like, I was like a director.
What... how can I engage them? Hey, do you have a dog? Do you like puppies? Most of the people
answer yes. So from there, I started to generate some fun examples, analogies, stories that
would help me to explain to them what we were about to do. And from those examples, generally
they were excited, like: finally, I got you. I know what you're gonna do. And I got that aha moment several times. But several times I had to repeat the same story
several, several times, till I said, like, why don't I write this up? On one hand, for people
that are not in the trade and want to more or less understand what is going to happen,
and at the same time to help my coworkers, some of whom may not be so creative with weird examples or relatable
things that can be used to explain, or even for themselves to understand. I have found several
colleagues where, internally, I wonder, like, do you really understand why we have to do this
or that way? So that everyone can get a better approach. So I would say
people foreign to the trade would have a blast reading the book, and people starting in the
performance testing world would get a good understanding, some ideas, best practices, that's
another one. And for seasoned performers, at least they will have fun.
They will have some ways or examples
to explain the trade.
And hopefully if they have some gaps,
they will be able to identify them,
polish them and figure out
some of the best practices.
Yeah, I was just looking up the email
that I wrote to you on April 3rd
when I had this first chance to read through half of the book.
And I want to confirm what you said, because I said it's a great, entertaining and edutaining read.
Because it was really, as you said, you made it fun to read the way you drafted the stories, the analogies.
So I can really recommend once it's out for everyone, check it out.
I also agree with you.
It's great for people
that are kind of new to the trade
and want to understand
why do we do all this?
How do we do it?
You know, what's the purpose of it?
And hopefully also
for some of the seasoned ones,
encourage them to also maybe,
you know, get into the mood
of trying to educate people with what they have learned
over the years. And hopefully you can also inspire more people to talk about their experiences.
Yeah, I remember too... as you were describing it, it made me think
about something, and I'm like, oh, that's right, that was part of my feedback to you:
you know, I think a lot of people who go into performance get tossed into it. I had some training.
I got taught by it by someone way, way, way back.
Thanks, Maria.
I know you're not in the business anymore, but thank you.
But I was learning how to script.
And then I was on my own.
And every time I went to a job, I was like, oh, who else is the load testing here?
Who do I talk to?
Who do I coordinate with?
Who do I learn from?
They're like, you.
I was like, so I have to figure it all out. And as I was reading your book, I'm like, holy cow, if I had
this when I started, it would have been a whole different world. You know, at one point I met Mark
Tomlinson, and he gave me some pointers and tips and helped guide me a little bit. But it's always
been... and I don't know if it's still that way; I imagine it probably is in a lot of places, because
even though we all love to think performance is the number one priority everywhere,
right, I'm still sure it's a lot of: oh, we still need to check that box.
Let's grab this person. You, you do our load testing, right?
And you've got to figure it all out. So, again, not to gush over your book so much,
But I think for that getting started area, at least from that point of view, it's fantastic. And of course I agree that it's a great way to help just even a seasoned person.
Like explain, like I always say, like Andy, you,
you had that whole bit of how do I explain what I do?
And you did that whole slide deck with the camera and like,
how do you explain it to your parents? I think it was way back.
And this book in a similar way is like,
how do you explain to other people what this is so they can understand it?
And so I think that's just really important.
And in a similar vein, along the lines of what you mentioned there: I don't know a performance engineer that, when they were a kid... like, one said, I'm going to be a fireman.
Another, I'm going to be an astronaut.
I'm going to be a police officer.
And the other, I'm going to be a performance engineer.
No.
Most of us were just thrown at the trade, and like, yeah, you're now a performer,
figure it out. And, a bad practice as well, why I wanted to also write this book, is because when most of us are thrown at performance and load testing, it's like: go there, script, and slam the system. Like, okay, what steps?
And several bad practices that I saw or I experienced, and even myself, I did several
times.
The book has several personal learnings about how to improve what you are doing.
And most of us that are thrown in, we have no idea.
We just script all the functional test cases
that you find in the test case repository.
That was my initial, yes, sir.
Your orders are my commandments, right?
But then we learned, like,
why am I automating functional scripts for load testing?
Is that the best?
And little bits and things like that,
that I try to make sure that everybody understands.
And again, even seasoned engineers,
I have found still doing that after several years.
I think the whole topic of reusing functional tests is actually something where we probably can get into controversial arguments, the two of us. Because I think I replied to you in the email and said: we actually just had, a couple of weeks prior to me reviewing the book, Roman Feistel from Triscon also on the podcast, and he was actually
talking and explaining how they found a way to kind of generate, from all the functional tests,
fully automatically, some load tests, and then run them. And I understand that we all have
different backgrounds and different approaches to it, but I could see that this can be a
controversial thing. In your book, at
least, it inspired me to tell you that maybe some other people think differently, and that's
also good, right? Because a book like this should... it's your experience, and it should spur
some conversation. No, and it's something where I give some explanations about that,
because I think, yeah, functional tests,
functional automations are very useful
for performance in general but
probably I would do it differently if I'm doing
the big and wide load tests
where, if I don't have other routes,
well, those are the automations that I have.
But as an
example, as a synthetic,
continuous triggering,
I love to have those functional tests happening here and there, so that we can get the metrics.
But that's also part of the distinction that I try to do with the book.
This is for load tests. I like to call it assurance because you're not only doing testing, you're figuring out things, you're trying to polish,
you are alerting, capturing events,
all that we do in the general performance trade.
But for load tests, in general,
we want to trigger the process that is most important
and generate the load.
And it depends as well on the scenario.
I'm with you.
Functional test,
I'm not going to do negative functional automated tests in a load test, but maybe others may be very relevant and useful. And I repeat this in the book with like a good question always comes
with the answer, it depends. And in performance, that applies like crazy.
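Leandro's "synthetic continuous triggering" idea — reusing a functional test as a lightweight scheduled probe to collect timings, rather than as load — could be sketched roughly like this. The probe callable, sample count, and interval here are illustrative assumptions, not anything prescribed in the episode:

```python
import time
from typing import Callable, List

def run_synthetic_probe(probe: Callable[[], object], samples: int, interval_s: float = 0.0) -> List[float]:
    """Run a reused functional check repeatedly and record each duration in seconds.

    Unlike a load test, this fires ONE user-like action at a time, purely to
    collect timing metrics, the way a scheduled synthetic check would.
    """
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        probe()  # e.g. a reused functional step: log in, load a page, check out
        timings.append(time.perf_counter() - start)
        time.sleep(interval_s)
    return timings

# Stand-in probe; a real one would exercise the system under test.
timings = run_synthetic_probe(lambda: sum(range(10_000)), samples=5)
print(len(timings))  # 5 duration samples, ready to ship to a metrics backend
```

The point of the sketch is the distinction Leandro draws: the same functional script can generate load (many in parallel) or metrics (one at a time on a schedule), and those are different jobs.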
I think what we've seen in many conversations, Andy
and Leandro, and we've had some of these before on Ask PerfBytes, is that
context is always important. And if you go into
performance... let's just use the broad category of performance right now, I know we're more
about load here... if you treat performance as a scientific field, if you will, right, it's about
knowing what your different formulas are, what your different use cases are, all the different conditions,
and then matching those to what you need to do. And a lot of that, I think you cover in the
book: the idea of, what is it that we're trying to achieve? What kind of a test am I running? Is this a performance test? Is it a load test? Is it a stress
test? Is it a capacity test? Am I doing unit testing on microservices? Whatever that might be, you have
to know what that is, and then you have to have that vocabulary, those tools, those formulas, if you
will, to be able to execute what it is that you need. So all these things can be valid in the right use,
Because it really is a much broader practice, right?
As we've said plenty of times before,
a load test is a subset of performance and there's a bunch of different
variations on all those different things, but that's the continuing study.
That to me is at least the exciting part about the entire performance field,
is that it is so broad.
There are so many different things you can tackle.
And there are so many opportunities to learn different concepts
and figure out how to apply them.
And learning all those is the real fun part.
Hey, another question that came up, and I think, I believe at least Brian and I gave you some similar feedback on this.
The book is amazing to explain newcomers to the field why we're doing this, what this is all about.
You're covering a lot of your own experience. But I think in the last couple of years, at least, we've moved to, let's say, new approaches to performance engineering and load testing, especially in the DevOps field.
And I think you even said that you could probably write yet another book on kind of where the industry is heading and where some organizations are already heading.
Is the next book already... do you have it already planned out, or what's happening
next after this one? So, yeah, this is something that I got some feedback on, I got some
questions, and personally, when I was done with the book, I was like, oh man. To give you some
context, I started writing this book, let's say, about eight or so years ago, when those were the main practices; we didn't have that much agile, DevOps, continuous... none of those buzzwords or processes were even implemented.
And it took me that long that, well, I started to wonder: is this not relevant anymore?
Should I just abort the book and start another one?
But nowadays we still have projects that are not agileable or continuous,
that you still need to do things in the waterfall way.
On one hand, that's a few of the reasons why I kept going with this book.
The other one is most of the performance engineers, as I said,
are thrown at start scripting and do these things.
And it's important that they understand some of the principles,
why we do some things in certain ways, and be able to translate. And last, I think there are still several projects
that even they have scrum meetings,
daily stand-ups, all this,
but in essence, they are still waterfall.
They have a big bang release
and they have to do this automation before release
and they just want to test for load.
But that's a very good point.
And it's more or less a closing that I do on the book that agile methodologies almost mess and mix it up.
Everything that is stated on that book is different.
Most of it.
I mean, not everything.
There are several steps where you need to change your mindset about how you do performance testing. And it's ironic, because
lately my rant and motto has been that performance testing is not only load testing,
or they are not equal. And it's ironic that I just go and write a book about load testing. But yes, it's still
relevant. I think most of the people still need to learn and understand that.
But that gave me that opening
for the difference, how you should do performances, how you
should think about performance in these Agile continuous
days. And yep, it's in the works. I will do a similar
explaining with fun examples, with lots of references, pop culture and relatable things
so that people can correctly jump on the agile, continuous wagon and start delivering, or even requesting,
to their teams, like: hey, why are we doing these things? I read in the last book, and they
said that it's not so relevant anymore... well, with the new book. So yes, it's still
relevant. It's still important for everyone to learn all that the book has, not just because I want to keep pushing the book,
but it truly was a moment of personal doubt and questioning myself.
Should I push for this?
And I asked a few people and they were like, yeah, that's still very, very relevant
and people still need to know it.
But yeah, beware.
There may be, I hope to not to take that many years
and that now the next methodology is over
when I'm done with the next book.
But yeah, and if things go well,
it might be a trilogy about performance matters.
So you have so long and thanks for all the tests.
Oh yeah, it's an endurance test for myself to keep pushing and writing.
And I was already writing the next book in my head.
I don't know if it's the next hitchhiker book,
but one of them,
if it's going to be a hitchhiker trilogy,
it's got to be four books though.
You know?
Oh,
okay.
We can,
we can push it.
Four book trilogy.
Yeah.
Performance at the end of the universe.
Would it be?
And what would be the other one?
So Long, and Thanks for All the Tests. And I forget the last book... yeah, yeah, I forgot that one. So obviously, we want to encourage people to take a look at the book,
and I want to ask you another question. So you said you are, you know, a consultant, you help a
and i want to ask you another question so you said you are you know a consultant you help a
lot of organizations you are brought in to do performance engineering,
performance testing, load testing.
Based on your experience,
why do you see performance projects still fail?
Not those where you were involved,
but what are the reasons
why some of these projects fail in the end?
And what is in the book that would prevent this
if people would actually read the book?
And obviously don't cover the whole book,
but maybe you can highlight two or three things where you say,
hey, you know, I wish everybody would know this exactly.
And if they only read one chapter of the book or two chapters,
this is what they should do.
This is what they should read,
because then this will lead to more successful projects. Well, it's, uh, it made me think a lot.
It's very kind of you to say that the projects that I'm involved in do not fail.
I would be lying if I agreed with that, because at times... probably I wouldn't say they
fail, but they don't achieve the best outcomes that they could from best practices.
And I would say that's part of what I see as some of the issues that appear when the customer, the client, the organization is adamant about following some practices, as much
as I keep telling them and other testers, like: this is not the right approach. Again,
we should not be automating, changing this dropdown box 10 times or typing text because
that's functional. We should not automate some of those. We should not automate triggering batch
jobs. Those are scheduled. They are already automated in a way. We can just ask someone,
use your resources better. And some of those practices, I try to clarify and recommend,
again, not only a manager, a C-level person, but the consultant to be able to say, hey,
here's an example of why this is not the best practice,
why this will not give you the best outcome or will not provide any performance-related
benefit to you.
You are just testing... to quote a real-life example: you just want to have 5 billion users
connected to the system, even though in reality you have 10.
Why?
Because there are 10 billion registered,
but they are not active.
They are not inside of the system.
Why do you want to simulate that?
You're not even making them do anything.
And some other weird requests that we keep getting
from the customers,
and I think the book will give a lot of added value, because the customers that we manage to convince, by showing why and what is the benefit of doing things the way that we recommend, or in general what I would say is the right way to do it... when they understand it, apply it, and see the results,
they see that we detected lots of things.
And that rings true for what you guys are doing as well.
When you understand that it's not everything automation and slamming the system,
if you implement a good monitoring system, and if you get metrics organically, if you get all this information... performance is not only about automating and
slamming the system. And there's an area where I explain, like: hey, step one, monitor. If you
can put in an APM, if you can put in some tracing, telemetry... things so that, when you are doing other testing steps, functional, manual, even
unit could be, you can get some of those metrics, and testing is different.
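The "step one: monitor" point — get timing metrics out of whatever testing you already run, before writing a single load script — can be illustrated with a toy tracing decorator. This is a stand-in for a real APM or telemetry agent; the `checkout` function is a hypothetical example:

```python
import time
from collections import defaultdict
from functools import wraps

# Minimal stand-in for instrumentation: once code is wrapped, every
# functional, manual, or unit test run yields performance metrics for free.
METRICS = defaultdict(list)

def traced(fn):
    """Record the duration of every call so metrics accrue as a side effect."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            METRICS[fn.__name__].append(time.perf_counter() - start)
    return wrapper

@traced
def checkout(items):
    return sum(items)  # placeholder for real business logic

checkout([1, 2, 3])
checkout([4, 5])
print(len(METRICS["checkout"]))  # 2 timing samples collected without any load test
```

A real setup would delegate this to an APM or tracing library rather than a hand-rolled decorator, but the design idea is the same: instrument first, and every existing test stage becomes a source of performance data.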
So I think, from experience and from what we see with organizations, ignorance and being tied to, let's say, given ways of thinking
or outdated processes or practices,
I think is what is hurting a lot organizations
and what is making performance testing projects,
if not to fail, probably, or I would better say,
they don't get the best results that they could get
from their efforts.
And most of the time with those customers, it's like, really, really, you don't want to do that.
Really, this is the fifth time I explained to you.
But in the end, you're the customer.
You're paying for the service that we need to provide.
Do you really want it?
Put it in writing.
CYA for me, you requested this and we point out the risks,
why you are not going to get the best results in this way, why we think. And here are the risks
that you may still encounter. And I would say nine out of 10 times that happens. And
on the bright side, because we did all that due diligence, we get more business back. But I would say, and that's a big
thing that I try to achieve with the book, that people understand and attach to best
practices and get the best benefits from their testing efforts. I will now quote you on
that... I wrote this down for my summary.
Ignorance and outdated thinking are the 9 out of 10 reasons
why load testing projects
don't give you the results that you expect.
I mean, that's a very strong argument
and I think it's a very good one
and a good reason for people buying a book.
I don't mind being quoted with that.
Yeah, no, it's good. I stand by it. That's perfect. So can you just quickly... the projects that you do,
do they range from very small companies to large enterprises, or is it typically large enterprises that bring you in? How does that work in your line of work?
So, well, and again, it has been several,
well, we can say decades.
On this, we have gone from small, not too small,
but medium-small companies to S&P 500-sized companies
that have massive systems,
massive people interacting.
I could say one of the largest ones that I remember
and had the most fun video game, online video game,
that we had like 500,000 players playing against each other
on a platform.
Automations that had to be,
have some, not AI, I would say,
but some sort of intelligence
that would play well against each other.
And even systems that are homemade,
that you have to figure out ways.
So we get experience of all sorts of customers
and all sorts of the ones that are like, yeah,
this is waterfall, we need your process like that. We are... and I was about to do it, but it's a podcast...
quote unquote, agile or continuous, where we can tell them: you are not, yet. And here's why,
here's why this process should change. And even some others that, as I mentioned earlier,
they may be getting close to truly being
agile and continuous, but they request from us
a waterfall performance test: we
just want to know the response time of our system.
Well, don't you have the metrics already?
No, we need scripts for that.
And sometimes my bosses are like, don't tell them, don't tell them.
You don't need scripts for that.
We don't need to create them.
We can help you.
We can configure a good APM, give you some metrics, some dashboards,
alerting so that you're aware of your performance.
And you don't even have high utilization
or too many concurrent users.
You'll be fine with just monitoring.
You'll be fine implementing telemetry in the dev environment
so that you know at the dev stage.
And many other recommendations.
But yeah, we go from projects from tiny
to several thousands of users concurrently playing with the system.
How often do you test in production?
Depending on what type of test are you referring,
because my impulse is not as much as I would like to.
Many users are still hooked up with that stigma that, oh, no, you shall not touch production.
You shall not pass over here.
And many others are having those differences or misunderstandings. The times that I have been given a pass to test in production, it's
because there's no test environment that is sized similar to production where they can do
a test. But they don't need a capacity test, they don't need a load test or a stress test, they just
want to know the performance, and to be accurate-ish, I would say.
And that's another one where I'm like, you are not concerned.
You just want to measure response time under not even a stressful load
or a production load.
You can use your test environment here.
You just need to capture the problem areas.
You need to filter out.
But, on the other hand, in production environments,
I would like to do some load tests in most of those environments.
And in many others, I highly recommend to the customer: hey, you have a decent, I don't know, blue-green process.
Why don't you... instead of asking me, and again, some of my bosses are
like, hey... that was, I wouldn't say good business, but that was business. But I'm like, that's not the best
practice. The best practice: make your users do the testing. Release one version, have a
parallel production environment, measure that, do the performance and load test.
Because I think that's another urban legend
that is not true anymore.
Load testing doesn't necessarily have to be automated anymore.
You can have beta testing groups,
you can have... what was the name of... canaries.
So you mean... so basically what you mean is: you don't need a testing tool to generate synthetic load. You can, in some cases already, if
the architecture allows it, use real user traffic to do your performance testing and load testing.
Yeah, most of the time. And with some of those recommendations, with some customers,
when we come up and tell them that, they're like:
what? You don't want to? Why shouldn't I automate?
I already bought the tools. I already have these.
And on the other hand, that's where I'm like, wait,
I didn't say to skip the tools completely.
You can still create some good scripts that will not be a load test.
You can have them as synthetics,
let them trigger every hour, every day,
every 30 minutes, and get a measurement.
Make sure that you analyze whether there's a deviation that concerns you.
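That synthetic-measurement idea — a small scripted check that runs on a schedule, records response time, and flags a deviation from the recent baseline — could look roughly like this sketch. The function names, the example URL handling, and the three-sigma threshold are assumptions for illustration:

```python
import statistics
import time
from urllib.request import urlopen

def measure_response_time(url: str) -> float:
    """Time a single scripted request, like a synthetic monitor would."""
    start = time.perf_counter()
    with urlopen(url) as response:
        response.read()
    return time.perf_counter() - start

def deviates(baseline: list, latest: float, threshold: float = 3.0) -> bool:
    """Flag the latest measurement if it sits more than `threshold`
    standard deviations above the mean of the baseline samples."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return latest > mean + threshold * stdev
```

Run the measurement every 30 minutes (cron, scheduler, whatever fits), keep a rolling window of recent samples as the baseline, and only alert when `deviates(...)` comes back true — that's the "deviation that concerns you" part.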
Lots of these shifting-right methodologies
for performance, which as well is part of the trilogy or, I don't know, quadrilogy, how do you say it, that I'm planning or thinking of doing.
Because many of those, I don't know how to segment, and that's a good point.
The switch left, switch right... shift, sorry, I keep switching, pun intended.
Those terms are important. They're part of the new and recent methodologies,
and it's something where many organizations need to loosen up
and get rid of the fear. Because, in most situations...
and don't take this and run and say that I said this:
if you don't have good release practices, you probably should not be testing in production.
But most of the time, if you did your due diligence, please start doing it.
And besides just the general user traffic in production, as we were discussing a few episodes ago,
who was it, Andy, who was talking about
you're not load testing
unless you're testing in production.
Hasse Wielstra.
He was basically saying,
shift left is a lie.
Yeah.
We came out to the idea
that, yeah, there's shift left cases,
which is more performance-based.
Load is a bit more production-based.
But to your point,
you can have your real users in production,
but when you have those tools,
you can use them
in your production environment to test above your current traffic
levels.
What's your capacity?
What can you ramp up to?
If you're going to have a sale or whatever, and you're going to add 20% users, well, that's
where you do that.
But having your base users in the system at the time is phenomenal.
It's a phenomenal environment to test it in.
Yeah.
And he also made a great point that he said this is a great way to
make people think about SLOs
because basically you have
a certain expectation that the system, even
under 20% more load,
behaves in a certain
way. And then you need to think of
what is the expected
way? Well, let's define an SLO, whether it's
on a response time, on a failure rate, on throughput, whatever it is,
and then define it in production,
then add additional load on it
and see if the system still behaves within the SLOs,
even under higher than normal load conditions,
which can happen, right?
If some, as you said, like a flash sale happens
or something, whatever, who knows.
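Defining SLOs on response time, failure rate, and throughput, then checking them while adding extra load on top of real production traffic, could look something like this sketch. The targets and the dictionary structure are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical SLO targets -- illustrative values, not from the episode.
SLOS = {
    "p95_response_s": 0.5,    # 95th-percentile response time must stay below this
    "failure_rate": 0.01,     # at most 1% of requests may fail
    "throughput_rps": 100.0,  # must sustain at least this many requests/second
}

def slos_met(measured: dict) -> bool:
    """Check measurements (e.g. taken while driving 20% extra load on top of
    live production traffic) against the SLO targets."""
    return (
        measured["p95_response_s"] <= SLOS["p95_response_s"]
        and measured["failure_rate"] <= SLOS["failure_rate"]
        and measured["throughput_rps"] >= SLOS["throughput_rps"]
    )
```

The point Andy makes maps directly onto this: first agree on the targets, measure them in production as a baseline, then ramp load above normal and see whether `slos_met(...)` still holds.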
But as you say, those are very specific scenarios
that you want to figure out, that you want to filter out. And that's another thing that I tried
to make a point of in the book as much as I could. There's no silver bullet in performance and load
testing. It's not a single test. If you have this risk scenario, test for it and figure out a way to trigger it. If you have that
other one, do it another way. Turn your system upside down and see if just that microservice blows up
or if the whole ecosystem will have issues. There are several mixes. Don't think anymore that it's
just, yeah, I run a load test and that's it, I figure out everything. I keep saying that every load test or performance test in general
is like looking for a needle in a haystack.
That bottleneck becomes the needle.
And if you try to do too much at once, you have several haystacks
and you're trying to find those needles on there.
And keeping a little bit with those analogies,
the performance testing area, the whole universe of performance testing is like medicine.
But for you to be healthy, you don't just go with the dermatologist, to give an example.
You also have to go to the dentist.
You have to go with the orthopedist.
And if you have severe issues, you go to the specialist on those different areas.
Same with performance testing.
You don't just do a load test here.
You don't just do shift right.
You don't just do shift left.
My recommendation would be: try to implement as many of those practices as you can, or as are relevant.
Because there are some others that may not apply to your particular circumstances.
As Brian said, context is super important.
That comes back even to, hey, we want to go to Kubernetes.
Well, why?
We want to do a load test.
Why?
Everything is always about context.
What's the goal?
Once you can define your goal, no matter what it is, if you're moving to a new technology,
if you want to develop a feature in a certain way,
if you want to do some sort of testing,
what is the goal of that?
And then you can design what you need to base upon that.
But isn't it because you heard it at the barbecue over the weekend?
Because you were listening to PerfBytes and said,
hey, I have to test it this way.
And you said it a little bit joking,
but that's why I said,
don't take me fully on those recommendations
because they depend a lot.
There are things that you absolutely need at times
and some other organizations
or some other projects that you yourself may have
are not needed there.
That's also why we have to, maybe, add a disclaimer in the beginning
of the podcast: listening to this podcast will not automatically lead to better performance.
This is something we should have.
All I have to do is listen! If you just listen, it's going to be better.
Yeah, probably a better understanding of performance, but it won't improve
automatically. And, like those stock market podcasts: everything you hear
here is not professional advice, you should touch base with your personal performance consultant or
advisor, and so on. Good. So, Leandro, kind of your final pitch for people to buy the book.
What's your final pitch? This is the reason why...
If they had $5 left in their entire life,
why should they spend it on your book
instead of food or anything else?
So to start,
I'm not sure if it's going to be $5.
We're still working that out.
On that point, to be honest,
I'm trying to keep it as low as possible, just having a marginal profit.
I just really want to share that knowledge with most people.
And if they can, I hope that the book will give them a good journey.
It's an experience to read it. I hope it is fun, that you have a good time, that you get
engaged reading all the stories, all the analogies, all the examples that I try to give. And in the
end, hopefully whether you want it or not, you'll end up understanding at least half of those
practices, why they matter, why they are important, and you'll be able to explain them.
I think that's why you should pay attention to the book, why you should pick it up, give it a try.
It might sound self-indulgent, but I almost guarantee it's going to be fun. You are going
to enjoy it. At least smile here and there. Or, at the minimum, like Brian told me when
he was reviewing it, you will know what a chancla is and why it is dangerous in Mexican culture.
I forgot about that one. What was that? Oh no, don't answer it, because you've got to read the book.
i really appreciate the way you handled it.
You wrote it as a combination of fun anecdotes, and then, after that was done,
you would go ahead and explain everything behind it.
Because what I found, a similar book, right, in a way, is The Phoenix Project, where they
wanted to talk about this idea that's very, very important, but they did it through this fictionalized story so that you can understand it. Now, the one thing that was
missing from that was, you know, for better or for worse, I mean, I thought it
was a fine book and it communicated the ideas, but they never took the time to
explain those in detail from a technical point of view as an aside to the story. I
guess they did that in follow-up books, and there's a whole other practice to it.
But that's what I like.
In yours, you have both there.
You tell the anecdote,
but then you explain the technical side,
which gives you both bits of context.
And that's a very good point,
because I think it's the DevOps Handbook
where they try to land the technicalities
of what the Phoenix Project was trying to explain.
Gene and all the guys, even the Unicorn Project, expand a little bit,
trying to give a few more explanations there.
And it's something that hit me when I was way more than halfway through the book.
When I read the Phoenix Project, the Unicorn Project and the DevOps Handbook, I was like,
this is something that worked for them. I am going to shamelessly try to imitate that.
I think it's also worth to tell a story around a performance testing project where you can add all
these little variations, examples, how a performance person should try to tackle those.
That's what I was saying.
Probably because that's a challenge:
you need to be a very good storyteller to put together something that is engaging and that you can relate to a little bit, like The Office or a Dilbert comic, when everyone says, yeah, I had a pointy-haired boss that asked me these ridiculous things.
And you'll see why that happens.
And, I don't know, water cooler stories or things like that.
I love that explanation.
And hopefully I'll be able to push some stories for that.
But for now, enjoy the examples, enjoy the fun.
It's like a video game.
You move on level by level, little by little,
and you'll be doing progress with the book.
Thank you, Leandro.
Not for your primera vez, I think for your segunda vez,
and hopefully not for your... what's the last one?
Last time?
Última vez.
yeah I'm sure
you will be back
on more stories
especially because
you have a lot of
experience
field experience
hands-on experience
and I'm pretty sure
we should figure out
what other topics
that you deal with
in a day-to-day life
we should bring
to our audience here
especially also
as our
profession is changing
and merging with other fields.
I think that would be great.
Make sure that you are sending over some of the links
to your other YouTube channels, as you mentioned.
Especially the content in Spanish would be great.
We'll then do a write-up and folks that are listening in,
check out all the links in the summary.
Brian, what are your final words?
I want to commend you on your background, Leandro.
I've seen it change several times.
You've got your studio set up really nice.
Is that the only nice-looking part of your room? I forget.
I remember sometimes there'd be the pan, and there's all the other gear and the equipment.
It's all very studio. And I bring this up because people should check out Leandro
on YouTube, because it is a really phenomenal video production when he does it. It's really
inspiring. In fact, I've got my DSLR, my old DSLR camera, hooked up here with a bad frontal light, so
I have a shiny face. I'm looking at Leandro and I'm like, oh, he's probably got side lights. Anyhow, this is completely off topic, but you'll see it all in the
picture that we post with this. Anyway, thank you for being on.
I did not practice a wrap-up in Español, so I'll just have to do it... we'll
keep pressing one for English. Thank you. If anybody has any questions or comments,
please contact us at pure underscore DT
at dynatrace.com
or you can tweet us at
oh no, it's pure underscore DT at Twitter
or pureperformance at dynatrace.com for email.
See, I messed that up, Andy.
Andy's going to beat me later for that.
Thank you so much, Leandro.
It's been a pleasure.
I haven't talked to you in a while,
so it's great to talk to you again. I wish you the best of luck with the book, and hopefully it'll
be out soon. If not, I know who to pester. In fact, should we give out their email and tell everyone
to pester them to get the book out?
Okay, let's dox them and tell them to, yeah...
For now, we don't. We have a, well, and this is a small announcement:
it's going to be published through PerfBytes Press.
Yes.
And we're going to have an email for any questions, comments related to the book.
But in the meantime, anyone can find me on all the social networks,
as Leandro Melendez or as Señor Performo.
Specifically, as you mentioned, on YouTube there are two channels,
one that ends in ENG for English, and the other one is Spanish.
As you mentioned, we're trying to keep up
the video production, to make everything
sound and look, and especially
the content be, fun, engaging
and happy, as I hope the book
will be.
So guys, thank you very much. Muchas gracias. Los quiero. Besos. Adios.
Adios. Adios. Gracias.