PurePerformance - 007 Attack of the Bots & Spiders from Mars with Richard Dominguez

Episode Date: July 1, 2016

In Part II, Richard Dominguez, Developer in Operations at PrepSportswear, explains the significance of understanding and dealing with bot and spider traffic on their eCommerce site. He explains why they route search bot traffic to dedicated servers, how to better serve good bots, and how to block the bad ones. Most importantly: we learn about the many metrics he provides to the DevOps and marketing teams to run a better online experience!

Transcript
Starting point is 00:00:00 It's time for Pure Performance! Get your stopwatches ready, it's time for Pure Performance with Andy Grabner and Brian Wilson. Hello and welcome back to Pure Performance. My name is Brian Wilson, and as always we have with me Andy Grabner, a.k.a. Tony Grabner. Well, you know, honestly, we need to change that second name now. Tony, really? Come on. I know we coined it. I'll drop it after this one. Okay.
Starting point is 00:00:46 I would actually have assumed Arnold would make more sense, right, with the reference to Arnold Schwarzenegger. But anyway, welcome, audience. And I think thanks, Richard, for being back, because this is a back-to-back recording from the How to Sell Performance to Marketing episode. And Richard, welcome back. Yes, thanks for having me, guys. This is a great opportunity to discuss bots, as everyone loves. It's the Attack of the Bot Show. So, yes, today we're going to be talking about bots.
Starting point is 00:01:19 And Richard, just in case you all haven't listened to the previous episode, episode six, Richard comes to us from Prep Sportswear, and he can tell you a little bit about himself. And Richard, just in case you all haven't listened to the previous episode, Episode 6, Richard comes to us from PrepSportsWare, and he can tell you a little bit about himself, but there's much more information about who he is in Episode 6. So, Richard, who are you again? Remind me. Who is Richard? I'm basically the DevOps release engineer slash insert here guy at PrepSportsWare.com. We're an e-commerce company that sells custom attire. Basically just handle anything release related,
Starting point is 00:01:50 anything build related, performance related, test related. I basically insert here related. End of the day, it's my goal is to have a stable site. That's basically it. And the best practice is to do that. And in the previous episode, we talked about marketing and you,
Starting point is 00:02:14 in the end, actually, I thought, when we did the summary, I thought there were some cool things that actually came out. You have some metrics that you present to your marketing team, page load time. You look at the DOM time of individual pages that are SEO optimized.
Starting point is 00:02:30 You also said conversion rate, obviously. But then another metric that you brought up in the very end was number of times people click on add-on cart. And I'm sure there's some other metrics you have on your dashboards. And the very – what got me excited about, because I didn't think about this, you said page load time for our SEO pages. And I assumed, obviously, the sole purpose of that is you want to optimize these pages to give your real end users a good experience because we know performance impacts user behavior and the faster the page, the more people click on it.
Starting point is 00:03:08 But then you said, well, not only that, but there's another type of users that might not be the real ones that we actually care a lot about too. Who are they? These users are, at least for us specifically, are Google bots or bots in general. But these are the SEO spiders that go on your site and figure out what you have to offer. And if your site performance is slow, well, they're not going to like that. They're going to look away. They're going to come back later.
Starting point is 00:03:37 They may not have the latest information. And we actually have dedicated servers just for these spiders to crawl. And we want to make sure that these particular landing pages that they're crawling are as performant as possible. You know, at my previous, um, job, I was at, um, WebMD. We, uh, started getting a lot of bot traffic that was slowing down our site. And they actually at that point split out a second set of servers for the bots. I don't know, I don't think they at the time
Starting point is 00:04:12 had the idea to optimize for that, which I think is a really, really interesting use case. But even just for the extra load, because they're, you know, this is going back almost five, six years, there was already so much bot traffic at the time i really love the idea of splitting out a second set so an interesting metric that i think you guys would like to hear is our overall traffic 80 is from automation said again 80 80 wow percent is all from automation. Whether it's Googlebot or whatnot. Well, think about it.
Starting point is 00:04:54 If these Googlebots would have credit cards, there would be a lot of money you could make, right? If only. If only, right? If you're a Googlebot, please click here. Exactly. Wow. 80 percent um well which doesn't which makes me feel better now because on our i think on our blog and our websites we also have a large amount of i think we have like 70 percent of the traffic is some bot traffic which we filter out uh when we do our metric delivery
Starting point is 00:05:20 to our marketing team to show them you know which content is actually consumed but so here's what's interesting for me and how do you identify the bots uh through the user agent the user agent string um that's is that uh the proceed the the true way to tell no i mean they can always a affair is an individual can always just pass that right but at least as far as if it's googlebot or bada boom bot or anything like that um yeah they'll actually have an entry in the user agent string what indicating what bot what version and uh that kind of information and we just in within our load balancer it actually looks into all these user agent strings and will sift through them and put them into the appropriate uh pool
Starting point is 00:06:10 that's just dedicated for google traffic so the the other question that i have i mean user agent string as you said is one thing and i'm sure most bots use a user agent string but some of them are probably a little let's say more sneaky and they pretend they are a regular browser have you thought about also using certain ip address ranges where you know they probably have this is probably google because it comes after out of the data center ips we we do have some ip range as well uh but mainly we use that if we detect a lot of high traffic so we get a lot of spiders from i won't say from which universities from mars probably you know probably i wouldn't be surprised but we get a lot of spiders from universities interestingly enough uh most kids
Starting point is 00:07:03 must be playing with something. And they're not going to label themselves as bots. So, yeah, I've had to include a few IP blocks temporarily for that. And so that's, I mean, another aspect. Typically, these bots, they always have the same click sequence, right? I mean, they basically go through different pages. What I've seen some of our customers do, and I know you also use UEM, use experience management, actually understanding the click path of these visits. And then based on the click path, knowing, well, this is a bot because they always start here and then they go through this particular sequence. So it's very predictable also how they navigate through the pages.
Starting point is 00:07:41 I think that's another aspect of identifying bots. The challenge, I mean, what you can do with that information, you can find the IP addresses and the user agent that they have and then configure your load balancer after the fact. But still for reporting purposes, even though they end up on your regular servers, because in the beginning they are kind of disguised, you can still look at Dynatrace and say, hey, we have 70% of the users that are clicking exactly these five pages in the sequence. It has to be a bot, or it has to be
Starting point is 00:08:14 some synthetic script. That's another thing. Yeah, you can definitely see whether an action sequence is synthetic or organic, because you'll see a whole bunch of the same ip address click on the same stuff and you can definitely determine oh okay what is this and a lot of times it'll it'll it'll identify itself as some sort of spider it won't say
Starting point is 00:08:35 bot or anything so it'll be a new spider and we'll just have to put that we'll do a little bit of research is this a nefarious spider what this used for? Did we ask these people to do this? If not, we'll block them out. Yeah. So before we go to spiders, because I think that's an interesting thing for the bots, how many servers do you have to handle that load? So out of all of our web servers, not image servers, but the web servers that's handling all our traffic, about 40% is dedicated solely for Google traffic or bot traffic. I mean, think about it.
Starting point is 00:09:17 I mean, obviously, it's costs that are hopefully good invested because it helps your SEO ranking. So is this a cost that your marketing team understands that it's basically part of marketing? that are good, hopefully good invested because it helps your SEO ranking. So is this a cost that your marketing team understands that it's basically part of marketing because it's the marketing engine? Oh, yeah. No, yeah. This is our sole source of income, I guess, as far as marketing is all SEO. It's all a Google search or Bing search. We don't have salespeople.
Starting point is 00:09:47 And it's all based on what people see when they hit on that Google site and type in, you know, and in my case, Riverside T-shirt, you know, prep sportswear should, you know, would be hopefully around the first hits. That's it so because of that we we dedicate a considerable sum of our marketing budget and everything else to ensuring that google's happy with us have you ever thought about and i'm not sure if i should say this out loud have you ever thought about delivering an optimized different version of the website to google just so that google thinks you're faster than you really are oh yes um i know that they do not like
Starting point is 00:10:35 that and i know that they somehow check for that i don't know how specific they do that but they do check and if they find inconsistencies, you will get knocked out. Okay. Yeah. So this is just, I know, I exactly know that this is happening, so that's why I want to... I thought it's a good idea when I came up with it the first time and I told it something, but I thought it's an awesome idea. And I said, are you crazy?
Starting point is 00:10:58 You know what happens? Google will penalize you for that. So, yeah. But you basically... It's kind of like Google skynet you know i mean it is no it is it's we're talking about bots but they're like the over overlord bot that controls everything uh it's kind of scary in a way yeah i make all pretty much all all e-commerce is you know everything yeah unless you're like really being it doesn't matter but like for us especially we have to follow we follow the rules very strictly and they change the rules all the time and there was a time
Starting point is 00:11:32 i would say uh six months ago that they've changed the way how they were going to do the ranking and for a few days a whole bunch of businesses all of a sudden got shot from their rankings and there was like a small panic that happened google will just do that they'll do that uh so and you want to be on top of how you deliver that information and all that stuff and from a monitoring perspective so i assume you have dashboards that show you actually how many bots come in what their page load time is what else do you monitor well what other metrics do you look at um because it's pretty random as far as when we don't actually have a strong idea of when they will always hit the site or even what they're going to look for uh so the real the only metrics that i'm
Starting point is 00:12:26 currently gathering is the load time uh is there any errors especially did google did the spiders incur any kind of errors or anything like that um timeouts of any kind uh another thing and this is more i think this is more indicative of how we have our current routing uh implementation is that they all look into a page that does no longer exists and because our site is very dynamically rendered uh we'll get a lot of errors but that's because the particular page that they're looking for doesn't exist. And it's always curious, why are they still looking for it? It doesn't exist.
Starting point is 00:13:10 The first time I get it, but two weeks later, they're doing the same one. Why? So what can you do about this? Is there anything you can do? Anything proactively, basically unregistering the page
Starting point is 00:13:20 from the Google network? Is that an option? I don't know specifically. That's something that our SEO manager, I think, looks into and figures out at that point. So that was actually my point, what I wanted to make. So I hope your SEO managers actually look at the pages that Google bots are crawling through
Starting point is 00:13:44 to see what are they actually, A, interested in, and B, which content of the index right now that either is no longer, let's say, valid, up-to-date, or actually no longer in existence. I think that's also very interesting. Yeah, GA, or Good Analytics, does provide a synopsis summary of all the URLs that it attempted to hit, and it will provide you a list of errors it incurred.
Starting point is 00:14:11 And I believe in the tool, there's ways to remove these URLs, and I think that's how it's done, but it's a very manual process, I believe. I wish there was a smarter way to do this um actually there's one thing that i actually should point out and this is definitely googlebot related uh if you are doing a maintenance page and you don't want to be hit by googlebot like let's say your site's going to be down whether on purpose or you're trying to update it, but the site has to be down for whatever reason,
Starting point is 00:14:48 and you do a maintenance page, make sure that you return a 503. If you return a 503, Google will not hit. It'll just ignore and try later. Ah. 503 is a service temporarily out of, what's he called it?
Starting point is 00:15:04 A 503, yeah. Cool. That's a great3, yeah, cool. That's a great advice, yeah. So just kind of like FYI on that. Yeah, that's cool. Is there any, is there such thing as an SEO index rating or something that you can look at and compare with, you know, what you're capturing metrics on your own performance? And, you know, if you think about conversion rate, right, you can capture your conversion rate and you can look at that in comparison
Starting point is 00:15:27 to the performance of the site. Suddenly you see a degradation in the site performance and oh, now we see a degradation in conversion rates. Is there any way that you all can kind of almost real time-ish or even a day or two delayed, have an idea of how your ranking is going and what kind of impact something might have made or how is that you look at the seo i i is a question more like could we predict what our
Starting point is 00:15:53 ranking would be or well not necessarily is is there is there is there and this this comes back to me not knowing too much about seo and how you know but how do you know how well you're doing on seo if we're if we're doing well we do generally get a number value from from google um how up to date is that or what's the uh that's the that's actually the question because we actually don't know and google is purposefully very um lucid with that okay so it's not like you can put up a lot of these metrics you're collecting and track next to it the um your ranking number and maybe you know that there's a three-day offset so you can see issues and know hey this is going to impact or right yeah it's it's it's a guessing
Starting point is 00:16:43 game it's always been a guess game um and if we were just using uh google analytics tools and nothing else uh for one thing all this like i said earlier there the information they provide is very very average we don't have any specifics um on anything that they're hit besides urls that they incurred as an error. But if our ranking is dropping, we wouldn't know until days later. It might be two days, might be three days. It's kind of unknown. And it's very much a guessing game, and it's almost designed that way.
Starting point is 00:17:26 So it's good to have reviews and monitoring. it's good to have really using monitoring it's good to have uem yeah oh yeah we want to make sure that if there's a change that's kind of the mind that we have to do on ourselves to make sure that if there's going to be a change we at least know about and we can fix it as soon as possible so that google doesn't penalize us and when you're collecting all the data do you split it out between real users and bots i mean i know you can easily do that but do you actually split it out and compare like all right here's the real user page loads here's what the bots are seeing here maybe there's errors that real users are getting you can check to see if the bots are getting them or vice versa where you're kind of looking at it all at all Actually, right now, I do have a split business transaction
Starting point is 00:18:09 that just looks at Googlebot errors and everything like that. So I can look into this particular set of charts and look into the page still time just from Google, any exceptions that occurred that Google might be causing or our servers are causing or any kind of client errors that spiders might be incurring. And it's pretty easy because it's just a business transaction based on user agent again. It's one of the, I believe, web request measures, if I remember correctly.
Starting point is 00:18:43 There's a field there that allows you to to look at that i guess then you can prioritize like hey there's an exception that the google bots are getting alone right and it might not be impacting users too much but you could that could help you get it more higher prioritized to fix because that can obviously then impact the business that's absolutely that's interesting yeah and i haven't really thought too much about this whole like you know bot setup and bot analysis in terms of google ranking so this is uh actually pretty fascinating it's huge i mean and especially what what i think is so interesting and you mentioned this in the previous episode we all just have google analytics but you said i asked the question why do you still have Google Analytics if Diamond Trace gives you much more up-to-date information
Starting point is 00:19:28 and fulfills more use cases? But we just use Google Analytics because we are always used to it. I think the marketing teams need to understand that if three days down the road, they tell you that you have a bad SEO ranking now or a worse one than you had before, then this is three days later that allows you to react. So I think this is why it's so cool that you, Richard, are actually giving all of this data
Starting point is 00:19:54 live to your marketing team so that they can immediately react. I think that's the point. That's also the DevOps aspect. You're working closely with these teams, providing them the data that they need so that they can make better decisions on, okay, what is going wrong? What do we need to focus on to make things better fast? Absolutely. You know, we can't always prevent something from breaking, but we can at least fix it fast. And that's the whole agile thing, really, right?
Starting point is 00:20:23 It's find it fast, fix it fast. And this can apply towards everything yeah so let me do like the other topic i mean bots and spiders you mentioned spiders just for people that may never thought about it what's the difference between a bot and a spider a bot you know i'm not even entirely too sure the difference is the entire truth um i always i like to do a google search on it and he's very good at putting people on the spot with questions like this by the way he's done it to me give it a shot and you know okay without searching hold it against you uh a spider so a spider is something that will like crawl the site it will actually grab information from the site or as a bot is just more of a generic term that this is just an automation
Starting point is 00:21:14 doing something from a script yeah for me the yeah that's i think i mean you put it and technically it's nothing i mean technically do the same thing right they're crawling there they're executing that's kind of growing pages yeah but i think what you said also maybe this was as the preparation in the preparation for this podcast you have third-party providers out there that are crawling your website they're spidering your website and basically try to figure out which products do you offer for which price and either they use it for the competitive advantage or they might be sites that do price comparisons and do whatever they do with the data but i think that's that's the the spider the crawler they're really you said, they're scrapping some information and then using this information for whatever purpose.
Starting point is 00:22:09 And do you, I mean, are there any things you particularly do with when you see that this and this spider comes in? What do you do with this information? There's not too much um if i see if i see a big spike of traffic coming in and it's coming from uh some sort of spider bot network that i'm unfamiliar usually at first i do some research like what what is i don't want to just cut them off because a lot of times uh maybe someone from the marketing folks did this unknowingly you know maybe they went to a site and this service provides some analytics that they want and as part of that package they have a spider network that goes into your site to
Starting point is 00:22:58 get this information um and unknowingly they activated that and all of a sudden I'm getting hit with more traffic than I realized. So that's actually another consideration is when you want to get analytics to your site and you start allowing this kind of automation to attack your site. Well, I shouldn't say attack, but to crawl your site. That can slow things down. And if I don't have the appropriate filters in place, it can start crawling our user traffic, which can slow them down, which is not something we want. Yeah.
Starting point is 00:23:40 Yeah, and I think what you, I mean, this is obviously stopping them where it was unintentional or where it's just a bad spider. Or if maybe it is a spider that you want, maybe you have partners. I don't know. Maybe you have some partner companies that are crawling your site. Oh, we do. We definitely have a few of them that we want. Not just Google, but we have a few others.
Starting point is 00:24:04 Yeah, and then maybe you just, again, want to make sure that, A, they get the content fast. But, B, maybe also if you see that there are these new type of services coming up that use the information that we have on our website for a good reason to help us promote, maybe if we see them crawling our site, our content more and more, you can even start a conversation with them and say, hey, what type of data do you need? Maybe you can just tap into our REST APIs instead of crawling the whole website if this is the data that you want, right? I mean, I think this also allows you to learn which potential business partners are out there and how can you deliver the data that they want in a more efficient way because they don't need to load the whole page they may just need access to a lower level rest api that gives them the product catalog with the price information yeah we actually do this specific thing with a
Starting point is 00:24:58 third party called seo clarity and they i believe they help us with I know I'm not sure exactly how they plug into our whole SEO strategy but they help aggregate some of the our information and the way we provide this is we give them a snapshot of some
Starting point is 00:25:20 of our IIS logs filter them out and we supply we FTP that information to their servers without them having to crawl all of our stuff. So we just provide them the raw data. Cool. Wow, I have to tell you, I've never, I mean, I'm really still just amazed by the fact that,
Starting point is 00:25:41 first of all, you said 80% of the traffic comes in from bots and spiders. Oh, yeah. And that you really have a dedicated server infrastructure where you route these requests to make sure that, A, they get the good performance, and, I guess, B, to also not impact the performance for your real users, which in the end matter a lot, obviously, right? Oh, yeah. I'm like, spiders tell the people where to go,
Starting point is 00:26:10 but the end users are the ones clicking add to cart at the end. Yeah. And is it a separate application layer as well, or is it just a separate web server layer? This is all happening at the lowest level, so the load balancing layer. Well, I mean, as far as like, you know, you said you had like certain amount of web servers dedicated to the bots. Are those reporting to their own dedicated sets of application tiers or does that then hit the larger pooled application tier that everyone's hitting?
Starting point is 00:26:40 It would be the same pool that everyone's hitting. Okay. Cool. I was just asking in case anybody wanted to take this idea back to their own organization. Say, hey, we could set it up, you know, split it at the load balancer, separate out the web servers, and just make sure you have enough horsepower underneath. Pretty much. It works. Currently, it's our last webinar, the whole microservices to monolith. Everything's running on a single massive app for us, except images. That was the big win that we finally were able to break out images in the image server. And it's interesting because it does kind of make it easy in an infrastructure kind of way
Starting point is 00:27:31 to separate a physical host, or not physical, but a VM in this case, but as a dedicated Googlebot. But once we split out into a truly microservices field, that actually will switch up a lot as far as how we do this architecture. So to kind of wrap up here, Richard, do you want to remind everybody again which metrics you look at
Starting point is 00:28:01 and actually which metrics the marketing team is also interested in? Kind of like, hey, this is is my advice this is what we do um sure uh so the very the very first metric that i think everyone should look into is just page slow time is it responsive is it loading um another metric and i did not actually mention this last podcast is number of exceptions we have another dashboard that shows this in the bottom and if there's any particular spikes that can also slow stuff down or cause other issues and maybe not necessarily front-end issues because they're being swallowed up, but other kinds of issues.
Starting point is 00:28:47 So we have that. That's a little bit more of a technical. It's not necessarily a marketing KPI, but it's something that they started to look at as well, which is actually good. Yeah. Well, that's the whole concept of getting people together, right? I mean, cross-pollinate. Exactly, cross-pollinate. yeah well that's the whole the whole concept of getting people together right i mean exactly
Starting point is 00:29:05 pollinate exactly cross pollinate just to stop on that one for a minute when you're talking about exceptions are you talking about like back-end code exceptions or client errors javascript errors i have both listed okay yeah so any particular spike, it's clearly listed. This is client errors and this is server errors. And this is definitely true after a release. If for some reason after a release we have a high number of exceptions, it's all part of that same massive dashboard that they all have alongside the conversion rate and everything else. So yeah, conversion rate, document object model uh server response time
Starting point is 00:29:47 um we have add to cart you know how many times that that button's being clicked on so we have conversion rate and conversions which is the absolute number of people who actually bought something um and those are the main ones yeah yeah i actually like that you mentioned that now because conversion rate if you have a marketing push and you attract a lot of more people then i guess the conversion rate overall may actually go down a little bit because with marketing campaigns you reach a lot of people that make it to your website but then are not interested in buying. So that means hopefully the total number of conversions still goes up but maybe not in the same rate,
Starting point is 00:30:34 which can tell your marketing team you were attracting a lot of people but they were not the right audience because these people are not buying at the same rate as the people that we had on the page without your campaign right yep yeah we want to make sure that the increase that the net increase of people coming in um that this particular campaign is we're anything from and if it's if we're just grabbing a whole bunch of people and we're not getting an additional purchase from them or the percentage does not come up, that indicates that they didn't like something. And that actually goes bad with us because now there's a negative taste and maybe these people's mouths that we don't quite understand.
Starting point is 00:31:18 Why didn't they buy anything? Yeah. And then quickly, the recap on the bot metrics. Mm-hmm. The right quickly, the recap on the bot metrics. Mm-hmm. The number of bots, obviously. That was shocking for me. Oh, so, yeah, 80% traffic. Over 80%, really. Traffic is everything bot-related. We have 40% of our infrastructure is dedicated just to pod traffic, just to handle pod traffic and automation.
Starting point is 00:31:50 And then we have a business transaction that looks at the product page loading time, which is the first page hit from Googlebot. They hit other pages as well, but this is one of the most important pages because it just lists out all our available products at that particular time. So we want to make sure that that particular page loads up very, very quickly for them. Cool. Anything else, Brian, any other final thoughts from your side? Yeah. So just listening to this, I think an important question for
Starting point is 00:32:26 everybody to ask is what is your organization doing about bots and spiders? Obviously, there was a lot about the marketing in the previous podcast, but particularly for this episode, what are you doing? What is your organization doing about bots and spiders? Find out because there is a chance hopefully it's there's something but there's still a chance there might be being nothing done they might not even be looking at they might not be paying attention you know you have a a chance to be the the john connor and take on the robots uh that's my terminator reference again curtis because of andy or tony but you know bring it up get that conversation going and find out what's going on
Starting point is 00:33:07 and see what you can do about it. Because if they aren't doing much, you know, just listening to what Richard's talking about, there's so much going on there between the SEO stuff, between the impact on the users, you know, just even from a performance level, the impact of the users, but also the bots who might be doing things or the spiders really, more of the case that might be doing things you don't want them to do on your site, really start shining a light on it. It really, it always just blows my mind how much there is to be on the lookout for from a performance point of view and how greatly that expands. Even just Andy, as you had mentioned, the difference between, you know, the conversion rate and the conversion number, all these different metrics, you know, there's, there's so much to be aware of. And I think it's
Starting point is 00:33:54 great that a lot of this stuff is being brought up here because it's probably things that a lot of people aren't thinking about. And if they are thinking about it, hopefully it's adding to what that is. But, yeah, take on the bots. Be the bot. No, don't be the bot, but be the bot tamer. Exactly. Use them wisely. Yeah, know what they're doing. Know how they're affecting the site.
Starting point is 00:34:18 Exactly. And then also exclude them. So just the last thing that I want to say because it happened to us. We monitor conversions and viewers on our blog. And then also exclude – so just the last thing that I want to say because it happened to us. We monitor conversions and viewers on our blog. And I remember for several months we thought, wow, it's amazing. For whatever reason, we got a great increase in views and we were excluding the most common bots. But we forgot to reanalyze what all could be a bot. And for a long, long time, we just ignored the fact that there's more new bots coming on.
Starting point is 00:34:51 And so I think as a lesson learned, you should constantly look at are there new bots out there? And Richard, you're doing this, right? Which bots are new? Are there any new IP addresses we need to block? Any new user agent strings. Because if you don't do it on a constant basis, you might be fooled by the numbers that you're looking at and that you report back to your business. So that's very important.
Starting point is 00:35:14 You might see a big spike of increase during maybe a particular holiday. You're like, oh, sweet, we're getting a lot of traffic. It's like, no, you actually just got hit with a set of bots. Yeah. Yeah. I hope the numbers that we're seeing on the podcast listens aren't coming from bots. They probably are. I think it's part of Andy's bots.
Starting point is 00:35:36 Exactly. As I said before. They're called the Tony bots. Sorry, I had to edit that there. That's okay. All right. Brian, what do you say? Another successful episode? Yeah, I had to edit that there. That's okay. All right. Brian, what do you say? Another successful episode?
Starting point is 00:35:49 Yeah, I hope so. And this one, I like being transparent about things. This one is being recorded in advance, but the release date is July 26th. So, Andy, I think you'll be salsa dancing tonight. I will be. I hope so. I'm actually going to be on the way to Slovenia to the Slovenian salsa Congress, uh, and dancing, uh, at the Mediterranean. Oh, but you can dance with Andy, show up, um, just wearing anything right in inappropriate
Starting point is 00:36:19 shoes and all that. Andy is a very serious dancer. So, uh, don't, yeah, don't expect to just have a silly dance with him. Any performance? Are you all, do you have any speaking engagements coming up anytime soon, or is that kind of too far?
Starting point is 00:36:36 Well, the only thing that I know is going to be probably end of August, the DevOps Days in Boston and Chicago. I'm not sure if I'm going to speak there. I'm still waiting for the proposals to be accepted, but chances are very high that I'm going to be at least there. All right, excellent. The only one that I know of is a little bit further out,
Starting point is 00:36:55 but that's, of course, Dynatrace's big event, conference. Perform. Exactly. And where is this going to be the next time? Yes, please plug our conference. The lovely place of Las Vegas. Las Vegas. So,
Starting point is 00:37:12 of course you know I'm going there. It's going to be in February. So, folks, if you are interested, perform 2017 February in Las Vegas. I might actually get to go this year. Say it again?
Starting point is 00:37:27 I said I might actually be able to go this year. Well, this year I don't think so because there's one in 2016. Oh, come on, Tony. And I actually hope you can fly because it's quite a long walk from Denver.
Starting point is 00:37:44 A lot of people in Denver are flying all the time. Yeah. Let's hope my jokes are getting better. I know they were not the best quality. Well, I think if you stick with the Tony moniker, what was Andy Kaufman had a – I'm not sure if you guys know Andy Kaufman, the comedian. He had an alter ego, Tony something, I forget.
Starting point is 00:38:10 He was very obnoxious. Tony had a cigar and a mustache. He would just go on and be very rude. So I'm just saying, Andy, look him up, channel him, and then that'll help you with the humor. Just feel soprano. All right. All right, Alright well thank you Once again everyone
Starting point is 00:38:26 Yeah that wraps it up Hopefully this one is I think this is a bit of a This one might go on The record books As our shortest episode But it's really a part two So we're cheating
Starting point is 00:38:37 Thank you all for listening Yes thank you very much And Richard thank you So much for being our guest It was an honor to have you Oh absolutely Thank you for having me Goodbye everybody Bye Yes, thank you very much. And Richard, thank you so much for being our guest. It was an honor to have you. Oh, absolutely. Thank you for having me. Goodbye, everybody.
Starting point is 00:38:47 Bye.
